[Unrecoverable binary content: tar archive of the directory var/home/core/zuul-output/ containing logs/kubelet.log.gz, a gzip-compressed kubelet log. The compressed log data cannot be reconstructed as text; only the archive member paths above are recoverable.]
NjO"k/%%\2N Y,-,xGIf6cwݍՉ{P~ α!0Hx"/IbXnm؜ ;olȏbCfUJL5=4MAH.@)`vgo`n{smntf{s[N ˎ@-4l4]Fa2ktAZL3ώq[U[^Cb\/ͥJVh\mEo ak[wgR[F ÝTI⼧693aTV<6SA/,Q0޵(7qe_Q1!d,|ḁb Sr)<&l:-X6;TRڽ~u7nYg 5Ӂ 'vfb0{ӳ5.ޯ?y} BRRhW 7Ny蕑,j\& >rn-W/{u4I3%z}oYP!^ݚڌxsU0Uq5Zح,>h)uҨhi)waw9Z/bdEp!IJ2CoZhb"e)*Q\U˲ i5*gs-$NRsv?u Uq&2}ZkK Bc.:K%s`xT\Cu41+kl@NӽI]Kjrت̪,.0ktU(U-pG,]eB6ɌaBhƬ:ȘD9)J8_S v_#L"9Y;`\%8e-Z(8 A3ڝȈ&1V * C)_a5T0*-Cٚ|AYs6:1 `1*(#L eD{՚E{/~-!7O9B"ii- (>"#AMb߿n4Jܲ2"JZs9VJRͪUcP1Ɠ#L}-w}乮nx޴"Up\@ 4_ }h0 5ZjOcw]h,.{I$b|f^C627mQL+e.p@W"= ^m..9Qa~)MVB]dxJ٧F4`,|ZÀCMȲq8-gHTK@yUm͕1"spy2m3p91DP.nj!f=0S]6؁f18k]n 9EMGl.s meI.BHwЩzPTaH>Kk=zP@zK`}vFu&iT z1_a+@1\ƒr\ \1+B~;$ Q1 `\ة !-JFJQ5H YZڧp.C1#dnІE8zV"exBFV#vRcڳJ{TOҮ˃>0tD$ƀ?\,_Y#9|sP*m.5 vG b?F W(5rfF8ae%3f@D2v]uO#0% 7aDE+9>X2^d5 L-C#ʭ_8^"p(`̯6j5fC{DUuӥ7`ebBw̢JI-bJ $ȰtbɾS@$ti@ui,? tjWW$"Neg,*B% #rW[}oRgkqtzpN 拟~-CաQ*3p@N~ rF]hiRحn7\N d6w}:k#ԁBXiH% u<պ7P:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! u>PǃDfFۥk#iJ uJPKK|:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBVl^B\oP^+'Η(1:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBV)u꘶u6BXGGCB7 uDj uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! 
uHCBP:oBwWK\nų'm%h;n{9}e*%ac7].|hsw'LrnĿow'/ 7Ga޸ז9>͊={Ǔ;aje/ upcL66Mtc vÏ8X3 3`cLgeպΟ>L@ v^8r^-'L&ӃnS9 ~{<".7u ;)N/;VmN&E2;!Y`YUO1Pk!:sv\Ondo >Y#*Jڙnꛮ|ͽypG7n>uoۻe'%:#; &!*l4h?9cqnH`dw{0LZ7?L%?7'3nY`p>JN>u`8Ϯ>Tl'ڔ ;5&`z؈ۂGE*CtLX> MwRllطMs[jڝE۽}zz:l|˳!Fm)4k_olMf?h;1ai=,#l7w7oz`맭'<~;_<_yAe7?.u/B `:Ύ<̚sGG|Q8KZK]X,њqw,N=) ܆ݿG'gt~=}=7]POڞGcCzp__8|_x/>zc3liy5?m< Wmwwx/uGO{~̶ù_G]h꣇WNZ0v-MË.|t>"qS^~ eL7.⊚Xxc[\>`vx.ogY66N`K2~s*.Tf@J/_(7wpY+ pwtw^Ӄ&{,UVлF_nqJ;9=z˝g8Gx(yၭ),prgsq.mltx:=}׍ɉڦpemǸOmS]xEuGlz/nwOyp` د?gr~'eXwNRYYBeJt ګe~}Ѹ-@ү?M/'2/-q$p|ܴ!ϵۻч~ݙ- rJ a h3r^ߔ,<3nJ(E΄ūuv4;+`?.zB̵`'-n 4:2Tp0hS)ac:J?=B2?uy0H3̥M_M_1}^ݤ^}ޜ;֍$ X`g0v/$  /g),9ldUŴ-٤D,Ǧ(ꧫnfuU8`yt{ 8UhMM]a~f۫ ^މrzWΪ3;Rʀs4WM)׷r=檩Ķ.8Z\4=GWK\\i%\Ze;+PiXUq%1•VQWDq*W=ĕbIJ &+5\)\UvW2ĕf9CW kIW(X*Bޔp\i\<એ2p) {: P]T.ઇ̃{EW(2B PCq*U)d Wx_`' R"BtWR3c')NYzڃ,~ӻtvirJ2b s_:O#q8̒!pBEy,w,:d]N{:8ps\6^\WZ7k1 qIGζ:ʳss&I!7OcVifuFz^apo(j(щWT"^WXg|:ΣeNJTUVTy[]TAm%JOm[ȳ Y$E$"oRDs/F;:O?Gw4";6*ZnRU竍_'бGk;sKgζ~")%P& #|0fPF;@"yʍlk%oEÝj1Ɲ12}ѠCRpЊ_O0g}b^2F%>xVN!.bft-J峒)n1O[e_4kE.Cl:]Di|ʥAɂZn.㧫Ajx-b =˷zngm<è6qͩkouDp.?7e(o_d:w'󮟝̕^i^ܽߞK&9\0bznk2[*_*߼}} [Ak K4^yz_|?byQ>k 4̊)Uэv ˃O?}=o\s#pP/*_9TDCz:7Vo7ŧ%Mg~4|p^y%.5'չ#|IWDCov'4e[K7QLڲܯ:,xGOlHK4Gn#O[Q&c:"p·o7<.wuip|`1* 2fͼۍlc~<}3}l57Y>P7_d765Eiql|A:Y)l6&p[.z^,mG~̰ԅ//: 0B@>df?\a~Boq?g?9 i;>Ǔ,~$Ix)̫\A`;i/ J̾KyG*ީV*E`u3=|K՚=L魍RN ƺSI'8iXm#ֻGm̷RY+ϙ`VtR=9ju'}oKDSOliz*K: ;Mυtp=dpr+Tmq* !R)Bk2BSVvW`\WRӮz(@ PTpjB&}ĕ+JK @drWJZ&;+PɅ !b:Bqr+T]q*-`q}I3J t+k\ZT;#rsBBP#dԖ'/uW `$+l z2+Pk:P +5㔂Al \\ϩ r.+T)DŦ縬oW v\Dzj]KSٵܕ+poӃ'+&+\Zyq*KS+aPdpr Tpje+TeUq% @ Py=WtWNUpR  GǻUB&એ2Nc<W ;:trWVvWRۀ:e;x-+2BQUB%Sz2dpr+T{7 ^;˳NP0% c+J V#ñȄVѯ&2ij>qCM,=GL'`P!y7JUKվe\Փk[ޠZoU-u WdվMϽex`$\\. 
ն5yJ#z+ɘt@2 PuTpjB\WOmW Xq:dATYq*6y($2 ]\CƻBm-8^K%g,ઇ2yR+T\ܶ S+;+TEUqe4@U PTpj:PSW=ĕjEW 1AW(W]Zy U]WjT0ObqSG@s^ CJUlzu΄R}殺` :ײwUK-MdR3߱T+posᤦ+%+-\qduWk \WBo(@h`=\PPe : JB]7+SZu\J/z+%A JFWV U*pC\i%[ctW KOW(wjmq*]Uq\%PdpPr2*6| U*pC\Ÿ́jirGWVuWҨiA պ摒jR t]\<\Za+TٵN+'yib`ǷsW.nR0۹{@ӥ̈́tsVT6ӲLhYB芣>g9~DSKWmoROF4:NDSG%gcsuFfߦhGWh\\,\Zg+PJK5\Wx!\` MW>ઇ+IW ض4\ v߻摒RV˓n7r(PUdprO(Aɮ UrpC\i<]`K'DQqu\Jz+SP+^VZ+TpK\Ygj]\AWV U WjFi" tpr+P+yq* !W{BBudpj |U`TlzsΙ&ZKm?'T{&2Rɻ+po fp=\\ u\Jiz+ Cp`dpr%\coz8P5W=ĕHB^r2BJQ5BV\WӁ|k [O.Tpjeq*MUqeR0hcwr-T;B\WVցk ̐ PT]T!wG\9፠+SZ:OQp]>ઇR K 嶽g=OQSW}ĕLy2K*X9%JwqSq0R:ka}2$jY&7G{ΘȖ?o Y 1O1f"_RƋz}]]]1~YeJW?W~ҿW84]sq~? i#6k{/ 'px<#vR\pׇwx4#pv7^痢[ \._]K^ hq=8X筆/%KVl50k~^RUYrX\,/\GӋG Vy6W,;969OX:yT9|ẋ2ɭ[x2*:yp"N\)O!79Q|!MPx,2͖gqiGxqKcZ:gHxdu$H6}P?M&0˶4<>L͏RIKMnv[/Wn cx,MCZmy`WپgOw7?esNjE.x<7`^R~qKzQ|;,h(m<~Z+MˊYns#$9sxvJ_&! !D9)t.\J.3Ԑ3t:gm2=CJ|TiֆVϋx1+qb9Va>)x1@{Z 4)]<{MA>f9}"Q$erf?;D0Gpb'6}|]qNP]pp=  t6]vnY~-ۖד+ZTOn~=cOmxwkKK.]1j!X}-v:_n @AB7gvBĒ;mx,pqv0qeD_7߀{ԛH8'#$JDDl?DߐI?>L%HRǩq &Pq3TCNKy½s.$OXc<7y1;ڍ +9XS ?IKCt^2qKkUeL|==0dQ0Ӂ-JƏs}^>8v$ / fD!s.ONX[c3̔=e3ЧC2s3n0 >c3S_Z;${z1wGsyyLXU[0Ol'&yl$(j*AfQPa|s68Bq2,S-Xge"g*Rˣ8&9&&N6ciobZ9F`RAO0C5O=mlRqmi7eSv?ڑ%m>+[lM,gN `H>RL8ˠJXgF${/Ujp4WybgFs&65Zxf&ƙ].e%1/DDJ_ -[,!CohY^DosoǐtyiM|ƗxƔ(ގJW.}W&F;.}i mǣ*w*:/CBuvfP4TԌaP$Ȓ)59bHETB@6qVʮ~JfFyN:E͏aKR+N=-6>0{ i #`WIuHVx 4;5H8432`iAJģfReXhOu}ƈ:86CڠvrV?һ;c2&X L{~v1sWWQ EB+A$֌0ɝA@dB!xv8Am,}84Qtb-~q%sӳaIEu A Q"lBf4-_z\8R؀"շ"Ch/OMEj|fc[< :S9{L`9wǰ4`al;/Qk)QyLXzgߞuԦ!w굎" Vg:v&pDKYRw3;1DZ/Ю3uinti[$Rqۤ !-) }zIz%7?TB#Ӗ Wz%3T#Uzfh|]KO>N=h}lí/kcv%|jgQ'q+ . 
xcrnmmmOhJE slUv'gzar *f.~;Xq!$[P&?sW4޻^|6X-5G[^Wχ'/vfaJqj>>,3;P}s>^91vBв4]:G:]6 Y:&ˇUC0CG YL|Lbi{xiJN.r٨ˮU騃yXHtxXMuK^ltR9x?:Y;\ _[׍pv㰅jiކap$y/0Wi-u M04u^xWWyKƽb> ڶFʀ(X9K¾MR'fPgB1v]]w}'`槣ۃOQIT H=ʝʅ(@wօVzZ>.%!Wz扒3L35FI|0--R#gZEtlE>R *'J0yO͘I/%BJR쮞c]JR.*ISwWIJ;?Gw%#)Z$.^$-}DIY6쮞RTArYAGý?uYGœ~yD뻃i*Td=hf|+,my3Um}=i\+}w!C7 Xw y6Kxb-bwZr^/E=)]<.?^BsE"E9^i}Τ{*~@Q/>\/2爊ڛ|,SP@q1] &SZE}%@ /^rI[w+*h|?TyZƥ=^ZGEdQ0A[R.k%OFrkŕW&vU1&*ԽWuR]YğY7//KPUU.c>!^$U]u<$e|KAk2'i$͜4sfN̵9IfNlg4sfNI9I3'i$͜9Oʜ4sfNI-sfNI9I3'i$͜5 `r+@$͜4sfNI9I3[$ 4sfϜ4sfNI9I3'i$͜4sfNI9I3'i$͜4sfNI9Is\f^b4Il ٠4:pKXOj(4(RdKJ qZ-y[\֓XtK.W[y\T. Ӫ˷$\ ܆"ߦ_NB~)|1‡*o#'g^n޼~t$.QhczyAʼn՝O ȧy]e!2V91\)xʪKr3+kKCFL?.fDc{bƥx(AX™ [\u9w,'م~o$dD̴^ ETg:FrO,Z!g$q h2gp1]LGUjWk1+NAϱW %ȗZrV:A6B< ݻ< C!fH ^ @1[,} *A=A׽^xP$ L[q6fijw@<`6aX~B#*)UXG&zG֛iz/:x]e˘&ru:5D6@H"*׉fd}h듸T m}N[4?CzCu!FH)uЍƇoo-CRUTHW(W쓳3I}mvނmwp:lĊRRi2՚;΋r*TW!J껐^={֤W\ețG2 'ݏ?X`|9f7Y${b>m?=R5ETK6/R}2;p[&Mk'E_:Ğ>Y?ڱעW|kF~.>~Huo?Z-< ENzŨ Ƌ;YnodlaFJRI˘˶0jEYvRhKcn?Ǘ֯_2zo|*eOnk{Js;z {:yWng~0?Tm6ۇևFpV%>nu۝YUyh1>nj9ZlE0N3}XJcQ{(Al",iNmF,B0O# Dꥦ0+Cʘy)e&G eZ30{-#cԀGy)w&i=j75B2jE*t+0l x"ļh V^+:ֻ?ch-JUB)@\`!Gm7 XP1g&ܵ:xۺU42 Qk %N0+D{A-Ft`ZD"rR]+/GWZ2tñBb0FGT:łKs02O#1sZq hum$rBgS{^a9w8 . r"Xy4f{8Z LyPh$%6x4CaP8&e4@g'Z>ԡ}wF߾j5ctyJ'Xrq񍟚Ve[͚L*0$5I ˂ :,5I(Sb[)ǎmq9ln7ԡ^+k) X+wi oV;JRn:xhn4;w[ә>xw5Nm_m[SHn^nKe-L޶ 5qj'P#N ng؎t;9ҿfDC$ycC4k#i"k' 1x L1^[n}Ho {NDsI| ) u] 6FIqa0kG}k3-pX[Uw`@5s ?ySXxGء(z]>+>W;{#_ΓuR*!w'H֍szڡX-\_|0);[0_/}ǯ{rukj^+WB']+vf^xD{CJ+a ]'_?9_{? K@Uo1? 
ewÁ+g$q9r@odXΆx_zڏr)@zx5EqE[َ*$6 wEQ= Rxn{jYAQ]n>_e8Ua_o}Iv:'z,|t?_B+2do(⇮ot+/3f|(3&r\Pd1E=':4̒ ®XYT1q^S1>dzzj|*uQҰ2cD0Mx̮}V*_`7%/Gkr c+ְA鍼j")KۤЍjRd6?s7is;ʹzKZh׼hiSW@k߸[ڸFXѦȗq!9/ꦦr~+x湒~ {=0>u=ԑNrn|tAǽ/+IPlii;o+%y!o?%y1rMvS[)ddҦ$f=ҼW9 !;3v|x@Ƌ8計z8tL!4moǞxߎtuӴH#ږ  ӠF,tڕnm 3X-p24ShV@T[bΏMfW9{ 1,.uPd[S0ϕJ#JUbZqpdҜǜǜǜEvf xI8(aa<8o;Z5 PqY>.etL\֮e# lGۊR2*l'}ѧ'u*_ұ;!FdYT_jEXVT=' oao俱*jO[$'%">-0hûuP?{o|}Cf^Y{Rn PwSlzu ޛ×O+O䤏(3Zu_=^ Ewެzq$2f[81HTq$tG,](?(^I%5FG|Cv]wJkH d(^[7թCYUQ_rJ#uY$LQw;{iV8(X_̙[0.oXX){JgJV?RדY&[ʟ H| (SIҺ]0h%~"ӯ:P+[bM+? UR$I}\&4W_uL2`t@7)*hS2N~7?xŎ:1Dqcr!tăS1'~x}eG'Os؍SNszիyw;{;$ lй_:OQ! wg3vwYգJz-5W{ s0>𺂌Lg[ǝ(*hʏ?AO¸*je 4e\|~ oP. k}NO27mTL90dBϑK'{JTyrX0/aܧƃ>_rz2:瀳7e=jQ\_=s \e< a5'$f6+fHOUPA/ FS(s6<-p+x` Wz$vT5`e_< 8]!T@$&_΂8pNʁ_'Iȓ8u I#K)/R#|w@d2Iv;Q8Cu˅I')eHei*XGO>G]e)cxdԱ.j]`98]]<$v$yFy'u;Rwuaw9T\8bntכTy!P94C7]L8 pHz!1wGY =sNô hѠ'kOԳ(ly3g|4)jN@B:q^v~0JVonٗY-xM}}xN(4 4j }uerNmҬO,gxx?N\zdRt}o$p-8o=Q#twҏ0VO8_J/AO^DQ|1b7hI5|\eNPvJ2VI⸕B@}"eu$O-va:yY"wƈZNcޘtrNn!莧sj-zɁJ0tNnaNm.7O686{םl_ח /*3ʲG>}LVWHeH':f'Rjv%)6VY K47Λ%^ĶaReƀZX k# "W|h "'~\nV$b:EgEO_ '"g.szM^lK(2 K^ݳ{US6%EkL6)zΖ;Mð1FVfdŮnJfr@njf="m[s+PƷY3fdQÛc3֭8˜/a+em$}BA* \YKQ z`;Z&lۆimڴrflv/M2",po1 #y|R =86nZ p7|S cͮdq<΍Qa|Ů ܠ.NdS6$yH7HYxYxYxYxYxYxYxYxYfn he^ #9D~H/ V7BRI/jQ_\50"SUb.rpBJ%{q}sa/ 'l1\h. gpٓ õ1LjQcF<w0.PRZ` PF\W"rMX9a}O I.B6 44MWETOyw3[s?`N aͽW ZTyv\ZL4Ql&*:BԠhȁ`s9SDΩ^DΩ$v[D 'NwIU'b@sO|;ȃ0 ĄMsUq U kyhmLV¤lnělBܵ:mܵd5)h담6;+a]56? 
'3IsHAOל@"=po)>!h)3uxenl^^Uϓ_ 02Dci8k 1m ˕7m.kM66 L}d@u/!bPmF4,N13onYt bzr87T> 8IyW[7O?|n_ֆ960 H Q}Gkzk5!c+%J?Ǐlke#W8by( /lGP`C'뤁@ *Q"O]6\HHޫif^iNYƔ0Ğ`۲fCufG>@D5&6ŌdsMK=H]7'}g5SwvoGv3ձw%,"[$VĐi[ZHl9+Kì*0fsb}ȋ*z`WίP0&:>iX {>!x@I.%lvאٙuOYVe2d-֖Onow@ۜ 0zBX*VBibrG t}c 8ڔN-wSi[@ LjXFq Wc/>;.ʄ> z8Om, qoG~oˣisJX1ܮSҧ.g\ܿH08jj"fڪcR1ug?+%{&odGv#=V>:8[u ۦVcTFF+,`Qu$ LTb5xf>]:N˔Ua }DS-ZRnFJI>SS@jPκ.lnJz25zf^KB5~jL#0K0ui0 @?2lV"J!f,jQ1{sɢRvm"C|[# ,ר3Qq,x6)xd><[hRgm@M$Ll6ޭ[0f5n4]ow1GBS.Q"lBfy\8R"dm/#&Ln6xJMD|gsA= :S(qT0Ndx~nױ7pYsYv$ !iւ0%8O3Lj\4ZAi&*i4fl;[Ukyam3tqwYR3 vlo؊[<ז-OFMayA'C$P2:8593a Ȱ7Xԡa_X+8̀X FdtNCno5Pkq(Tv [C3%p x2V`' $*f>;6 {a{#1/1S2 #+xāz-saؕJ`{O?E5M5lJ!*ƭ 09'J\[Kf#UzfhVㆭ|}\eͯ_-9Q|Lϋ٨ǟʫ7vn[̻bJ}BQ3-? Έ +뚲"(%5ISC}$C$$i0- Ny5%@YDXiEҚ3QrSؐ҄9N Fpk$^ G@I(ˊԯ"=1(i@u[!djD$j`FQX]mjSa`tg|Vۥߗ98UiR.\`d #s q9LTza`/)ָ/iPRҸR*9tY\G@_M8f.ؤ)Fkv/Ẹ4V2p|^lRQ(?S b&Ob&f`+9+4'.ŕ¯u›|u笅J(.Q>7zٵDYGߞђ0z꺓wJ-tuyMo.*@ }= 8,x4^0Zsp1pλt]u׮g`by*$,4$ݘeEzۻX3VwL*@&~o&˅\8?|ߧ|0QO޿'~zADbo~݋L ֭{ZukTn݀sr_#w(WJ[QiǛ7>^*TfBTPF%-CF#"p}((]LWo3 qrCu]c3^;.ߖ\D&0-lL<A4bҗ zfoFS*p=7#soa`3p;)9!?NK'ɿUa0\o&g&N Q!vi8 TK#(r7Or{cjRĂAv> ] #z!Uxd!vo}0z +-a$ElH8Hw޷zDGW3nTOw-[Cauv9\oW;6h|NVȐl45nN..__56lR ¯ӏw*$q1VYE84ʌ ;6`#*$!Ӈ1\vg+?-tU+Yuw5fb]͔Zׂ ɐ7k&JeSMkn ^dગYΰ LP q"S+zQSl1`1)FS ٠h1p +&-Ȓ)59bHEUCF (D$g=덜(~T|,-J8`rbF;-,9'xDbvX c0gS^7駖kR# 1b! 
Bpa=qiujj0I9NsO\("HsMuB+A$֌0ɝF V2!D>yon.Ɠ՗_@L="~ԊSߏʵsЁZה Dt\>_ rA.8y|7*v 6$L)"zI0yɖpmi -~6OŒGVex80䄅6qT[w]Άi?ۺ('sYVvR}rviX:^{Q/7Ȩ1#|.+i߇n {jťQaHt>h?__{=q&KɅ ؕj/m.sAoje*)Uŭ k Ua8>V>+Au{z10 .fdr˘r#V^q~.pn-[}L 6N.PHx3jO5bĜE NVFo w#Tˊܛy(Vg'x6ut-m_è¸5UCS4>k雵}e4:vm)N_zǓ/wۤt~gzyxzqU2a)kFT:Bֽw9q˝r)w~ʝr)w~ʝr)w~1Sk$4`kx#Ug˘Ǜס@g:"'%2@JF/tJÃ&ΑO^;p)'OxcR;ĬW Z{1[,} \3QW1Q*.MfFSw{[iiZG.| {Ә 7-&VnSXYoG ;sk'q^WKHqmu+ip3LZOcLNr\  ;drR B&'Ae YO9Dbcޖ [@ei(0T;u\jj_џo""z AHRVa S띱Vc&ye4zl5Ej[Q^s4NW !A_c&.J݁\Gk<K>&\4\ح))ܜ-S-S/#ЮDIZ#K3 JvOfS .atitٌ{z`2rNwrF\Zw.ޚy5xeF-2{ջKWϼ̃ǽneyϪP֞lh.1tW i՘̄1¶~ Ⳝk k\NV†IɆ#~C%7iQh 5(O+wunM9ynxQ1P=ZGEdQ0A[R.k%%RHD#n|w{Sy۝ܝ#lp_+G~Zhum}vJ$x; n=mW8EC{,˒e.trNs:<YNㄖa{%y#@:؎t-{u2"7IS)ءfLۆ-^\^X ] bEsX82)A7b++GElwAǃ;~̸b5>HK*Q 2>gHJʽ0(=DvF. vGԈaM<}z&ReòBe}coQ.!n;C뾺kDխ 6>7z, 9=7r'-m#7r_hS_pt?#P :9؛dw}uty🝃A,;si5+ãÓWov^#_glcv_|F@=_~A+O/w~9 DOО@lemѾ(<0zc:]${YPD]#o?]'Fɱ.|bjgNz),MMՁZC;_RkCty3PYymM˓ňjɭ㿎 +m/phwXSG =O;kFM9mםIe(UhwTDDG[޻f-/PP0+oĬҭFvaBanԐ/X2,zpjrmsyLo)T$[2I&꘣h!Lκgv!M+*JNA_Lo7v 9ep\^=zݷW+5s?<E{`>/^gqf ֋zpZI yzA?_.}\׵`$_=b>~O`&\p-5߂,wi%|f?l Sݟ(:3=AE_Q~ѶwPD_^oFS8\mǠ624vLB,h%9Y"JI4 @]1# K&(X *.XV$=fi rҒY4g]ʱ`Oh#rKGmQȊ8/1*γt3}LLM0L'Y`^ NJf8RHVulFId/$Sn~RWCQSAi lBnߟLB+gv-΀qnv(ǫ|ꇢvXs\̅˫՝E* ;E@/[(diXs[]pFװ{%B+fk?Z=Z8nT '\1Mx+\zVw]^!W+7Ѷݫզ=hy ,o6@׮V;%^RGGW~k9th hbvk u$䧣MvKm Dmm5/ 4!O-kI2sbe&aLd(|H 49e(W~/"]?nZfڝh.&7g@Ej3(ZZ+Ée+pj]7-ҙVϨn.~-w(f}Uh:)8rBՂ8bP [eiMoqg&&]rNCcx>01_x"v#DاZIņe]+]/lJ'l8]rm'F ps)OU_!Z[dx|'TZEޯ-k$b%Ix[F`=R|: E`} uTOdlfiY`,|kvۄvրlv:EZ]U߾Dmј%Gȝ<O]msIQDl׎::MP:4biH1B%eL?틊?˩"HfA6L3Q3?BI8o Ö^wˎ//g7B!3|S8s5%{{sengg1&Xݡ<g+C+_]!x4Sgz_hNҙi}YˆBQѰ."<;Y?ⴞ HrϡQ8i刈8]OQ)tZ/|MfNӶmfuMhQK<>^i:ImkQFO=O\=!QݲbN\5y@k5c/t(\kf))cJ5bW)Ę 8 ,N+~{"0%?,f#a&>B݀{.gf1ڏC}l>)-<iMb* 7blҿhe$,܄vcsLoҐodڲB T_3ٕba+G(>T1Ж`&y)UTZ1̈ا"* }/aI]SRW"n9X7SqӺ o6eO^G*͍$,y@Q"$1((gad4bTWA ڢأ\SbcW,xAK)QX@(15)J &]<텒" "R\ ]s,ilIc. ((ڧz!HE2iC)Ah@D"IgAcKJ[Rڒ(zǏU,VB<Zxs1nL]xa>*^m^"\-/4 !qS8" ]?`XJ=*}1r3? 
TGkP%!͖&U+}C1"=@%jPv#Sv MxԏD|\Iuol-,?z>%[ M7{&au*-^=gq'r;(bX5<.TyȐCR0^Q[t"k1P펂cJǁG\zFIcEp,zZiD<ͨqw 6t;[61d,ǃmV́{8FY%bŠVuLĄ2(]] '6yI:fz5 IރJ6e!!hԀI~^il +mma]vfz`+ %)(F8 ʺs "W)Dq8U1ZJ8T<ϱ#:$Gp&<Hq %`2aZ1&1!eLX1xDD@Xvfuۺ#NL=af81lps5kB=PSKxh Dge`9ipF\69Gce#u F5jXk20'#JA9U rXSsј.w"?N&Ӌ.k}|W#wb+-B  |jPa~X2EcM"yC8> )w1 Ejلq6McwVMICM\_k0}gcX.`)$ROn yTV9DG+Y|; TԯЗcJ6$X}ݚr2$+TO+dʶ>q K$ v'e[;kRLJ}R)gJR$t9rk![k5TtA'񇻯&3 AZ Q&Q" 4yx:KB1KN ,fdB)ʒ-Hdr0͆d:xW'PvѬR7ƦvvUi*>}~;bDB+ɜD!wB!U(i/ )9y%&뤾Zs>g~2 `kL{_z8^m c̴6fAJ,M 5zaFaJ̪xq\3#\ݽtqaD2_X~$!=ڜ%`29 wPBr o_p2BI|A3/GSISԩ-y(B>No?k.2wL~ $G45._fO6ų쇐=O$2̢ɫ!Q.}BtJ!J>XL)? Zp93g7992Zp8$9MXNbcl'd}03X8 ]YMfv@$85A;/xQEU'4y:_o\ Q&sm(:;u G=n93x7gF㧎Kn61HhT]|=էr E&^. B \Ӫ;کr6 Ǫ./mn[ps5kGp a=Eck: -EӼb =DxLĄxܠxALc`ė}iD<<~Ow,m&pha,M=4~IUa-F =u'l~RI@>N I,cjמ20}g!m_0U'ֺ̀g sS&?DՉJy'M|r`+-sow|ʤAD/_v4úO9+7bgŮ^Cb>Sw|ʤgr{4(! 1ۉQOiWņteiذ'I;ւ$y/8):$qm)#?⛽/^E{^f<ڥ @tnxb>'̻{\/,C"} =H'b`{#hwUKv9cX9vWA9mQ4iy~GW@<ߪA&Wea2l^)Y5$tve:)_-V뫁9V]דU;.m'kK wV֝\JKĦׄ}y ,>QaZ,Mԛg7! XKE6]K$-J\aF9uE.&8 ϜmuNIg%Oi64n&(59݆YzlQ3hfT̨O+u zuĊ⑕b11u,/; "WkPr]=uvaN׍wVY)hmQӥOi9#1B*!eDk\6}i#[OkwH#L9dxϫ=*1EEpCuEl{dra]E_5Zh>` CQE\qOy;8`n+99؄l6;g/"o؝<#'vؠ$696YFsIY*) ry͔6g{Kqޜ(f8:`|eÈ(;`D}$:]?L6o{-w`6Dj A{$.DY S̙ k*$0dnuJ(‘%LRe*/!Nj'21ǎJ%* Ǩ則2)N4otpRf;Ax5 eaf@&LmZ=ҔgfnxIsW4 ݇[nfaJ54+-%Wf4[J8B -T h Ĉ#6#D[c60"jng%7589RZƩr?AYrP=/ ^)ZcI樐Io3pS=SWce@kx= 8a` /1Hc=l8{YWԬ"!itiЄK .{u4^Ig[9bД邅vyT.Uݗ"N?I8@¨LRŝsx(DAH!1i!F3ʈg# j0I1N+ӯl39Ceg2ijF1̑X%w |%5X+"ñi!!fc:Mo#մ{Y+a>(J?inH s3g( ,܀y )њdQ=S9Kxr\3 9P;H֭v}eEpF Al2Kg\8?d\1?!+IH sA8Hy.HCzxzz!ͅ{.M3A*wT*!RJ<cR¢X>XoMJ~e̲<-{*2%ۙOrȧ?ɗBy5R2VL )q'Zb 5^0+suXio ,AD:1YeoW.S ~d4%%MCS_Q2vWEb]G^&8cϮMxz4~?p?+-"h˷͏ȷ]%bZ$W+w ]Y&jy%qg3G kf Ċu;x[A,d]n =~jR7"{NXazqþ!>I.BZ:SE:iqb5|lҸVU:~:̝Sαc -tjFi-aY+G[V(,dtVP"mZ@ޒ5M&&RoWӻ &aMKpVӋ*N|qVWp B3w`4`}$Y7] 7s5$[L v".7եWeT_wo/ ɤ nVG%l\mWWeҘ܍oNw`6QG(9ml \suZ8!ERjh:N]JsdN=v8$9MXNbZ%~v;JȖ|q+|H\ ;zWaxMu8ؾUi.PPbԎ˖]ZJc`2 Pj@_) Ed)Te:J4f&TF|s2jrr/9CaBre S|9"=ESF8783H#)0kgb1x(洳6m+u뚵Ͼ} jO>DLQ7LDWwӏНO!LNEDJ"&X eON6AwNCEw]|)G}ݲna}{:ߔždg,<6B+<΢ 
RlHb`6de׆I&wWmvGKa%Əǜ }hGyIPaRD"TdJ)Eϑ2ʌhcY}gǟ&~D J.)ino:!|E++k=b?Va>" O&q3~vG0}Xzl3'Ȱf_YJũZD3ʥRBN,4Lm(o GeծS_OIq6rJ.5p8=ܶ@h=tVrzHH^w1fFjSʜMyʰDy>VvCԵ]Nˬ3PZG('Hܒl, phJC\6ғ$VWN'+Ft' GtvIG3.q&F Kr/u d Bέ< 9NZ|]Ŭ;4N$/&k\ LsaBέ[>gbD!SA ;ѵ/9DSJnՆd̊jp~8(fT ւxj¡ މ35&8¸aT3j9{@Sd8.+a~Asƕ"G`*8SD8(uaaaZPBl+Յݛb:u 1)P1N$=XN6: cbʉ B6LZFM5OM;i A):~S'V{xo\x߰N3N$1 !t2%p1=a4%(%’ĘP-v6d{Rswa(yp%28|G^ݨu<|ii=@t[K Vs?h+UʍU7/,.k ExvrCO`/ڹVU,仟w%zR*|8E5&D"y\<$Iz|`Rvv5"H)ύziC:/H\p(IJХy QJ?\@B+viM>fSl|,RLYО3D>U:OtH@0X qªw{-pޔ<>\t%Cbayby$R\Зws<ߦſ}v};^׆ >}7zW?~U__i\|qp3-8(fC݂wP nK֬ 4hsL+f꬙*gs~j2.l9VWD]b46w0:Q6Y7`)XFVffWCޙ;3~oB-eoÁ FNum1a5ʟVoWӯ\fZ?/G%+w!ez+3o'mM-h_4?]{oG*vGÀ*Md'XC3E*~3"g(J56`KtuO=~U]]9\9`GXH~_!W'dyNf91 Y/(s,-3ϯ[z 7w~@|P%bAΘ" NG9FV܎\rឿfŇd~-S7-owOvp7~7'0q @bmD+ڀ\kS?KP^ʓ MˍG% 'RB_mf7O )jf'${Q$fw?_ N/#~<~LX:{bK~fŷܥ*CS~.~,oZjɲoǛj&Ikzmlu,8f@6sV> PJ"7Xq2.g\kd8Xy8y-5n8"i L;ƭ[wZ?Lo?X6On*W#zLQ kv40% jQ9!ܽzxt'k0w/|Ѐ܎~Z+ v] uDE`ĈhB@-FD%Žqy!mPD޽iC2钊LfXLZ#:ۤ'gKͷ4zeXb */ FEKV3Mj ֫w.ƃ.u. Ҩmi^ǎwaX aŕLJwa9o!t_OkM5QAc栌aRNoivcc˂|̟_˓^<ޥR|=Ӗ˷l^'={ ) J6-ot?QqV>weA:s.aθy`9u^^bXn6^RmMMͯ ߻;]'wu/;A{BtXdzɿ~aKb?^ih(tBɤi2wOۜPnnNQauXcfl8JRYpattuy (4\q9W8pE >a03IQN)93Z،d;Ymdۥo&L:X5*Cz Y@SJx->cé{S7EYV"]iw8pz5-A=(\2„ƶGr@3acV A-ATS C !}4#XKD-}qL#^`E]ٕ}.ve(m{+y#7$iM 'Ls 4mOn%V,,og t!D,|^74jK ^Ȍ ^`ASM9f"Ha'^4MhI.L7?:W/I^TLF&h0Gd1]HZ^ƾ" #$$8XG!1z ",iP%LR#1s:օ-{I%CJQKm~8Bp]y>m2ڜ^d-6xw$o$1|N^ڤ"WƳye@ N<l9(~V"&v[ҞOx%uWtAQibjA1ب$ FAɂ_F^˴֔-4O!nȞI+PR2 &Rӕ@i6(҄HXZ,3BR vGb th [2 6K> It:p) ÂЁ?'1#s&p|Qc6"`ԊS&'F,{ i #VIuHVx t)|-?W[5H8tJ 1GO2,':=cD|ZA-7~R'98 \`PhU2 Ě&82!B5,G'j?hj*!(˜0˨>GD-%QsY3/pHXG"\JlB Oc6XJy>%[G#Ϻ]eϟQ$v^~u9€' xJ G<20$UsDdXzJ z(_u wÛ",XQ~u»Dp Z@*  3"4@-,A"A$ > TKƽ P+Na/uG`/H*El4#q`b$liQl.h@P JۑL65g< + (M [#R8J,DY/H .I{lt!IGq2v ;jMη Nb?+ km ~7ũRLUpfoC_[$KRL2̲e4.^y!ݔġ}0E{d=貸RV $U8f Y"c^lm61*o畺l\Q(? 
T`'&ObiblU*j0R%0>7Yu{¯e具7ଁS\OQ.neEhofe+3vFoBm#i:#] CZѤ}eVybh\(XɒGBwc JQlmLԣ@hXHH}ٸ:1ŪjxaQu9s7~>ſ~߿D]o|+0 .Ch!a~q/~ًL `mCˮCSŶ.g=_m]Ne5Q m\mYK}ͧ#ݠp6b22El&xŃ/d_WE \RT%L"uM Q vmq?V;Q^16Hė`s0_;J0B2+{K ii!houH'ܪQzB=Ó1܋gK{NiX: N#c)HQXtX|;l<x+Ie{ӉV&pUbԘb&aIt^6}4/+n;u~?rx ,Uyf1gr:F Uu]go*͔ IUR1qV0#,RFXp~]縯hrqPMLa4U~o!3]].-L˶ܾt #F$W#u6 ) Ͻ Qg1s;`Lq5@rd4i!Xz9dۈQsLǟn_^g,OiA[`T),DPVjʈhAwmXi|h,Lcf?,0hqd$'YêQI]D8h}C.IdoeFXd:2}an+6~` S/9GO;;}7]ޔv>GU 6x>K6;CQؒ=7FU Jd7ƼrC͗mw~eHǻÐ̙s&!mwXLc 8A NVljkWo8i{ZiWhQR:TnJd)#"r~fqmo;Ï=6-*5%#lݗAwt 2kZ~\|YD>qb͸N~>j6N֮"շyy3-LHzӽ4i~3:KD.L`۫ZT OzJ=bԏsil !GX! 5dJn,.BҒ"x^y\+kړ%ܙ{ ܵʛ&W/˫wjCZ[G}s}B EUp<[|^;_\NU}EOv׻?Lh}j-v)hif>U0[7gy~Sه1%2ē`Αd $V^K7+)JQnb0SicB+ ḀhX!Gʶcĸ},⇼)ﴞjxQe7umht2PVEtxh6[sӍteVO,RMWԷ:ncG:Ct=s"LK{At?޶3.TJU1}=}[2}wq8r6Pt;k߁_>io%&윖z)ޢSwP6by֞s9$YmS_ }ɝsQynbz{lĜEsW_<|~erœk`͠i\\j$g+~S푮L|zW˶0 y;B;[^k<ϖǏ|7Ra㝫SuӵiiţzyxsӫZZ\sշϘv̓vBYG:+Gn>QpmᯚzS͎O-IsεQǩ[ u}OY@K!0YH܂#wKx%.oZ_-Y5lg5B?L_o'lLS{h:S?Kɿ\yms/3!D&Gl`W/屹F__hmN4iH@qD.f#3yOmg֌3>si( z6~o/VcS51o,ns:OSN&#GsH{Dù{ز筯}Oq|V0,)`/`ڞk?hS^yRc=m @Q hxyr5Pd+ XIz5K*V/=W)l;SOJޯlݹotJ*]'uĈDlΏŽU@:[|-_֥ f m^19Hg{ht=LhkmsU4N7'qx8V ']lw162>$bωS1,i;n͕l^iY΄%=cY+-Tksku˧HR)MQsf(2DJ ) 99Xq> Ӓ0wsCEHZ1EaEE(% :)l]{M2~X41#qrZ9 '.xH`aC#M}ϜI$ /xS F׍W)S)֔d4ƬUd~l.ՈL/H DT0P() Jl $Z0a  ~l9ua iQy]l]k1.YbrHN2Febvp'A){H0%4l4MY,YU QA:Ex=<(Yb˻W-)!j0PxY##. fH9qꪤQ0tTh1UVcc(Hv9 *,(t]hM9(x$"H&rZ"d^eF rL熅%97 b="A0@}Ax@V˽Vƪx$[4 萰`ufT$ 8ъ`;2 8`)F/ˑ`$ ` `| vwuM~g'cf^"`Ax_]b=V yX@"Q1@2.؂ !JE@ @:R{@X|,mt{0Z2z*vL0  !:U(d&!a6ԨQy_B%- H+0(.Ͱ63jI8.d1%b zH+r`Ž ì 0@́g; C kDtIRT@k&O|@X^WHG,ԙ+̀5, yQ6[bq/8#ˠF018d@PY}I+y C*jj0,eFVV`r pq\07nX|Eo֖j0v#&4Śc9,&1'[u)-Ld*7F\2M'<tRJRJ/חHycI@OzГ$'= I@OzГ$'= I@OzГ$'= I@OzГ$'= I@OzГ$'= I@OzГW'D:'+>'R+=JI7= P:!uC@P:!uC@P:!uC@P:!uC@P:!uC@P:!u7@3ߥ3m޳7 f>wA|ċF]Z<>m&hVpE`oኀ\eWڽ@t+zȁZ ,9uzM''Y+Vqhr~l)˥Rh@Gn׹>Mypޝ; J]"+Az ]-zᶷsbc _Z!Lo{/!^J;K5sk/kT8[|_qxW'xҨ?0$% U򖌹`W Dk9@ c6mSh^?$ LicߴcyU(f E}&geYY)+!=@bso6"E*W|_2Oj͔;$ukF]!` fwiJH]=AuebV9? 
PR޻B*'uՕU ^ kS~/z姗9gߜ𿟁=B½TtR*D+#ab /D3#d-sD!A;ԉÔk~IkTgT&n7jtj͒WV;г" V-s˥f-gR53=*/O|M 7+ܤV74,/`@Ī\D#J.;0{/XTMe'{4kHC0^Cr?x f5qמdߕvNVNxźY,rdS:nyh9,3t^ʅ ~Hv ʂMHe 2Ʋ(LdpTh0zؚ!}߁h9 {EڝOr*aSYlR.RųD4WC+ ~!g6OalDT9&ƚBXYlYRhj,Y#L2Pi/V,DZc`WN9k9נexSiӀbo'7jtg{_lκyүbhT?Z[^)LMtE!74nn}5W7FqYSI.4K&),3+89b}ndI&Um |Hrw&x5~}LrṽXS0j+[S -b-YQ 뜋f`BNlŜv(g;#gK9k}3q71kuxz/!Kۈ@8Zf09uyE1_xFH46VDĎփX2Iv))-ckr#X;(+t8|$fDdd ֙3jJI'-8'?Fwą9|N/?yqmxmk/KLh(FX4X2M[ J",&ű1.=F{@pޘy OȘu&3k`:ku'{A/[Y{ro1ӂי\RY:1ZtxT) ՚K\ Cvm:* z/·3 CPL#mtqO\dYI,<V2FHj*, _ m5sUvy?R~;!wuMkguV;p6Űf+2ow'|<<qKX+\?ĹPբV&mNrvm|{q,K+/(ԕ?A6[5.IoE. ޟ&U |waƽߺ?>t?iOmqɖ\ȓL=hoTi&8㒉 vt7ypx\xj |5{60sJ>`c(x _\#<[? 4Yʷiuf4Q+zU%YӬV=~<98^›)? Giyq#F)؊30!ett:l}ۻ4۽0vݯݓN'g/^:gW^2yoAY3QكJO|%)w99tAcb~]bVݧ Hxk龚շJ־:rt]ӁWR*Ihۜ{_^9?FLFuv:lȱefU[y@ },:f#c%䡻L~A7g'\ (! 0x,rבgEF1QVNyƍuY獾m5.9\ګ܀E^t PNca/tpV `¸K!z:uvoZnm4fˁ_or{?CO.)߼W~ft78t۬\w1囷 cV@&>.=I o-XzŁXeõf[k{ O՞Xmݪ ۦB1`CygOcdQtkJIjEEKt \ZQ( JתԠc.X(j\5vi7m@/v7_`||1꿟uہVvg4WΥStrֽ3cGx\ҨL+gAEyh*!Lo>8-û&X'cųe?cz:mpbc&9` y(w; JHtӹ}S >1dv4κJٺT'䋘TIՉ6tjCg+r&/CS}Z%--w/ Eyǐ`xzj-hmE/G2Gkb15WNW7ɴϵ^jw]0gA?^_Qb2wvlƴsQdwՇ9S=߿]}]FrCrrч4)geK2ςygk JRHI6rJNRA. < Ⱥo6ټζ+PmȃAт7oa3Eup͠Ӽ 62&lgZƱ"U@?t;)],"31i'=lv,ۉNyH4yD7<̒B2`f|:\ VG^LԼhZXz[ՅItQ2]=@hDvdmqI'V>R 0ZR tV塮:q1;Y[FH(򥖜N)bxЄϣ\SǝF'?q[c^iGjd~sVk/;^s|| *K3Xi"Y[ߩ"{ #͙HMixV{9c9ӓd&3ۖ-|q/`bed$ZJ]13u$gnlt+۷un6?~FHؒ*KC)uy#rϡe\yyE!D!e!x0k5f,`ZFL&Z7YHKDg;i>ݵՠ(U l2lAU,ߢ ظW6Voj\i{)݈\"r-i 鶇)7vZ26j7QKg9@^ M#ƉGA`èd h(RdKJYs|4!|r!]<1DzsL(QmGs:2,"1rbrX+?B"I^ q1dq܋!vv'{ ~?܏hhc}>cɽt svjmہw"g4`[*$TɂĔ#"ᷠxwS!Iu$=^->ayuB\:9O43kKЕ{Po_>V?:K#:萉ʁ(ɔ -=<6G atnE#dT|KWW(Q!@+^&Q &SZE8w ɃNrA=wDSVVVw&; މL50Lg >C&G*cMÚXǼdro Gt.񜵛eXM |g&*gvP'(quQ2 &uD%ԖhQjc2&zG9B~ՑV2S_]ʵֵV].^?ёR79n[)ǽ~r(ߢ2oށ3$Shf€®nz8KcLz0z'gӶ2p,Sc̱i̻MٝSH̛~@]uT6T2<_N+<{wX2ti͜XLjA-e%$S#xǍ+ӦvGOڽZVj#sj )>"uh0:u%Urnu/DHJϙCWW@.FXUJT)լ^Xpr'LpF@U“UQ|[/LVի_sND<]yHatgK1)[ r46Kٱ(D-;xefder@|D*Ѩ+ "~,*QK𡫫D6( s~]$-N)4)Ud"XKA4戄hcտ-{Ԇ *n|T^JWsu9N.,kuft#!@X^Mk.׽,]9NOA-:EO$s-X?Ov(љ$? 
dz7߯ϱS '$"𫡆`)6F[ GH(j ,m ΐia9N`.0 r# T)Jk'&gZ]%sƯK97ܤ8l6lQXTuWK(֢JH1I  .}Ԗx93duaҵt-*4NꯥWss˃PM\=M^5:)Fhd@ar'ZK(Iz yIat${0- 9K!o># \H☁S 3p\  D9.#ȁ1]H؝$R,cHƴKjbKFb$S, A9,aK[$l,iY)iG-a{k=[nEU)Q.ה5rM\S.ה5rM\Kt; rvuN@cD/jm4,&MES ՝ʙAPI(00r%xD$$AJpDRQ 6Pc9댜-嬖$f?zn?,m#J8`rbF;-r(hb41xhdzt)ߞ OzETH8t`iAHģf' 퉣N;xQͮVgs㤙6* RiiNShU2 Ě&8BKJ!ĚgщjGѺƵj^ajL=*͎+߲{w];=m4#TK*|H Y%MKOG 7P䘂*9,meDQJxա_g,P MFKF= SMÜ;cX`܄7/Qk)ѝx̜1O"qnauߎ&eݤbfP ԫY]*t@VHH$  =dT؊L[.\.ҫ4Fc Z<8OA [|?c<EDͥfQϼݮ~!ap) 0: f0bAf:혽xiZG%hӇlY%Za*!i`ȥ+^HS|aSrI5jS<񯦳ޠ]_!}?ٻ0Qgw`u30.n!A"3 D;pgy=] Mm14]>d·Wt-m,k-@~)~ԫ\r3]+.LU[ Qq/9]I]|A/hUŴoM?J>3ܥUan@*ㇴH cmc1Ң/q8"zq3L35Fa *-R#gZEtlE>R *Ip282 Ӳt8bFxa)84RGi "ն:23x&ԇ=ff[55A&gȚ`ڳg~[%a_W74AB1I$}U8} |4{~ n3w? !R[* F{7rzvVR;=Kb7*ܙCa b]S&z( ^D6jPWoΫ \ϴ "~ l0{8HLlZ^OSpp3p쒞(&K֖h:3[]mj:EM3gUO?fٻFr$Wz6`a]`O(VU^`$Mْ*RRT$#bY]^ p6YZz7~S9g߾+==hu7naܜ1Qf]Hn*~UW"eNPIb jޒ\pi>T Χoֹs+-IqVPEp%m,1`rf,eJ k9ʛu*o>qf mI+9m#?/YEr\?ba[.yofK2P $2;kGn& &XF dZu6I *H =4Z) u9h87 .Fk%lGƌ9}(gQ)7 2Cʸx *3nw3""CUƹLm峁jx5ér>awy7>m7Kf oHzfޚ*֭E3$lT}&KQ'2Pi= 4w҉C|KV2ΪdY s%$+3f&pcZs.K2*Z;տ88NŵOO|r }i}~Es6;~ Q-,(=`!AU4(@ΉT*BJ@@yGr=7ox)5]lc^{?xr"P1(0$۫V=ğX4(nǓ8]6_?g G Ɓ{:g3[tMv{w_ozo|6iӒ2{acsh`^=SZpps)އP!F)"ւ٨V\I%i$ noYg y7^-;a0nxMv*}}r?d^k<\wCd6`6`0<-|g`5P1(v}bLsyU1m҇jj}B=K6o~agYIX g6n4i)rk[mۄYNKYoq(oqsKuir#Yr;Yx}dfcx}D3uzlst>vFf ,Rݔ|-4kz/2-tmaXV41]EQCx}Ϟ;uM%e]a4l:4\]V yH%:zr xN~7*mJ(N0^/Ucrp/EOh'TfX1-|sE0Zݠ}1,'&R$΃$ 3Iy>/#|Kv"Y\uI.gQ /JC̉F_5 /Ԕ-\z-aʲ5o~r8v7fw_9~/?6?})Kb-O_~qW:y퐾p&W#DM.<=/VR@&ajX=!UsO;'a,PXjeeomTDlT蘸7W'%?7ʟvxQ,eݪY-CZ8ųYH{tQ]Mj:BAu*O@I ƛSy3zŔ9{VWIg=s(#+ 7CWR}+maeX o$oZ+{͊fm6朘Q`χ \R:SR7o]K]=vfl*9bDP $PbH"p (Khp+1'M4N 5A@>ƪ:%ja6%"Np[4F392¡_2{d5q$嘮u4? 
[vNݸP=Ft%p q{}Wsft=‡w}{77C7 ~MȝMSt΍h?.P x5*qX|wl5Oz~l\#]hRPno~ujl0>=}랎g GΗZSs~BxhH]0v1tu w<ӷu sޛIqCsuw:jƶU#8wWz͝>Syr5Nu !]֥H Le @NcBkZi*2o<]hovާ[mN> wl6h6ʘ]n||`ݥ,A$扡HN/#QdelXws4j9Xwo۬U-<`ϳիH 0O]ҍ Vd!e );a r+k~q Y4q(%iq]Kg= U+xeDKH@I Tj^jӧR,ѳNYyV;!D'+L'7"f 6ݝQ$ʻƚݱ&ԬifRTPjgPjJrQ(5 F-#,cB:BQ(5{Q(5 F(>gqI8I'I$i$q4NI8I'I$i$[\g˘qcI8I'I$i$qM',gTQ l*Jq>E%(Q4U%RsFRPj'BQ(5F(>VPjJmk(X(ZpPjJBQ(5 F(RRSM;P(þJG<  ͔!<g.fR6\fr~ ޤiw_FBy55{->όdiY6Ch!d@+~{J Ƭai@5Ifx.@I%ŴH&4g9h4}۲ܐ֐֐vL5ŽgA}f䈻?U,?0_Pz,%Ġ9ɢiA"g~`DW.(w-ڟ)'|TJDh_*ȔZsJlNm yz Y, *E=Xk,[8RuPFkr $!xK\1%xpJ  `5q85TJ<&,t&:EЛ@PF[KRH)8C_;˥ìBhoBvZ]JqEXM@4{naJ$@26!rɬCp%s*F]2?Bg+$B/[|v5,hc9$)P[gf'4 DN@ Ӕg(`OZ[*TDMs#E - *fM>7Z}9 ,͒ \"VoGe'R)|զKg+wy <I%T!/Du!gb n=ɢo'^EtkHoH0y@B b^kY3޵'[kyJxZKϓRӪğQkZK/~hb\ƗV g*;'4M2lWߣg{|GD^*˧g˔$KLʁ1h0!M30/xR[ȖG=! Z3F%cB煐Iĥh-}Z:XMOf_bn.;&.B\.3׊?uի[p?O&pn{TqsuI=r~teJt#ڎ6L⍃Ul 0+8u_FMO +$Ŭ-~p%j$[i޵nJB_oֻeˬ!af(`bf|!zzt4csU{n}U#|o: K$_&!O}1E镪|Oh5_!ZsǤ iFqFg'H?o߇폷GQfypt5j3A' Z׃XZ4-͍`iF᛬rϺyŴC; K2_^74N]2AuU>ҙѕu}F6?UWNPed~BNc+B Z3"6|qtݖx$O]''0xx;h=wfltNb.2^Uvy[xːDz8s>cSR2A0${QƣRtߴI@B#84傩DU p@!CT ƹD0|QO\nK) 𙍧gO nolDuY: ]4]Nc:km"3%;pp6*> $UtH98@%2yU`iγp΀g $pTkdCJ }0pvw,.,=Qض7iǰާg+R;7+e>pZ(k~]-tڠR-7N2+_W z<9>β ܲ7%ʫfr>;U㣯W'y޳g @{=:|Oŀ)zgnQ/?-ɤkJ?@#}}aVdY7[7*(4$idZa6j͙DVLIǐK<՞J1p.!&+ 7?SAivX8٬jNs:Y G=Mf~iXM5z|—×]8s g,g]81cLqg--_WL=$BdUzPRC#?80@mq7NG)MRFHu|~ҜnP|Y NoOw1:}f@AX; ZώÚ=s^|;Ό b3QX]`Ab佬a~d8&nƼstN&fN]ޱP S=2D_`dk)uKT/uKRG/KRGMRG/ _:⥎xaRG/uKRG/uKRG/uKRG/uKRG/'x#^ꈗ:⥎x#nEP B3z~4*!PqQ"$@U$\HQtgK&3[LqUW"cah"X$=ޢ`: bDfiQ1/\e82EZ !* -!1@Xv^89_`oe-Wɣ쥕'QB{QQt ,f^(frqfS=GO|Z}~mRm@;ove\km&X/=PŁG¹")P^mL84gԩ?AY/YN}r5&z'N@˔M1$B|Hr\3+ ҄%XF 6JyU|68YW&6cc.xK#M-M4)ܠIrEE|V IRmO lPLߒ#26`n;G'!#rhRM Z7TPRp9)1{,:L.;Gr'r'/1z1͊5"D8n,S?lyԫ֐Y!]BBY b1×ғr*_9P~OzB㺶:0VDg(?nw~ȣ{tZ@В$HOMJjG=%1@)lIKzΒڊdYQaOKäPHB#06sR<Ыfj3>jV%Q m4Άt"K3O{k3 T:c(HtuBg#) %m ztJ gZ0 Jg9w8b\|:/h)BU 9# 6B@( N$T2XGBZ(v/npǖpy/0\f݄ 85whcO.rcbՐ "w@2Z Eg+S3&?lB6a$kd3IDi!ᖉ W6G?Ns L(IH2cBrIFj)J-@Guc֛!9&fA9Jy!7os&~iskO=cemj<4P]4Rì|n>@s@YR--3ޅyPf W90;v{^1Kȁ﷛I:os 
S>}bvx`8[5(NݗQƣ9fJ yys|p%j$[ޕ\i2wY%CdͤQŬ'Cjhv,'2r&V۫FSU#tH>LCb+U9POh5_!L&=lWatvM}x{eGo_10aMԓAIPn-,ZFl4CzMUCng[cb֡fm/hrt:q*\LG⺉U>`#[,UߥPeD%T4"Ġ83qyS|V)wHmIYQ''0xx;h=wfltNb.2^Uvy[x!K9T8s>cSR2A0$)720H^z[|55sMRf'䕔yDM><AvKZсZZh„VyY ۘaRR$*-<ך4?{ڛF_Arw]hnFN4JYQ0xhpnCc)G:Uuu!5`_]h7|eIq0PewF\`_MN:O'x|vY7m}7_ *G!VtW^iMΤ5wkI. 0ַI*jf_,op U%^i\K5hb >&\k.1Ic7@Va*ڬW7țWS~u]K oXΩ˯mRzvJ?mNTnMAs dibY% +1L}:V^.pRZBH 2!)`p~\4s袲l-z?E-멌{wyS3yu>֗Od_~ίG0ݨN03_ּ s.{>:U0+Og:8yJwc9V׽؂ݨY b ZrV8AVׅvۍKg[k-0cP0vY4a#A29H蘋Jx->c66c -m^wpڜ@{XaJh3gzT}|V'V]PuB /lP"UfARLӒl^{d}&3g4e;5Kf҃;v/krC7|]zhj^_p{^kT+F탍Mvm%ҍu:8g0zafrFi7e Gʹ0}DXtHg ^ qzոOETCFc [@ea(Q;ulI`]7C*؂` j`vQLJR\K.0&SIDySOg!imw1*AvgBN\>OwA>x~˶3-`Rb [gDL a*`sQ!mBd =!"}CLIJd 1Y,-,xGI4 GrTȖQ!S{Z4qA7 MrBjMw_fbmɳs;J{kNי2V!e% MTYN<̐( nӇM4P B D˟B{N,7IRX4()?rr7zL0i?J~R(\yQf h…+^$ML8΋5MHF!+-$-p@DHcZ")瘀6.f(r:iEw9lVJHH!O#<`J[6S#AjeDsrH(lb+u6}&ڏtw60Lv_#{=JPz}~ق2%7iQ\ f-@[}]u>>#e<9#6fծ||f9z*g=Epmu#%`3hj<>dqOJ#G=\ݾYw|Q}i?޼r5t]a6t'W7{/ r_8nvJYI2^1G$Ds\iO׎uCBOIpH]4-!U-ʹB? 8ЏyMhfVNٗvX匒GۼYocYEC j2XQ"q"?BB1@V`]Z.0$5AZX,(vGEwo> HB9X ˽)ruYMbX.WN;˭&}.T =Qʕ_x)vZb}LI5,x#1b!0,vf %Zʒ| 9%bdF/cL&6gl ՜E-™8qڠDeM?_{r\+Q*F&sxN0+~e) hLBDN6%Q-$= \H☁ 303  D9.#ȁ1mP=_Lc/ƴ`JhbKFb"$S, A9+a]*583eJ˔KJ;j Q(\GVNpA9p7wsńYj_H"M԰K2t,5z Xcy9 $X6k& e<+W@ZB0.zH#ĝS>CnQ|Pm"&6h1Np!hcTRPm50є(pq 6Lj b֚2cnQ$tBI @*P j4!13ˌi5p8Fikwdsђ _14Lv]2e湏u Loo3DMpf5׵l8 HBquZq3*TZY崁?-DaA".nT(fR?^g:;Fx/$3A1XSV ́Fc@bR‰4:z8[3Pa@Kv)59bHETB@5p֤?7 .CT X*ԃȉbK–#-}@Ac<*jPx&6IQ.֤ESdPƈDlu90>Gu%vMJ&+f}/:;;]]_*2QO?=Ͽ89&;-H K/@Hl o[e[Cxӡbk Mp,:60=QmevLVD/oonRU_u[d&Xݤt%v? 
d~Usi /BtgP+_fj@1t ~$mGmE]Ivi9I0Bʛ+{K ii!houH'l'=a0dp࡯P4XMs`3†xHGBxB, cDřuZ99p⅄nrsa:KN;U9tg?;a~A/L{NߦK2A1\h!R"` .,)c1beP ыM{gmH@h1fK ~-,9<3CdQmmluηm瑺y$*R1qVd?1#,RFXpiH{^@3$+Z2i3)cXk4Մo3ϖKj {U@K~I+.Rܥsrݛp5g_/~cSe,ri F2au4:^jH2e2ӟqYa8BPϽ8Ӝ锚't@31VEcÉB`T}(,D`Ԕ1тFQ4A!eLDewakZc{?hV Ϯxجj=Znr>37rL%O_c`2m-eT8 E MGEEᆆ07ǭRӘG"h2E 62( F刊`) )k*:Nk@d{5dK6EYtٮ-XiܲLyRL$ ߃L}zsrh>pR|{RMCޞ W>M¥_Oo0l:[j{3 OzOB[%wސ0( di BAg ]gnbOa,)M* .v|SFB˶\k |Wޝ[a*nw\oO-ܶr7՝ g^@껃Ly|w4(YMymP\y`vr3 zaJƮ4:ǿzеP1q8o@":D n/Ʊd2&c(Y eyܻº;nmAɭJxDW^#{u=d7`)eEU+.]L.`lGopwO~#7=t#}FCЋzVM4,fڅvhW>*6W ftQM>p3"؃"k/%%H *XZYh6A7IJ p7yPǟsnu`tDZ0b4!Ynxty+yl9s^ =7|Zlb{JRiNx EdXivJ:PAh${ɭ(ꉷ⭌͗@UOS /y2lw,iKYks w t8?+^6)sENZb3&KƸY⏙`9P bԤMy<E"ʖm:(Fm}ݵېgɕ4LnN>(jMFyeܡX"3ՙXbR{,1AQK|D@OP:Į=Qv+*A鱳+BgW] M":ĮRw]%pug@K :vv\.aWzS`Xu] ɮ-ήT5+7C^= ),/ ?7 GcIEoOS:M2f̘>3L9"!N y;)vZb}LfQkV iEJAcX5 k'ޢ'm}(ma<۱y2oBIu2$ 3** gr?d1ovN(k6x?{33inpIih=[={ 0˾zM\W`RGQJmƌ6>f)SDc"kihmAޝ+GJ`5M;V~QD=#ݢ>9rmײ*TM~, BVn$ %^MQZAKTOP" `6pnvyO\J6~{a^;:n:ASD4p,(s8PL,6Xt6(,r!ch@L> 'H Ⴑ>rXPAs_4 10ڒj8XG!1z ",iP΁"NtPGcmXC@zJ'u@-6@D8c~6eE0J!p$;tѕ37 Zr37̍nW$l߬'L[.\歉3Tf*(ͼxS"S ~o#j[j;étGeDD\V90LpiG0ƄyTH"Ö,jXE:Rb&y$lViǬQFYJy>%(KȳccٜKI;b /A2t5uo/xz[YYkY!pBg>.8/*_ZF+(HDdX\HAa ́(p.dnt];p"))pP* Jb,"BDcPB+M'QOKĺ ``$"6O 8C 1V Z$D8)  ߎ]%v!$9nW\@Q0ǩ I&PB!zBj!B4rU :N2D5"G(zImRlVϐ##*/cbϿT)IQ Gt2I܌C>.l0RxiB dX9qҠDqo!:lN ~ZcցGsP8 c_1Γ b@"#^ )(ԝ<Wvt{π ^Ϊ{o%@̱ihMH\:|a=US p-lНy£0.J`,8|)\g%ٷ|/zlvu}x)ST*ZFe,X~)T,<4u2%LT(sDA9ŃI-\bZmTNo9p\%!~?|;_}PygM'̪ixLC2CNqZ;݌tU?Y_Z%2)z:}5 rztPkYPD,耬H]Seu wwxc - gVd#5qT( Hf 'J m@Nwon]fsz1jF2uમmS M3zT`8-st>o|v3a?gg}tGE HN%Znu Le. +%(%U҉6`\Ӛg0sH.K)!DžB? 
[binary data: gzip-compressed contents of kubelet.log.gz inside a tar archive — not human-readable text]
i5:WaiA3Dj>=:uelUww6UGIns2EҰ}=29*6fz֬gQa`z&KhIp/gClB8W ~ bi7t1i:Ō4:z8-Tg1>,9嬒lyQ񗐥."`S-6>0)M64<ۯEAO4I @SH}ƈD<ꔐ>ra=qi#=bIjZ.~Q ,&U?uW32dpr.pH: d$AS.ɝZ `dBC^qdQeX̞6]zcףYr<\#xymOgdo|7C3#.4#ءH*|H `Y>S?a:u"*~&:lLLvd|\ahgrM$FmQna)!uv*$$x/ ;jOxugIxF+`?޿燷?|z'Lԧwyoa"8.ZHH<_E ' -s -un˸"[}$> mRvJ(8ݨtS~yWz˖=N5Te b~]Gʁk~I^Q}B `@5f1 ڔm6&xNg'a 7n'94SY#h\AE[jdLKA{GjۧKv ~u}|^7`tt蹱ks zyi87i8xjp3v >+eLRYR׳{d IUjȪЯ1qVd?1#,TV aKF ֞zm20pjzWZކA򼅁wWa2X@ECA.o"Wշcy2`DVہ## I;3ZlBmz-]i 朗>@ѰeQs {"k!3=/6?\aK!ĚUZ^A;|^qI- #MĻrEw;V0e\"Dž`T V>=;gzmFgauv= vl>mx $vp*w4L[K/v!-lюkEQ@cb!: qV vh5.&PfTetrD`䵨Ri;89%x' x>.b}٫_Bk50[lcW Y 㵂K,./kGv)ۑ#>o;s#-'+<.s ٯj K*'V &*eh\ZNP:c:(O\c <(n>\mJg)?t/c<1A 8[,gGE~%vj{471&#"ҘM$Ⱥ!*zB'q $ 0½F8FH|fezM)BQ4`9eSMzU[ Qx4QG|+#yc&3 =!"iX{)) nȝb΂wDðq$bFa0`ﴤ}gr[wWݲ@w-NIw@$io{&5mHaJWx< &ȓ`~(b" k0 ^dR_Ch6L=y9 ^PM~rM*MhLU$P=}hJ?Ol>J۸(LtuD NI4UPkKl~lqVG,pdxsI W S >1#yAդt0/ӛEV4iY5M#SϹXRq}?ObeeJ>3KҒ3ptA}׾RnSLF[yӔgaJ7W'&֙zE plRsfd4IrQaLӲ@M j_AMxUlʪdS6l3`œy'šaGY4+rUh%FRP촋K&؇(qZ9Km r\e*(Ý7x;cd:.X,,@HO\!A \Zr3ÑJ/KeqV;1;bWqW4e(X!} k.PJuEz7]`B 4-4@`%'2%;:2%(˔8,SxE($sdR &^!)ˍ *"cƕBHc$+RNq,zq ZDl ha[0z13FNOJ/SVq\U%sam>[}rs4]g} uԙTALW_bH튒7Y/VѬMrU6fU pk5 u懭*7 |!ܺNѱDi*3ݼuy'KC5j'/K?ty(23/%?⵶wܩ9k2 e & ¼&BP5;Lj0b%bD!DgZ@$͵cz,Q^y2qSOX p}Vd(C0!A*d;30FMFSpLHKDfdMWPq6˞ZT?\2t3x6踹,Hp>kt͚/.Ϥ^Z]ҙ?{׍`"Y4`&` /E6HZq^oպZ-)>i2"|)}-\oC2XU, J9;+=Z*tUֶUrξ5F+Rtm& p.+YLxs6;~ɐ'@"ǜK]l+T+ v QY}I#p!$0ȃC)2DR [.!V.8dI5=}kWyrgl.bO /)rֺ̎@I'O>]V{㌭mV*DzZCe"1DQϖ֦&S 6cuJBĎCɍF{Mkd)OrHdt7N`5"r24j5[# MEtT&eن>`o_iB\ DR%.-3TuOkqOA=& ցߟ.u= rh+3cnOmAH$/\Խ+%YIgSΞu3za|Yy-j]Qd쓍j&ݰDjLr1ykm|lcXmLh=/Ɔ7l1h F ?|жu;l:C+ T[ _>^'[-)tMp>>`ϽrTC[MV=)ɪl!)o4FW&"@@mAH!gKzK۸e_g@x.dv1jف)HX[_lEMe.~c=Av23yX,Z.W Oj 5Ad XWS}i)X'̫CLaDl e0SPkQ)S^P[nS^7^x@A ɶZc!XC;E)$] an 5tVCu׽dRh3@_ON= dk} &\ٚ[ |+.s?s'xck-P:lh@\]#+;*U,Y$uBJAn4j: ,BarIٵ_g_}EF)!WG5`#ڜj$k(ʹ֌E znYK Piր• f(jM`U-xqіqt[qXe-%B;nkr{GK.>ӌWT:99I8gYLfv/?n`M&6H'bGs9dcd/6gJ>챈) dצ9脈QYoiC1jM6v]%L  *g=`ZL@lֆ*[H1`tm·ęAaȂ f[ؚR5{@8R.͉ a鬷sR XaEt-"NqkS>yܼ1CIWLY8G2Z{>ۓ#:Fd&z#%faҀ&! 
G6'UyWUg붅&%0..NvqĻ'ցn1Ԫ\pf6lR3daM! 4o "D?Fcpr<qpv UYyI ȪAVCD\MQFnՕ,2痳L5`!SԔ=6{ꁜMR_-t%[|[(Ew(Gd# Ha/+%=:$W50aУ(nVk1 6z},{Y@qe5*Abы:8`[gYw}U#:B&}GDs#=5.bi׶ǯmO]n::"hnO:&Z gGtZٿn^4NHpCkQ::֥XI6IJͮe7./Ofޖ&ҿS>c4٧Sr%cz4\-z(*:g f̎Xŕ|88ZGWD+x-_p?誜LG2a?KdPV/G #[7\IT5[uT~ 9Tz b{O#iһZx~f{3$7nWb׽ӧ{FD[.=s2 !1)_JVV{yM*G.\-éwZc^}@5fAԘIP^atzc **P3P tBvPG*% ί:Wً!3TL`;|}n6֋]0y;ɕSA>2;ud]?pnBd|}8Ogknz3f}^47JF^'W<w'pqv~WWԐ_B OʿqI׳Ӈ]o~{\'˿1tm#|ak3}?}Xz87lA/]P2Q YYՂUcEah-oъlTwsWvmz on8Q,l&SK9EhdH J WcΥ.6jPopnڧ8dk:zkO<8"cN([/\ i%2:3ptHo[>ds-Kǿbnn7-Z(gr jo=ihs DTlo-2Rg;ܬᤵ 6cuJBĎCɍF{MkdU?=吔% VKBz Vc)"'SL\շFA$wLʲ e}8)ͮ˃@ I-1YRM;!PHUq6@דrPp5ɝ=hkd: q{?%[x蠷* jر-dRNyFWzR!ʻ@o߂^b5%=Ys2]{(^Goo;YGr:;~Rygog0Ő-ymʢБS>*Ya'ȍrw-'Қɉh. iZ^ zHL75yBttU@ YCe6G"[hz]#pSJψ5"WPK1A=Aǧ,)\T'鴪BMV *b/V*+zxj GW@@-6bmP8 sHLJ76Ӻ u?gegw8_1=$Rٛv8l!QTbcglG^'qՍ~U.ď)Dh iaϠP \u'T(LՖhNX)1L=3Bg٧χD?<{?mK.ucuw^J|5FdƪTqĬ!*ak!ܯ4tp&狖au@q݁Y}8PӋϲwo}]6h30^U)(;o٭[/Yw5뻚i79I&L/Uc3gbZ0녹jEn )! SF;tz[eNuë^w<oњ}T+ L}U8F'UtdVC@^>BN'Ssrx)jye A\gjڞ5JS9ftܕFE  Rhb̢$ހw˶dU2v%vYF;EF&R:RQs!"1([̾T]kiD;?BKޭ95υP*h#$ b6%q ⴃTFĹ*aaW3XcIZ+yc;>MaLf'#vӷ]PI:Y+]c->ڠ 14g/ gނΓ_ŷS!V# U]oJ;Ϧ劋y(x(jcǨ =j OB5eZN:9|lRH'B@n V|Fj/`MaEΰ 8 F)jEYũf3Yxؙ8ʩ5`<Dl~<mLj="xgaHQl̍TbT!tEn"#HdA#vǠr7n}#v8mhr;-" zHDu٣ ˫΂mCuv6Kc\t=.x׌ѮO67bUcnt 2iҩjE8$q)pPP< :"_Uӻg~tyji [:jnC C*8DH<.Q-=SspM2H@*w4*VqQ!eVlw\c$]l.n|Ѩd:Q3xb$K u03qHt =s)ufؗq4{Djew' TXnFIcZ1 ~ 9PLhs6[UɰCQ^s;meקtrE{LP!'TvJ[QA皒X̶\+(n@\UbBO&de"߽ؒ2Q#2Tf JL{z֙8{J<<92;ki > ELNYҚI,\E_Xq4)99C䆰Q#9*Х |x-{sšk!Tlj(.mId],-}\P@ݞjuFh9Oz!íթ`T8R_ ;uDRUV")cLNU,큣X` .~Ww(] A(++r͔G;hbU+Ua½uO:`y;8V-pc?s,n?9vRSY"-{? 
6l.U4xV#&(|;XiYm'^BtweJ2Gg@Gi<|X $7Ƕ'fenUY'֓>]\g|2w]rwmtڽx:eSQYZA)ɓVXG>ohdտ1+76V&wTj5oп\ك-&?TXeu惟[^Ni/︹jF=&w^(u6zVCV_gsj>_zA;Ao1qQ'sOׇ4>;ͻ/?_w?6Oӻ7(VLr@#b鷯w51]3we/vЎ²O?N_ j%>oVmfW>jNvَ?&țvzD@yoD4u5C!эKtq;xU]C X(NJu(-Ԟ}uFV3ЫD3KH *bKƋcKsE됩R5n7'q 8?lmg.oh\&:p 2ʑ )U;v&Ξay =>}>c: =:?lW7;4'tzT|y>gcu?;N8, 4Ϛ**h:bU6&JZN55[H҉SV=X3\P9i-kQz{qgܯWlYw^{y/>źn,ɳ?هJ;uGkCq06 o~: xh~Us O俾gнl ׆"ۄV%[Qlu+4ٱ& ~(0ҽՏ8%9a!l=nxow/42﷞Ros娍Ƌ'ȨbTAp'hjGCIkw RP"K#N3Wq}^uL0 >`@G窊0#u,ރh8]>k\LM3qi.G@mq釅0y:ٯz\nGE94m-n㦦vðmS /^9Q:'䢸" ,+EZъOG3/OJ=;G׆DCTSك25>:u"ӪfjF 8fLTڹ6jVi v1sDa e嚄'a~HD=i|avZg) 77@qs%iHujJ6b <*_H[y\a*?uT\ / q +ոN&S%t6]mИ՗<<<smu7hn­w6=b,)X8*-:[&h>jeߤ=71;c9['e&-I: w;haϿţN]Ċ\=z<38J2>򜑍1xD)& G%Øy4'̖c.zұLeT 7UۄVqF.CPRF0h`ijG4sZ^UQze|t8stğO VƅUxP*_V/TalszqYCynbhO }f$Y`y6tXΑTTNTus)P[Sttu+7jEUs>qG:ЛAH "s&`|) NF8$zK68D9@лA:^Rp">,s MF1U@pML!4pk M)3ևRZn]X4(D9~^^#X,2xb]/pu cry)/se8yWYM6c@&T ty'#2%CLZh\$|9w|f:v?I>[0s^,vޒM7˟o߮%Z^[O=Xߝ=ʋ yLxw9nt}}XvSj 6oߝ0^Yh;{O;})J<.T%G\JZ>/8<هKR~Ks9,jem`B(`-#u/|~lfa&e?>-&Csο/_ɋۿy@!nP9d_gKʎ_z OgsuiF;hgh}1Jn]h(F+k;E$+|n:Md`ep ;O[SBhuf@fId-c!)H3%U~}2%ٙ5! 
Ec9_pP~<>ofKAN+ٝպJƭt ;{o~qKN{[T{85X:UtW3xܤ rhJ) m#'֓m=;+MT2 炐!$ \r9ZBHmTA "C1{ThxsR*2'2O6cAȰ88Ah:kYA0.!ojG&p7S>+o+{Ys)|o=juJoSZ /,l7$`ݜ R2HpB`ܮv~q5Lw^{=ﷸ>oߚ~m{o8}\Vz=O᥹xvw k&v'i~y[om$Kvz5lCŌ 0Jp囌a4ގқ8vY `#<*}œhs礪q7Y[4'k wm`vA7-r%tM{~gz;Mf-}?0iq(X0*udy3Zlw׷㥨7돴1ƃ9@2C7_9pA7)Bi*\h-7^b@v(o5"j%M(%Yj0UMtN^ٻ*E?:zp99&R8n}Z^A^C.Bwd^gK!2[nmW*½PtT IO8Li\9?x36Mrw7W?{7}]m!(?9,HioHȳ=^taX5WA 2Rh0)l&SSEF%cNnpXNpcoJHYCZXe8e@XӨHI$=Y(P %pfN(` v R;"4qJQAp J-<'ϒ&΁16J^<17Hr:\hgaqbO\Zf<t䌱 ,gi k.Q+d`-$s!qâ4&gon dEoCpd gڻ>x3$`dRrɹM@Y RMn 7Io.meLV\!1:bgΛRD#):U͂,,d6%7M:A9#]b6,8Ҝ@CV`Uybrkoڔ+1E;׎ܛ[=6CKԻ=w|8O_xu3-Ic4j8җrꠢ7pN6*0h hX f,l1.sd{=b>h&Fp,u}H*ZsA D&Fv*7YeE;YX#L]ljv#|ySw;Aec*>{BeɠK9¬Dc5 l@΄ԃWs l HfKVYPZ2 !Aht*FL0Ђ'un+*҉qǖ.`Z*z{幁D|\t0FqbFr4wxVj2s"[eװw|4mbK!jV58o 4ZHcqO.&F#6s07LFi2b$h1ފ9 eQ(Ib;Ђw(Yx<3B{F dY2 >A*Q|Q6 %B2{{xIC7kKl~xX; &M}neg8'}uC̗ e7P=OlnaUd\RL(_ Q2R!^5%ِitb=}jGAgTt h!e!hAvc湸 +qgA0%PB @O֞ũ7>_큶!qX=;%e"9=_!`>`uъeʔF&IG(\.ߠy4(KPcHCHu o'B`3:g~ݎ{D:uitb,|v}!U0 ZiEV 0)%4*xrSNS$Ll{`%{p!-,Y{f`Q=z!oPHS> [#FnJtge05n ޚ ̐vBh;!ѽvBC|'BVgYq(>{[wxLXNsQgJɄ n'Eew)#H\?SH_\4X& [٢09i%c)01a^0oGIYSB<$VHMtk;gFŋq[.KǛ=RmJҩfZz=9ګ VZ᣷锍rͅ^IStRhS9gn מZrq%y70.ed+iH˙`&Í k2)ښ85c=ROZBYY^Tud%>x`s$λ?<M? 
&ӯ\c9mr`1Ș5w$/f!y9YZ;tDiNfEPbV!FQHl;|79bfѺٯzVL̮hjX*kmkZ HK:Ap\Hk9٫mP@'U@.kƆNsE2!CȊ-隄,ECt} eMjOևٯ[~VcW4bqFԕ55bwCDX OC6yKypg)GG)jɕk{3Vm\F0RQ0)4h8b$KZ(+Q@Mq]MhU.^5^,wYmmehz]+^{ `uHJ-HrK" 8y5K` KR.䛲8\_ìc`JuI(!M$68E.cg#?3<+q@ 9,C%vpFr.fpG9ٺis" S!zBNbȍ!p.G`v牋+WceuUox.zr0?=T Ms2A&p9ͅZ[<dz8gLLoP][Q{w=&LʻqubWLq> i4<:^,W; AFr>ojkG0~%Ι}ӮiX4uO*Q #'̻Oxt՜YN:geY욵kLf;,%$O,}~LC|+U9<,Zl15w 'l8^\o7x6oRfx w`!Ptz4 ?A/Z55ηԌv÷Wyo!JhvV'k#@~9z1w|ؼ/[Lz b~xMGhh:U&vEU8 1hAft+}Bml#]%''-*&ߺDlkOdijgM^<'1qCI;Tyu#=QЁ/f19kSr2 wy$JE qQT.`,uzy3پ$1$1]ILW턖۝zܙósYURw'u#:XTmmܪBA/m}OQ9=2%5PjNLP5dUd/)%} @)D'RBG!&%*EKܢ'S, 丯IZ8 Ot׮ .mm똴[{&SjIdw1JjAdl|N<كо32ZIZ[NqVP-Hm%{N2.09(' 9QFTBY)lNSGJ)|$WngHr%沛-onۡk㝆X➭i_:MTlT2=\ Q Ii=сǤH98@Q%U2yU`iq@.5&'I֌ɜB7r[@Ţۯ(xQ<#tP=B~QZSƖ"74~${]>FSw{"ڃf5W42o< .{H:DgL^XPi)Pڧr7{)_ lna]n&oX~FH,!b[ʘeeq/5)A=]ЂIU0RGТD#d^nٸ(A1p ҠLno!ʌ9q0}%T4x`ЫbnW!F! VfZs&Q89S1;OR@ %' g9+h5~)^q -pz בDuq0Yڻ)«u;_tg|Q/g_(>CPB)'e0F%LK #)z%c-xzR<8өz4 G9ՉDK(gRSSBrMxT4O/'.2m$tCq Ys|R ,I={fXCDJB0` #^ME }#Mp}D=(۳ ?Ȟ>xCz6 '0ոD_;\-5cSHT`P[Iᑠ:gZAF\P4HADo!X  >OfU7ݺ5o!' 
5J?T瓳S5o?V}GfWi:9s&Wfr4Y;g'G0ErI`C>-rZ Lkm*/Y4[0e%6k91̇hF: z ]}3=nmz_+ila[Hbr Yd6Ab.#FQ1#*)?&o[=w]ᬙ}Bս8ǜ؟vȭf Sf,Ҽa)sh9y@@a" 7;ҿkф>7:;v +rb f-˼ʶBڷV[c%J<|%5֟Y8/JE&sc7-J,W"[wU7=\QV8btVo'L6imWTkc^Ȫ@o !m;F53b>TEϫkDvQQ}\}_AG(!ԛw<¢g_l/Zf4߱1362B}xxvϘy.ΪLP{v]¸ulP{}O`xONxq0[YΑϑceo|-#\۝[zlix4٦Zd74p }କZR Ao' 6$4_ !iC {=g?9g{ЩΞz~JkA}:xj7 YvN:[/H}FfN#,J90z{]7*/D½U_0ņ-n gѰe_/L/ͳ ȓ=`X`^Jk,L&5M 'BTaa!=BT{H\T2xB WG,E#ő2@3xaUkvn>6,/棳փFufn{UL|W#K̅jo65@8ѹZZ]ttJG=U'uv!Jf5y2^H%%Kђ:/_=#|њ &eJ,Lʈ[tb5%Q T685{AO3mỴO7tc̮(ՠ-.30Qj{wa 4h JH+Tm\*PUk +RYy2S~EsjW9'&m' z6/^#q8@(22P#gJDvB`IAL2(<{RkČ!6}f@ K[AqmYDOZU[j2@ѐX4xdB pA, @xS7S+GO*~JxF*1Xi9((Lh"Mxg1]BыثjxbjHi{ |U.M *DV'پ_o,ٞo Mp t2Rn(*eMm xRAI礈G209CvNN ,-́𖺹e5WK;SPC"',J$XjTf\$Z:C ;^.E9~ue6]7y1>q:8#+ek]Q7zxHz@v+E~_x,~{jj05sB<]C~Rx߫ɰANj).ފ*bff?O&g?p)Ļ=o=IBɨڣ/On&]<6 /n$=mݜƿfGDRG] RJu.uԥQ:RG]/4̪+|>50_'̍|Z=ƕ|_a Tj2š;U5Z[^G0?Ec Ͼ6$sR(lNxgRdpzyU[[޷vSq1W^5j |0:?U[Z5;x8Aן es&#X.cQx>^+:z & >-vi4cZpk1IL8*pL з5Pjd>5G(+jD JZ)P+j@RV JZy(҅2,P+j@RVJ aZ)[VRV JZ)P+j@?@RV JRV JZ)P+j@|E&MZdeUN{WFJcp!Y`^&;W[ܭQvWnɲdˉ$vYdOUtx=ȴv 00^73L(Pvo, X1qq+ . @c<9$2l)K͢&y=^C:Rb&y$ )Rk R[g}Ҿ2@}aMhu|{ܫ\a^D<20"U^DdXfJ ڴQK144܍*+xfVNJ. Dp ̰0wZIp@#E QBc=yD(BL]vT鏂^RTh<5G`b K°s) >Ng)8ۿҴ"Չ%B` + (M WFRp eGEWvC!J*AVH'G(<Y( oW@ڪat YMm9J?/b )}@$9ͮI,+oj64-^v'1|BIqPe4{0R|$x[c"v5$rf#Tj w m  &hp.piǷjfr:o*8xW[;S4B1kdbt~.%`IVd1fSv`dWۢ7oڏrE p6nB,X4yk,!azvyivGm)6կ͓f#h&ť4*xt~ь,w#AjV90bhǡxjIΖˮfHg3t7PLVkst9霦Uַ:dW]}`bj8\CR_iMrUY &sŊ5jRsMeyo?8_~1w?~8Dهs#0 AA'4-jڛ7Mۢ醱^M.w@}wJ[;#XYk4ӻeT>էnl(C:1+ 6e]42VQKR%L"B*wkA1k.0EyX#O^)Xѷ'94SY#h\AE[jdLKA{Gj]{{Oնoؚ{OyfeC]iaVN& }V(k^iWK/fҷ?;nmWhR Lb"F\*v(VT_f=OiDIEr(|6U9P)o[cB^U3MTUW\鱹]uoxw[eɱOBqܩuG 6w㲚ގ\rlm. 
<촋"؇(Yj>nq^X"z:&0<*Kg2V( <m0Xa#B=3pimb$#?^^.؅u%^?g\xJ8J)E~132Igo~d ⑫ͭ u<0g81簳6$Qs-9˝R`&cANrC˥#/6`k U cJ㎰u^ wE%< 쑈3h7_ign7ャ]ZF`/GHLaF\"`.RTbb,(3xn23ŃTOXj'bhO 4gKVRMqȂ/yPiR%ETx_Hf{8,|+N h9"N;Nm<ԗL_#H%?(7{$|#:z&tą{ ya#Wjz?,9E,PJh],ϑH~$&`%ph%@֓`ʧ+' sDE NLG |J))jX$"haolI=毁Moˌ G3MT/aʘ#qW~[;Ow:yE<;}-l&D3%&HI}(5!D ؉@=SPc6C_g _U} e_ߴg]-G|W%+l(s9<q{\,6gss0eJ&j3e7f9+ђ|:_,ݛ κ4vQD1t2 (1C㌈ 0dPrHRIpxa2c}J!0) +n'ﻂjG5 âD]`#ޟ"dVrl &Ԏ-3`N>kٌ^fmhMu,IHBVY^d4WU?LQN*&t0\.>z?2_f8IE8)Ͼ>[J]'r?N$'')l-TK`5L]9&=9)$(Tæ쥙]!|cpk.aexז.w ~V9ۍ ˯ߒr`8hM@|HJy>sGYvX6M )|i EUVFx#U͠e"$wjMQcY Қ… bQYq^C KF?" F[ Ci}'[TtI5ګطZ-d#qKa F?ٖ,O^Mp6AIK4R4p=cԶAt]b] y>;Sw.ڃlrZzBE3ܹx:g3ۧwwu̘;"̓-OϙKk ϯⷫ_?KUל}%x#B1_:L #_?G 4OvQ/|tj^_o}px,[E'{ #KVH%MKbƷ%1adz$X^ĒX2X㔉dΊ|)0ރ2YfZcf"gELN M q g -0k % I 4߶QGyJ*]<:,B\J.9G#X\͆2Kw@%')d@"DqE⼩P9l9b2 irm8s ~1< ANs:g|@GѢ|ƙL ]դhbCkBj#՞C~j]w"E@6"@"Ft1P~OzfJ=SbNh\hnSi1? fzAR.YyKWbC= AT~a5^m%]dB MJAQɓu%AQҔɠBb]yEr{bmKb$x1/2FKk,`ipf27QhLfƝ_/:ҟ%>SGD}o\{(k?{l@r zeiU:M$ AI|2r譕w6Ft %.*&C2Sx2:XE$I{ap:Mma`L<98.i-.֯Fõ}.O>|j"3l"2<@JIASUdnRFQE͡yU%y8} {Joز쯞tR^4<\c.)yFF"fjSJ]5#j)F,S`,*Z5A2"# Sɚ%Q2֌1o,w,7#t aOlk&6pxjeOE* جT6(3QLcЯh^fkzO?MvD/#bU!$Jىy6ʟ?^$1we6Qo~F6܎q*u5+n.Z;ůҀ1wGuHݧǒro?sUtW&>D{# se}Rr|R~DR~="[9%"f(}F`l$ J 6l[KcZްIO{ux0{_OK6ι0{4d,H~0h# KYtf%qga #$eSG M=xʺ8ZW)3R*X9%L)Z ;j''c"vvEH'$ZhK2ؠ48PmS #dfBqM-cM@Ff0qUkBLErPMy/fybWAƱ b3ucDT"Nxos vl)lbbe?X|t ʵۆ'4 cƵ&c&d(S$FxGRY DԺMXM%M%2EpST?"(BA5(IRbs: se6ZKddIlUFFl4!##VzSr*_MDE݁,d@tLQBJ%z"nۻ[JE[[zf_z>*MPR΁ķ3SFtIDpO礴WB-T!i_s HY) PӈA9FKV:TjHR ג77-'wcu}㌧\'է$ﮟ4̎/n^wo:_髿.u󃜏;Q_A+W|ݫzտ_Ցzսz{^#=zeכ7a_{]^^pyy86LÛ~Ij/S*)o$].?xtBu288HKv؏8]3wcϯ,u ̻O>tp TR:=цJtstqv4ߟw,g]@]O WJ3AIPp{ϻRFhAucWRl -=כ*!@{] y>;Syw.ڃ'C3*9>wu̘;"̓-OϙK󫽾my ׏򒋈r\C_~ ȮߔBwan9SfWHݿ~/?h˟K7a_`5ܻg0ſ(GYxa#G^wɒ!6+-c >Gawݶ;-Cr F- Fш$&*Tb ބvET0r@`B2ϴ\;)QGH{"c¡%{G9P7<IaAUMBEXST>8 6H#E XBJzSt ,.ZFc#W=t7fmkV]"gadZ{1Q,vk)T&#`-%+%fFz$X^ĒX2X㔉dΊ|)0ރ2YfZcf"gELҧ).q:^&ڊ>j8J3i`X DkK;0@hL)Lbij;O PY Huq6}/9l9b2 irm8s ~1<[ldtΠ+^R 9hE)RV%L ]!)ADпŶukXv43Mv\f Ft1CݗsϔC p^TAR.YyKWbC= AT~jU 
6O&*I*f>)PdzSw:'JOeҜj9uԁvÍw~{8O';|'gOAz1h$8Qg>|@l\{((6 AhYށq2F*{&U $H> b9;# hr1!3'UD !j)r5jrOƥ:$TmZ$"P?eqdsTYd㡦۬Q`Qsh,)Ly[S]~ŖfC&0xŶN37 >IqkK0 z!K}(C0R2S5ƭ&ؔזP^h,cP ;d$lK ITA+ H,lLI6#gk݇O?XE-+uzUjErxk H[_Ĕ++I UaR]"!,A%Md&; +Ե([qd!5_ؿ[:G*htp;}w[e|W_r&nܓgVu( Hύ퀴^i+u@y'"MIDŽѻN[]t]o#Xh@YP⸷?hk`/Zo 8PP,ZF BD^JQAtU"kG/(G-!bD5IUD芄e(1`Juf5>= ̾vZŗ`uB8 ],ő%ޞQBX#DYJZ8Tm%`"S5f<-tvݧ2,VPV4 +NCG99 &=rb B-5=+ER 褦&Tz(d*ʾU4:VR5c3r֌J;]؊3ՅXAT>h>x`tIi]NO)xy15v`,dLH/l濩(Kv$c`Es1&f}i^J@j%ՙHv};(LmITDruBD+rq2\c͸c_km`UI|ր=dKRH:a'Y0i[еBH2#CًN@uM&"eHO1N5`gf<{~VhfF454- ɂ$9 B")bAc9@y^ȢV"%R>=z)u}7aujrD(sL^%+^{J;D&S١9񫎬Bi]ch8GۈO0p&d((oD`0p bI"E^<^<}،;Շ>s*މI|錮zj?yECBяKai YK6jLrugw#`sNKݨh4,zA[ 5 -6mӎ(AHfd^]'TU9`SRx-|D[ r֌嬗/'g1*S82R ?.̑_@I,jXdH3~/P;T5y $RX""5D$bV jPJOw% ce+b "D3)g:l+uJ [3(&cJG؊'|\[~vF;5% \""h; u2+(F#WgPAڭ0m7B3WBv9xd ^*1 JZA~&k4bvcGU) J?"ʹ.)]JL+8`ҭ!(=(ˀ!䅂Mz .j@]>Q eF۝:4<1ٓGJ/}-{D흕P6c"Tb'$gȅJg/QT_W4uAu18'TJdЎ'Ǥc zmW["NmFΎ+mzDK+珷=9;Z1ܰjm`'ҳ;PeBP'Y'!键姓 ]b8[6sy$2$~1Pm<`\bGɵ Ŋ'"Sg*xVɗҾ]&!Sјn5i!d1K<^_tvWA3n'ix?0w"-{vMް8UAo{;zDG/Sc~7o Ġ7wLq>^|ff 3ǛY3;7o?~z'߼og`F6mB$YHx ?o~]yC+˸#o3 m^eǛߎ_.Vw=f˝dy0,W\w:=U?b+UUg ض]݅ڽXK|ڄK4j/D JtWs-(L.xDב {U*3N1 G9ngF2RB,KښE" }&:Gڌ'~'0;`2;!;trI˚TEΛ5zY~oBH$&bWjn*:C QKo9T}+~X7|20h zGz Y zbr0Q %A\'KF^'m2P g4Z̽4<ƣWM-ۄL=9^͓>vo{wxr;ނi*euU3hmI9L˺1>f[ȬǻTQ_aKCsVق֬";XsF%B:gI9ls(t~m ;o PK 6G~ׄ"EF gxTRʋ ɚFc ͳHBg>J)1MN`$cIBA:Q,T%yZ7߁_O&mwu7^s>żovb,_Ƴ5lU7,'{k/;ɥݲNl:`DZӮPjulYI3ۏ}l~}s*c9h7ʬ!3VVhEFް3M>Cd X4o7ͭ(^E{q”lyG~IֻS_K;1! FJr,u"zz…JF;AXF8F|dmR !cb vgv]6^uW!.Tl 16DJ΁2, .Jє I5V ١bq5_3>ߦnSv[(3Mn1#]moɑ+}eGvׇs|l>*L IV{zf(Q%`clTwW=TOug<Sba>ʧR'e~n؈5Ii)'e"CV g zz<@|s'BN$R!2Lj*=cJML^y|@AuEOגWt 8N9y/:Kk 7R%RBIc|cދfϯJ7XoRݕ A &ԽHБo Cȁ ^ kt(i9{M奻3t-j05c$y&R#U2@+"5<9?s{uyjqp$ Kѷq i[{ҿV.ggm_/[P@/<~>\x&tzVIIvGM^+uLP!ò pHyPQPM轅qC=BI䛏Nǝ\Q+/mXWHS0O]tp5M/ܸw+]лaf5,}nIxAzVfUI 뾎MGl`YuT{z13bZ =W_٣@ŰГ-u;~U;RM\QRʓ5eX.}B793eAKޜ'HǜV8~8pӒ]x"}ݟ=״mh2o9h݇G 5Wsl267ҥlHa2*Ww-LPjza㸥3oca^VM̼预cwAн{`Žugm1Nafߝn斞3NO6;]^5କZR3 S<h!#6P&&4[<6·8? 
!e lWy*Ղ u8o\$h/5ԩTLR{_JR2'B@o+i ,];rx90/|X ^ IOt2q1"I6ՋEļZ%[dyme5QQ4+8Hŵ>+Qq-,k T8]8Ì0^:L&4BHؠfl#4Qi9DM*Sϸ(bNr9*K~ϩڏ^]x.XzsLK6S[ui kˋPgfzdWÍ>_]Gt)UN{Uw,Aԋ8@fq@f3Kj)` dp@f֮Dv` Bp/1"|4D߂6*b"AlTSsCV~VXF8g_q7z"J%(Uji|ss Ѕ\k0&!rm1D>W{%02iNفK ĥ۹!hl>O{gq3ԗs#2,/%q%Q\E.hak S  ) i5KTuyo(\j?$Ft?IG=P稊ל(_u8!R2Mt?K=K\Fsj(e`>䙉&ÈA\JFbF|N2e:%b6X r[*D,KF#u;Ҟ`X*¡bv̓C9nV:~Ϋ;/W';tO{9{|svnvKם޽٢ӏ&SӚEkk3Wdrb࿐ |jmytVl:unW4zn/@RaiHr<m}QoS)jª)dk`oӄ6Ky.(aXfGjV땗t46e0?ɿ,SX]u{onMj7Euk G'Gw⨳I5AY[@JcFQ-JYЦ]~Z״Q炀QB_%eQ`xŏ43^@ EW#_۟>XsˤV?o~J8ߊu N+vkZvt+v֌m([Lh3tr_q\>;?SんfO>.tDƒèJOi9_QH/DB4~T#~*z@`Eq\,JhI9pV\.ZYaeX=`jeNHGDEE [d>u2lw>qm9栧9bd7ƿ:tG{i4׷W};~lWVymkJ;XAȺhN1bdIR Mp3ir$SPD=W4(qTW.qe;DBLN֤]:LÓo'G,Nh Ҭd!['9p%:"2"hHVkeyAMBg.  |Lj}d!rAR"k >%֎ɦϒ'j])fR!)-lqq_E.9w>]Y=JgI(%ZDeTr&%V&*\"$[ jqi]&BTƌě) m2(>TH26Q:jU>{[4Bަ3d5[)-5Ʈ2kN qh+~`0ː1>$,4Jrs6:ZaJgĤ\B Z aˬĊ=A}L 搉ʂijrIe66 xu%+-V&gXt9JM2drRh:l_2j͐x,JR莣.qL28`p-\3 q}Zi+(jp)B\hK Axl<22 D+,* Q;CP0JD;˸Q4uXs ZDݼgtf)0xGm/'|$uL. m0IPFFP+5R(+C +o! xDR,v ,&0:-Z+;5\e|OH&,zP50` ˃JU,S 1ӐctPmV&ܪ!6uP4( S+` I NmVXF$<hZH 4*x Eh1{G"p:E;!\v,>a/@=܂F,#'fAUY !#zycBfqL@A{͖[ CM !N&$t88DRP 3)J;D _X,3q+)$"0%!`pq 92|\tfo -{@PSje])W51%9Vc܃.Pr w`񖔧j($WxJ8z)R({Hah,gBG˱ns ^_U zX<&j~ nj@Ɠԁqh!Ka&2]Ig^b6B,Q E yS _X|AEJ )z* <A&|Z,>h.g"%(Lt_XF@I^x43j܀P&7fxn>$,:=(J`ػ6$WYlIy ̇Ni4KVǯߗSHT&p,fEf"+^+ϓCWaw%,Զ ))e%)( dhfa0~ vlnww6mٝCLm]LՐEa}6o a,@{%!zy\ZA$ M B.e}t|QV]R"d$e@[z:&oC@FL3q!2jxrI'#&aUmA1;kFV(-5*8tHi298kgnF@uu39(k$A~2C #RFdCAasWuCϢb!^$,U@k6֜?peX0M`Y5 3Ko^5RR\GoE +ĿAH^q^ fM:[P`Hu mZ7 kҘOr NQW~5AǴd\>״$Vh%j Coz),A27hâHYa=+NF"5R&Xndj3RkO 5rAP/㢴g&Od:BUSEc?],߼FraϠ\U8/ڑ?䃨=xYk8}C( gib7 (rfF8pJf:$zAB~)#<ߘ#Kx>h)j2#> 8e+C`3vR d "`~V;͠:rIWŪR盈00K1!:Fff%dy%\^C"fׅKp9X DB .-P] Az3!z cr;/jߎdU'|q;mmT!KQ*[KaQT[xuO!!,w큰F.*(8mM͵1ߜۇU$7씘@-ѧZ ZwL HF$&Wj-=1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@_/)1 i:&@9b}L T5b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1\&GKv:L 0 G+d "&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b}L 5{%GHN ĕT@MZ㏝ Ԥt@_#H2dQ"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL 
b@"&1 DL b}=LJL mOg~R6뾼 ?N,M@QKqGM>z4sk,6r8wY_gy3w?[5Xh iC(Hg]eNe%a855~6}?ſ (Ҽ eA7pOzgDQD| pn9dk ;n8g R|h?m+[חٖ; }}|F7煞`uY iQ7[TMPe~o._51o'X1"`i骤,ݯuƿ\B St{.nůů0 XS%;aU9R[ܫv+N<.i;ێFCx!߭]+HvRU;+}۝v<{-S[@"[,w >wJ;&5=jȯvTHu >̪v~b3SC{{#^uƵ)Zf "6YX'}L=>̪w~]'nj%ܽ/Wwz0?\vMbY_?\_ݞ75"æfr;@Iu|oP+vhh7=etu0=9߅a?mjZA:}k- k`OZ-[LwnŅVusE8nM.e 4r ޕ0k4}l2o~;sfIg.h,uM73wԂk;?}W-K8+|\?ܔj^Wwj92^r6O潸` }_\7glPzz8hGg/mG+=Y[i.c_LKț>J%ع nGCۛai.6ٿ(m0s!5vbީ`Z6 U"i:?!kig<›xs5&@38xE-I$AHgZQ܈.6R!*mN$K5ٱ"(:!ڷe09x7Ϗo[H>]lד6WlYfm7I7g_Q./d%e xN4]1m%J]hZ x&wW\FCK<,'KM %Tb=\G!%*_"< _Ν3{+|7,XnT!wsgZ3Ɣ+ZTIǨ1:)`!CC/ (:ŗąmr 4orQ/u_˹k/շ.OgL㲽&߯|9zirr<]Nlwdz/Z;'a{5b90׳-TK<˛WɾzLѽ:ptNFFw.7EZ[UھWͿ47ˁ|i|v΂&6צ8pDJjB*zܟ[q. %nRgYU?hSEE,B5W6 ,qcM]^drHZ(A&+oK_f]E_ӵ NX6- 3zcz|"<\qx˜Oy.n8 X4uq4nͯzY%/ kn;!LMܴ-7ZC;%M!N2Г-"g]^lfCȾ `JCE͜V Ni`ݴ?ܾmL=L7m)gnnʡO>ƅGnt o'eG_p'۲}#k#K úVo$E|g=BصC1{g-޷ԿlK2)N/;kD"E )Z&E }lV2r zie6.30;YZJa>S c$) An wddgWs!NeJtMH5k9Yy3(}O!G!:CeٜWS0ڐt(YlG^C06d6%ʩ!t* @98;8&UMh=aHY|,.zƼV9Q+v;̡`⼌=79(1XG|UH!KhM{ k?o pׯ4^ZP\K {JfJ#묤!IpYdfRƬ)&*`rcki9]`r6VJiT4k^"l/ 吕s !tX~`5H+$8pl2QU3bgҤ$vJΘX$ChgG;c23GlzvA!wZfZ#(k-gŠ!\j/zǒq&;T\)0-DY+WHt %*%Iō"Ido j 5I'-Y5jk/KL0l˽yŕJ"q 8M3N¦$lx/#f- 2b)"Gَ/`r^%`U7foZȒ Yp'"?{ǭJ#X@ - ?d}rC '4Gьho<5l6UX_]oҋu6E>۱ىm%!?QFS|NCbȖI-C%㞳 q.0vA,2r"[-f+"FŻݝdFHK "#|hl@_b'eu?.XI׭c.\LFFI}Z< Rm@@G8lR1Wn=ettay}u&q❤?[7/;o?\&g :cOsWxdG&Zը=%552% K[Be͈XSaL͠]h-.t6{ъ*VeϪIԬa92z8O1SVq֬Tќm\Y߮B⌦ѯ|7{c~׷~.̇7û?9@xI4S/GtAxg4jڛ6 7hZ.z2|%>\h[f WĿ$ϾF-LgMXW1}xK7¤W۫I*v 1 q%{VHwgItIEØw 70\{p:IaSq . 
AVNzjt;Ӑ=HClh&H/,#2@dJEJC(%zRt}Upgs+fw狏r%i:D>=K΄[BL_ -A?MSzTIK[Ə괃x8ֹx(N@tAf-암mȇkܐk}/!ݥK &F Ia*BDt'6 M\bh2$X ~eP.=E8cwuzUާ.SCYj+*tBL-9ߦAn>}UՏofp,4^#4i^T'ȉj%wI7nfq6o!TYi97LPUI]8#I\:'Orj^G0^&|gd 5fkkɨ`"7&98`p6f"E6"x@I9 1FrJ%4O0Cnv,3w4q􈣐O..{&xTlݢE#t7}|l\}ת,+$g2l*u!#֓PN`e΅R[N%8X.7Y,/"L28l›frr\k!dZ$ʁu}:8+`}y֑.)ZCs]q,Dz8Ms q/̹/Y""98K:$jk%b '<ؗ_½@/w砃 89p2|(S)3bVzc@a#]nPyo;G+UR=wf^ji|oiͅQX 4!&\[gLdqWB& \TGXj+ƜБ{w)yox pO:3O@`x-3sɈXr$:ug Pt` Q4v]P` >ߍ}7(=8JK"`#C-ٗON HX€ 1-w՝ %>KY2n[\2^ӏJ3ȢciuV a]r>5ԗ/kҗдo=LS[o >5 ITkQ+Z6CQFPZ~-Q1YVUD2- b=b/@DBJǝ3'P,H/AvF{,`@qp+?:[ϋ<4gyN_w.x ǿo9 Xm顒v2HX^U鵸C+9Re4#! ei1jz:%>,THx ذp |q[ݔ{[Wpi3<~\DlM}zIϘU+ ^ u),fW?>^< (؜mqAeA&AlTz*rܪ]02yC6hux)N3Aϧ:6Uج1U%|Gxm{:7_Q-t`}5 bzKaܒ***e$IAIE,Hk #^2Z0Oea&n۵J sp^)ןȌߪU!r)Qa:E太 $Ѥ,)m1O\"5J 12ykd:z_K]f.S[oOgo4褶~iҼUv6sj) r L}Jn*%7ңܯuvz_)QZvK䎺cy,=ޥ\{;$jR`)tw[ipuSb@& \W毈1~OC- ⳫOs7ͯ &;w5$G,_6 ,Mco̮|4<@m_Fuk?n[B]ZLLā{ֺEyw6N{}97о*'eF7B8/dJRwʖ_+[{DlQT*3J$,SlIVΖB,ȬI;vdݎ'S!{SzfO4aq'ujNL(}($#DaU[|H5 -#Vy0 If姘4$1I4霠&&T>QQ4V1j~Vj]sQgFkfȘ0Ls+w"Uy蓎朩RTZ\H7(Sz fA)\IBYt)ARTTqG.j#c5q#c=R kyƮX+c!=aAp.Q0 7{6fx hx#CΩKUQ!3NQ@m'-l\]w#Y@ %s=:r^:##f$!e8 By,];vEmQ O !(g ?,qSPƵ\眘V qPSrJ2!ȢOX %fqLfɊ:ޅ{a5qa϶|`<D"℈'D|#&6I|*'Yqb5MJkc!4TB`@T95rR CiKQ>TRI3Ck8TǾ7 q3\ǴYKvE] O`l+KZ4 Y_ ȍI.t@umpx,xX;vCSw||+8ka h.j91|g=Sϝ T>gOK%okH1F:ᐥ|m'Jó`Ŀ٠ 0J8~q֛/h"Ȼ@7AG7mW+rrv"i{ִJ(!Ddt{ⳏkX^j0pU d4 }^V~40I 'm7%NbS7dL85g'#0sne(xDEnD []y^V6 uA1eV^}9a[jg~Wisvw.M:=|MaMgoUd]hĊ3mJa &Ed _/wًAާa|ٚdO%Eܴz/ifhRƻ(ʠ[ec%6iPFoEE˿<?2\(!6Lc.1h'f/Z?WV_Q^{>gf,dMhZ` 0$3xOȨ:p"J?s2Z9<Þ#ѳky9/8no]qCzi{9e]bsm\S,2L .K56[ [s.[J+Qz}Y˻/oɇQ~nL^mC2hRw=C̚ X/.`}"Gٮ;v}rm|zn^Jv3C$聬 ՛[E҇7b|}$xЈ3>d!vk+k9i/m1]4Rm5Y-5Ǥ?R_ n|!=٬i27{ ْTtɭ[!/"czs3i;C]FɚTӧo:p.Zcx=ݺQYj(5(٥3Y˞Kx3,![B< mNxx/tprɌ_~~9.N>ǃ&_Z/<;Y X#[~el}d.)0Yޥy޳Yۧg^7:RQ  |GNNJd7R%㩖,R#a2*5)5=fVO&\e v fTAzWM= gOʙ1?|>чyv9~c){7Jk'?KZ" XӅ5d6F9-3o_mٷzx_ʉ$@O==Vgzrv)yog~W o+7Ro+\5mcio^^]kNz5*\ýlj:ہ|l%JV nSC}7]5hV*%$.+CR^wg օҙqY;,7r󍍒|ƛ,MMɺT_ ϭzm{~<A7{WYj])`;bG#@Qļ(R>轃Z6tn^cWmgV)s-&y&8gD'id*o'깢Aɬ*P2m:r[T$DyD ޷"mb|WN=~7/\cz?W8ŲKV$Bo,jKd/# 
*-#Fist;/ID5  bLj}d!rAR"k fdWPVZQju1#TMikkZ+-Tk\<֑Yaiir2DϸYJeJW=LZң%Bp-ww1m"D%1a̘J$Ye9["dl9tԪ[<™Za4j84B}cW 5ep8ExaQ3L!}EJCcVIbF`K^0f%1)0P"l:HKm^L6.P$HsPT{N*㽰A%a.)eFhU՝bHB( FXdn J䲥ޱR3O=Dئu}ɨ550X*3JR.qL*uGXGgf 3;$6IkZJ),Q>R4P3bC9. 8IĤՔ)5H@$b>QEyոx34@LT"Yƍe `:ǁ^{EDQXT,%43H.QYDŽ=͈eiKwht|F eB{G$sd`Q6npLq 'j z5:/Q%(x$H&rZVȼ"a>X\8Lt~ʢmHmh( AJt+Skx$[ -h`uW$ 8U+ɉw%li{~6SRe%)8 dxncoAí_a+@1\ƒr\ \1;v)I^g<[0S!7B4Z1<*IUk00g bj"Zq܏!s6,ѳdVlmƙ0r#SZ{zz@pkOޙ˃%4.p-c|wɑr5WhSMg TfD<8FD 6PX˙@̠:t= {!ֽ?H`p{&3tQ` %&I%n% ncA^}6DUuӥ71.ńE@MĕpID awŒ}'4\N&﮴ t^\רvEC@Tv""DoaL`j?.JZ!:]jc70_"6)աQ*[u r$+Hay^k$rpVWKmվAjfu%i*|NJ &Wpg+@bhR@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)>Y%)L7F 5(=u%P+R}J -"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R}J kHj!Q\j@R}J%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@H[BhO +ijQ5 VjIJOQ d@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJOG t8voiG?MZi{ꮼl/ sRl^w?H"$Ѹ'Y'$ow]LLS'xeg] &z؞_aҳa(͆9d8xc pvKW"ٻIjs~]\EC/G(u*G/(*膈5?tI Cݝ)[OߴnMd4;]YGZ䨩Mξ<:5\:H_'ucnpH^ɜ* )PZ1@Id8<,jma QꢍlJ?|wͱWm`s<W8KgVY ”bӬX@ja\ ע46@Fmtc;2ѮϕnWPMFθ'wz)ѷ_xX}q\r~__}Bc[Ǵ8Z]經RO-+o9@ =dKAeSF=~.gV\:)1o@0эpmt^]#^zhn0;k/!Z_5 v%]^˻ϖk@}Ge6}А<j-#+֏vfI (Qx3XXtVA7j[K|64*(5(+Ep!ou_(ejtOs~t.\J_N{O]ż  Zil6דF/:[w7 xw6Zyȫb7kmuW͸mL>674٠ >ϖm^/m%{br=o wpĵge2y' o(p= widvm6(qVQ5J%|ͳ,t0{<* suylt RkaBܦty1iu6m?wtjl}?${҂q;Z Cg0h88 ri޼P!*=4sDBY[PXVa5- vw_}o{we?mcr(ۘXcpku;$H.ؘ3l`7FU$]M`EJ&MHׇ{+›{Cص.n.vuν.xB} @D ΀Jث[!ɋcn1խg`>{\c2zy@u5IM-QrkEuLZVȌSpâ%uqfi+ /K02^7xm#`l!pG]c 0廋hW(iV-^ wUyű{^NgK~-߾xQ=Zᓷ\rJߪߊwL!J,6W^kpqkdCcd͜#cXXؗg eϱS,peQ{놌O4?fE0._vnƳݍ??aV }; ;sZaAxLIsLTl}jX74cFߞdNVfhd/ddB6465TFɶ1dRVw9bf;N'yb>ݛwUQ[R&mq@)(Υ(RHx0µt#PJmE9  `RjcCNs dUSUbkraɈ"\2Q \u[n7s<͜aG꧿Q.-0>؛uQQD#bbiH-Ɓ/-UC"%-Rr>VY1"fYAWGp1\go^rh\=ECq>mW80mp+k|Tt-l $c≇Yǩ$).~Taoqh{Cg}r~T:[wǿvn"]麽Bi+o]MsՋW/:z~kY^Yo.rku~HGmҰ-Z?TAڡ*ϕ' H i8x2.bMPF۶j95)ιPwhVE'kU"lxʺ>v#fI*Id|5h_YC c.-A[n\1\#.ף&W]!\ۧ+jXn:g]ɑׂU"D*e`17;q;;vKu osM/ܟ3뜈5ddnSGZ\{$}*`E8?VU$C*N`xj;)K/Tj;xnt VFT4HGRl{k"LGcU`LV|Qx 038Ȝn|0 N K״-:%ә> !i:li$SH/ 
Ө9YČI<Ǎ)BT9ܧjc>6:z̧9nsdeM9&#\4騡GiiP&z훫\UUukܨ||E)Xt=7UzMt56 h!aR_T.%ȓRsS˙ker3n Ȝ&u <q@M2 ? Isƅ1{ےлEryp\jtm3|^qz'7>*~+`LBlTּ .y2Y`_vF,  Q(h" H!NrmZg<2$sΐXTgϩ ;~fSq:EO=>Ԅ<_;}m?8L|22 4ccVL{):ʔ-s!˜9ٷU-8z38ޠ%S$R9dB蒵HF#ZM$x:յWk_{TU;N+"K_GR4#OGrqTGGWX}i7NgP?nJÓgkD[8nCI-jBNS0.I0f7km 8нPRw f4ov#6"x:fLɦکF%9*m0^X A1T0 #?aCJDMt## 1t1T2H! AΔ.@  !;Elc(YĔ!=dS nk)J!ʎ:i@߹$Z^݌|>=mI(ZS}FtnS-5Zo$22x ?Јwh\R#@:F{M0!Tr<:g=ϷyXdGr'J1E 2"dδ0J#(Ly{*Ge_m$.xaꭾ <ˠܢTQdf6fϴf)A(D,5ꭾ՟KF}k= SYvd7 .YW^ċw lw߮W#?ApZY%2.Ϩ3S@Qe-6"d Q;;Irf԰Ntdb4 n~p9Et]IB1B68,87i>Y?ut7)0G6 O']k.q4񼹚\,Xep:ȡ|Sc6ZoSF#tXSD3P+M0 㧊XƟL\4F4K4H7^vJ.snFş]ܼt_S]du|{+BLhtn;Hy8ŏ-ԃCndS5ڬmZwy oV:`y * ~ϷH> R:Am8Ry,IR[ɰg<ﰪ8I5s(.>?.g+1dO2~2lrI. )-+ ӥP6-u֍0j  ^Ue&vtkVTØmfGwnE!]} ]-iE{my[Cr]1 XURW^^S Sq4bq18jZܽePoi?4pq bԈ`!kXT:iڧAakrt@?2ϸ$?n.OcC/cyk /Af: ~U@ɐp!/U>oA_xَG8n޶kCuEd?|~pë1>]޿Ͳaf07n%%nKo~pٍ'wg?޾7?N'I.7ڥֿ9Lp}4M 8ڏ؛nnV\j8U}>fh5S75T(b> *zCnUyX vtj6z;O'V%ۖԵ?WE60$x&4et>rUa+cښmQ"u"|䖑\^]x9YgBԛBҼVs#Ȝb2x1[# 'm9&"H@Y?+e@T8栶C6wRdP` Ih `TEm ۰9:N)>'.jc._EI-qWL($= ^h^ξYԸ~jw {c hA(:IQX |'M4w!&3M4gB2:cXNR1H ,B&&rˌ稳\i1V(B\;8rH-)xH*( 2g dd qs٪/Ϛ9lF\FM˻cTU۳]K_XG7w=w[*n,ALW_Z!O};˫%?CG7{WT51(­24`l L.zZJqj/WpF0̫n{䜌X*Y[vna \$hzȬ5t꼔`[e68yWM6c@&T ty餧 ֲ$1Tc?I^p>tإN`vѓa^(;]=Y%B^ƅYU䦱2sf{r2(QVKk~VH4R}.Zc7G-{V2ۄ1@K>DFRT>{ʱt۶nq 6gRc23"w(%z@1/<\џXI5uCeEޛe2̤((:hVE(yh&ӈ tØ3{E/ӀiAeR\]?MLLiuRZ(Hp21)!lЀ"lV!/sc+g]`R2Vt(0k2%͘HϒrjFSjg ti #wʂ0I)C@hjlg9ZL%6c`KH띖| 3o2"CMX<'jݷ2;G}~3&D ̫`d\QHWB"(7>( I<&#}ZՠV8V{ާj0'^"DCt6 4r:&ECsL! 
{Qqijc4+_|^A<'F3tIM)ld@18kdYLyte]NBbꭈnSlMzѳs1jCXwwHm2(Iجtq\hFU "D"ycJH֛G=X&Nn>uT!GZduC?"L1.:֛ǂI6Yҭ)(-( (˄>eMj[5dRL4ځ6[BFARF) Bi$('rnKlfzQ3gPJ(GB VǡsaM^RVg dF$UA0Fl#SgSIbv&y &Tv9<;vΉHրB; Eǻ5yLw$ɧ\- x+y4ֳpN=:z!t$T 5 !p4hJ) b)uDI/;; vF,H/rgJx6 BpɥT*+]"^ P-| -b@4: tGL GREf)Z2'2/1Sr**#jx;#sFǮO761.nlֶe>~=| `yZ+n#=2ŽJTmQh܄3"QyQc%麴n+Fi>blz$7rxN&;-E'7'|ʆ:{wiO\`;WCseCIJ^?nUⷙ)Oڱȭ_(6/0ٝl" ՘0PP7ݶ.JfTc'yh@(RCqjs"hB-F]g,SgTxX^#d;%ϭY\OelVA68X=)vZ8(9tkԮDb6`[?GݜǷŻxq%;T)̍q<D8L*c/j>4`cTCTy^?+ϧ.T%FK)46JG)EF%cN/Kϛ(NI.=M#uC`S:*zK3ePZKR{͜V80|} f J턶D.҃S  H4raٻ6 U='g q{OY1ErEʶp{%Jʔ2lS3Ù~T|UU_4hY}oV9k[eޭ6[{I^ʠcNH`%KI"5 Bm9ʓq,h/cIwKM!dLS 0FA=y5=+Ho͹,OF*6) g9E#K\&.B@+gJebSV́UY6,9jH$JXJ݉ r"jJ]rJ+3AQ*N+՛ ^ # z%nՒ v˜l[Rzmˢ~yX|{L }Qh泴&&H}!(r2XhcC2u`^ںRz#7?6[FiQ\1*ydYF dtZ5{E2B) \T%Ǘ%Yf- 5"a  F&rʽ5v$s }svƎ/xo9x~)'Gqec]A ƠϹIٻh} (e%EEɉ p }JV Ah HdC$"u:(!@P2`^-~,tQYrT12xdA@v79 G%Mm57Ώ?:2r g!]>5B@%.F0)4AxҖp<;#r =,-z-X\S%TcX^9{n7ƿL ^7!erSB'o5,Tb[-u }z1)3qLSf^mڋ&ȤX ,jA^H(1 c  "[SܽDB,6U {WB:ߐ=m3{y#LjtYҷ( CBo1A2?09Tgv2tIε&,l% wThɧm^ɵ* @7@l72jئ(l)%tqߘy.!H:C+qh,`!EC/"Y(A,S&crYDc \$12$339k>"PN_O ~> X9tGe)͖Օy*[&4 i$1& @.Rz.+te@J u e}|6?? 
?Ö흮x}~~#UTv3%)NSٖfs=QLpFNVm9e*tV&L-M[=)$׬uZSe|'X^V588Ax̹tゐ-Z&l#!<X#X/5SBdi8=CI :r)֜5 4Od%49dהXb"$)ceP ȽQ}NJAPj$A!ދfmaS*y7ش1&64_>ۡ{ӎ?;l:m?h۽pݿ60ٜXMӯɐޛ_{&/!;^}N?S N\HnMt#!dɔX+<~xCSQP`!@T<f69*F!72G'MqOS2t;Aw/{Z7 .A#?>a~sC].C)V|?@O9z$0w@(AeouwT`,AR~^9HƭNƁ L MOyV4Pِtr^%S7!+2w1\5< טă$/NKvs/jYҤ/ҒJLl~ je3z7(0v h&I!*o׿wq4㛷6Ӄ}r~jgqoNz.qQťG?Vj ۝sT3`/w#w<8>Z Fץ.]RWֱN lV/ 3ه,|H^2 `s ˽Ւ̏,sHjMJ]~KoS5E R.( ʠs6H7< O>ɒg]T6+P: v_Ys.%!1 1I<)ytTd粅].[umW+WZ1ͭYWdmE2`hF(F:Id6傉 t V"JjX+}cmUFطY(ЫxovގE?F!bA6aG#ƣ#z}_JV}(KGy,q`7k~j&mPhr: }YnʖםP.+/g28tq-zM:ɞkN"+,<NҫX'g45s5͒0ƌ啊T*JV*[bTl[XqזJVAPbTlRU*JV*[bTlߘ7JV0*[bTlRU*JV!fSY[JV*[bTlRU*JV*[b>UKmxbTlRU*JV*[bTlRU*JV*[Tl&RU*JVتU*J-4L`l8F޵<4޵jyeRb h=( EpMQVZDlMr"XW{kFW<{XQ(ѠJ2Xc,QˤN(%<*‘tg3,X#CEVtUoK6c,Ө=*-抋L0oLEYZl<"8IJ1S̤%y&cĈ9>YAzk]d[#;?qS4"e"28yƬTF[psFd@MZ1.Veلly̓B4S/RND-0[iKNis(ŕ2@!&V2u ^ # z_%jOT.qu'[orn'i] ?'ôOw 'SCx5͖Nw{`nI׺8|aF ڑi GSEZѴR:/Ur eSQL:Qܶ{B7~nΓ,,Ip6d|e"G Ы=KLJʀ =rR rIn#d5˞#gCߎYoY1doS{Pz8ǧP  a@p$/9:GC)21&>"@$3՞kҀlCR}V|,IzRB DɵW05vEMGoRBϞ<.|2979_vfEJNu2V>֜a/ø-7ET=[DY-b7n$Hڧy2O^X\9>pZU9oIddJҐSVqOc8HI(.hk}0Z1A%W.3V7.>;d-EShpHLZ(+=(B~RMO.:]7r鬳*9.v vq{#AX>čeg.y>sAnr}VDZUG>h> [Gk>|Wix 3;En'GY%NG jWO#$l!ht[+H}mq:4݁iO߿ɒ`taղfͭv[F,Bs+!-e'CZW(sE/ouO7R9* Eh.*.Q0ʍ 4YTL=ou=^Xop[Z%ipkkLIc($t:gU)`c9pV4`:X`zPoO~[}9!fl] n&;`+A=#z Of|2;jQY(KZ} x:%tbHDGDUQǂ2䈒e@+2THs9ĆGS&v[g7N^>bݟϟ*.]=?l^'=$#ARI9&Z%&:K3c#g|'e-wx0 ax a L ":SJ}Hd yO!KQ$&92 j ͒&,f0pHJI90 һmZaRda&aR,>s;rLݜ|< qvQDQlMզ=Y'(cH p"ph\0H1`4hچ ,.`zv|kEX)y$ssQXdɠ(#,2陶\!``")pIYWpxZ_px`T=ܢTw3֌1б- [2EBWVY= ='ԋ@Qmf7 ZqR #^k g#.q38#;A C$f@`".9c<`_nƏJgm8Gx V'K982I,9i4?N&FF-xӷ=u ŀ Ŭ?]_ן?AqofWU8t\<›YCK;sG@Z8 Iw K?gq\QKƄV-]+_Qgg/:7ߖbdek$Q/?8:iC6{SpbIgڞgw7?xP;Z#փ4Ymq`4$22giwǬk5 ^dWлޗs ~*{_|Iwh1Z;dHt;zsFUA=옯]=jqdq\j\\2_X[m'^±'XU.\5s(߽?.  
W&RұPHq;]ٙQo8/Zyj6|[;<*0x?j0aYqSS׷G/Z~+p9_Wxx#)3-ucVL*&1$IQ,r7-2}qϿ=-bUoCC0:7kXT:iH9FЍ\4B.H?3V_.t/r|hYyo5Ɔ&c5!vK#@tV9xUB)Y 5@\\Kv>t㯱qy{"K%ZdM`|1Ԅ}Y5l+&ƭ{4ɵ;0hλuU^qv{euf~|W~n3Ur-OO.WO}s%yti50R:Pwύc<)&#Qwl';5ΟsFZ{#D@]UZdqV/g*;60z,|LܲZ9%{hu*}aŋN3"DCCέ\chx !!!dDy#l0 PSNrLbax&ZgΟsQ;0(ph ) 9@-8kd9Lyte]"{j!1mNDHlJ&M/Wm|PYSCjxK$جtq Gu *!H3oL4Z$ԣzXgןVȑ!Y]{<5YqGԤi=qNHClJ&lr*Yxsk%mbm7Kۀ:P+9@P!G!|p@ysd&"`hy,e x{\YY EmAeI%>^c&Drȅ`I*K7Bqz Rj$.4XB@XRQi-qtP_[ {vqcZzx8u.Ll&"0O:LX/*sQrMYJϞ;(|cMܖvf>Y %YHHR:ʴ  #(XeO ^47x"ꅲT=OVzάs"d)HېbJ_HZ=mDHs*P0q#e/]5@p9i )M~-D\K;<3Jek+RԉrHp^8㛡?pRM%LaCZ+%08irxU gg\Bԭ =$GbRD:2 Q'W!,'D.hHi5!xg^cBlXލB&t勑?557eOknXu]a ؇joRXik 0/0CFiY176hR??ч `C󤊺$0%OPW(Џ8 >n{=a(@XN{vㄴd'.Vt kưnY]H5QG.Q6f>ɚ 2fmUY|36RQ!:g >=ZkKXR׼T"OY?q7N^x2|_<2}rw g`=*^ ȭIc+(lA]v]A fQO'|~eCn D̅rh~Y+P|^>WiS n3zI1a(B)V-?}EEկ623ỳhRm>m[^~i}5IraC͕UQpf&F,uHH`cUph=q4y+c Ty>#.XZDA)$@ 4YM̷(r(z{Wwӝo>Mgi9agYuHy%XeW9KEk j#|Җ;N";{D(EEvV =gm!)e ,:^ĉ {-SS=ŤHL4Ex뜡HF%SY'ߎmpۣMN.oH^Á˛YzgܬZ ۭ@B3Q30;^=›zt1ٚNX]"r-i\ 2z a>q}60uYw3\ι]|]Q7o^h.G5k ;~ϏiL/s#q/zMrƚ𛦛Dۑ6E!*gWQ[|G !;,J!Nq1)3aw"9瞗ˋA7A +@B~/̊h<%~3G<bW!X:|C1-&D) 1PK8ygLDe@瓪uݻ|ۉ­`?*?,Xf u5Նb/bGYl5n!V'mj.񶥯zri_u"p=vfjXZFoU_J&T@ bg&UpA)G^8taKBTu9ږN"*`KudK=Xcc'!8btXXp u:%} ހHF" \kv º3ǧِ(?e-ĚWƥZ5 ]ԙ-u4cE U9DZN W&bl>av$^{'E0>u\|܊]^P7mšT9E]R?/.Y6%Yj|,VӴ[h{Q>U.XQWOO=:A1_E3kǿ.ㆫFxQݕöI^ }O|i3a0?_xr Rr2w͐3 - 6^R ”<^. $>݆OXU))pLsT۰/P? )>aMSc^yqR~m8Vr v̱jƦ$W&\ɭ!cSsfC65=•;UYś\R|XMΓm;{6w-3MrH߼ոMJ~?(7/y jlBb!ɶ(/Cgt)sfWCQ}3]&!~~WOn6S! @7н".zCi! Ds45b6B@ax.w$sܘ F&4GsIu #{G 7e1@r:-RJ JxTB@\IF8]׳FrvD A*nF9I@KK֢ƄD+7)(n,I6 EFT\\Cx! 
g?E`X#w)O|L7m,G`g5'~$6lL{cڇ[fܞ z+3ޙ66ፃ >[j3#MtcϏiϝiKFr6r{'?<3#<`>{٠펛ݠ7"nv͜Tzoh<ƑI6hAC-j7J[e]{He[WVUvKFZVJw]TTI_~TR<][kSuӊSd+TUu+-c|I׬Sv3icMA &KHgAWb^{1jhc9˔1%<'IxcK&}jIB|~,>:ăl,1X&-4a"˼mF^o l,yq:7*Qm8.6|7V(]IQWu,dhaGj.<Fk\ y8 zjiK*3#Yro#|igՋuAO؍]:?7>gd^E"w~EMz||1h}+7? {\ }c>K_d{`vLG_/Gƭ] f#鍱}9؅='Do>&eN0}b5o^ŻJ52kKɒA`$_'U"Wm15J (L0̏zϟ ϓKw|vnđDˁ)_ H =Aoٸa8ni@3@\n xmgy;L7pN:accW)!Կ^W~7&2|(&%eJƃT8Ub- Gj!ʭuj_)JuZl]1B_d}w&diҕ5a%EmEbdcsv>\\3]ZxM1W8ۚ*O9 w\K뛋4CϝekU-LE!TjcW|+:E#O!1]/J0ͳ )zfGa  ;ǀ-*Z}ʡ|w}"jm(cT_Lƽ5 @'JP+Ecu ߄VqbHEVRV 'y'q}?O3b"iTPCNx+hA}+vl=&-JޛPe zBݘdp鐣3,mu hں\Zq9_R;uJÁ%pJG^Y%hbxq7CKʎ(۬W`daƻvQOd(Q+\@+=c  _0u"էO'.LqEh睭P2G:'Ϣ8Iq̙y ~s'4uBEgqðK2]:K;a]N@9#f&O&8^6&^kk xmc V1486⇉83\ M|\q?9w'yރKٲޘsϕm6dmT9YoŶVrs[E6xrkf{3[D*qʏښ0"K/.ePGn ͞Nم#3wcK^ f@GL=+FW9qOȆÁ*BkߨgSz;X[>xf?uo־Qk}#cй{pr eqwg*J#Nڼ!яuܾ%TwϜӱBFW"'[zZBO' rBjB/9ֽR#ɃH)XcOЍw}.hBl#Jq{;Anx8[Z\>h5qGJ@Ķ ĠYGۭюiޮiv_wp ДcCJV d0Kv e`р: 018++=z&HyGPT8lEC|h۾(^NDu~{C y#iXI*̝̟tîZss'g;a1ɅBorGQw\s=.y1i6w+m@lxe+7=)7řj9ĮR7gZ}9Fb5vK}Mt}M%7oyMgpL1l+dW0qrӕ4}N=_svhv\P( hJUav+o^_Otkv,}mG)DNJ|.UusqtJKZUq(QY浏5ŽhTw37!\OZU#U9~>\uEj;!U7\k[7*w,lOV>JeSj/Z&*Zר %[S?p\WOxn1:\q-m6G`<>NHjظ1I셛(zȘdb0 {_lN PZ1i>b:X]0NKHCf'j`0d ߵjQe1'iZrѭe3tO66$m)IL؛UI DZZ3!؛؏A.$cAn䍚(Xy'͛jwZݽ#| n+GI}ՎAH{[bbq5J#ɱb\aWt &:E.Etxv>]sw )ݹF_i*<-ݹ=U zDT| s2,Hƕ/8ARg7vLhɰLQ}9v~3"qCG[AA'ʙ:zk$'}!87 7CMID`yX\B{bJH m*Jo@,UţTA,`%LC uC""~]՝V\Q Y]͋H0# Ϯzc?je? 
,rqz(M.Xa5[ylm.}FM_EV7 PSBV%*M.I_IEJ|~.>v7HD оd,k):k}% @%3klƘ0v^=RJ)אn~ ĭWZ.뷈,(;[KރVg94BJSO¥х "zO艠y<^f9T ނYo,M}a8 Vò &&N5r8B)x 6cls" 1F-͚xZ8)geȨ1&8!0)/])8i -JAJZɣSEż"Ç lJLQ'ϢBb=R Um\l">-JQ*ԕ{HrtO)j$ GL*os'Lrb5cV@qpЪ`Z5%I%\1Vq;PTՉט}Z@Klz6>-B 9t{҈7=T9Qࢡ #e!0D\L.؎}`xd9:Xb}||O\[셺 ( u^8Rw}9+]USF!t9ItV\)]u/dX E`ˣAvyl i B<.R3's@ sR3.ojU-?<~.C}S;|ӾwfϬA +w_o=mWĿT1wY΍:9p-[6a_p/ȉ{b% :SX3a LX- Β!Yǵw*`=M}ցt9CZ7 +{qdw\XB'B`ڭŊsXQ}:Mg`yyQ۫o ʜ'c`ꍆ5x0ND:Ti@pwɋĠUGhI ~ VWLޕ?L{Ĺ r=9ZR0N ɵ:mRFjQvIoNTTNoD\l ]ht Ӹ1݋ueo]ҡZJ A+85 [ ӬBz)֠`R+TȉL&lBI6mRָsR`7bk 6,^lk I-Se4O9[/ `bݘ Z_+N恘!ȵ159[Kc%ZMMReaƏ^5w(B4|conyx{An!0]LS\Eըe5|x{cX۠KIsf`+ehqBS٦BU&`+@)RIa!ۤZSQ@2.ڦ]Ng*GiߨʜϣO2<;dir?S& F}!%O~ؒ.ZU`[tdKoиR.n $ uG,dBȦc֛4\(%f-h?1mPxG yI{)چ)o1) VsuCWӐ$pdob'䘳WRF>8f9g0rLW9HUC,_v`h392\|ÆPQ7BkU 6sĚMYzGGQU*SSTc1EmJQenXyW {5EstI7e"T#㵏{Q0ʧb1su։_[Γ!BiYPzV]1C}64}c>%Vf@ ew3LRBCR|f|'KOGYa,SN:R" o+tXcH!WS,0e}; Cn3>&J*wA3&$P6GDKl҇P9(л4-5,`RA*@%4C M₇EG\z`x\jW[xH!ЄM}Z{5ܳA*K4]g`|Z1ROwQUUd-0 od=|m_俇5zL5HC 3=]$UEqPJzimgXkͭIOlY=MҧID@+=Q# {2a"sJ 92{uDx#еp{ѳ4* P<bC+|V vC{ftʙIڔ}4ѮϼtsoG1EF"o, HgT:՝)|2-m)j^kZ,-xH %;ƂM׽=8!vDrIg;cSJSx"jL9ueFX,;"@3HV'=2km׾q;:}jM3O=yt|E+q[7nQ&4yZxO-O⽻wh9ə^=Y;H0Do#]2ȼg 뉦uwgV/qyuKLt"z=1C 1 n[Tl7sBM*N*Hޢ]gUPN%y-;lxd=RY?Ū|0OU++}\wV˷nw( L'uDEn)E*BeEmlob4.rp\5ViAFL2w{78g@o9m8nG÷(KE*\袜 %FevzM>ٽrf'>1(#Q/f٭>:͌WvM,aukKHZfN3h:KЍ\,i 'K%6r >4ό8iglHyiЇo!;B mu +%o}hs4d|# {z>`^5 )u6u{ɛ)( O1[ԝ 6jxDaM*י5F7 CӢhCkOk/AۤMAzavF{al";!4U O3:/uhFg*HE@7f& V3:ŷhvxlr4G+/~4M5LK=WT|YzYCm]^SZs֞or*Gw]T@mjkWM, + wPd+m[]&Hzˆb.eIR+U".~=_"LS*FFT^|qDG_չ#_:Mc*@V#ɽ/knl%MXD0N=a!j>SehBRS uЄ:l1V.u9b6wy`!AyI*4E)wU1b0NDte?H>pYeeb44(j ,֟,2 *ėzBj `M˩2%D*WbK%$kd5.~Ϋբˎ!t;c4΅}@Fl\n8Q: VC_Djx#ᑨ#kxs(R 0'E ixQ?uJ^9蜍D0]GgR=vq řA\.!,ClPjs^b%]s>5c`J 7$r"VI {E킌 ڞpczr]ZCknVu9hg)c9J-A?qZ~ECn>ׅh׫_ti@ x_O9 &OA W;EW=h)TX7n4.3s)-;.cjRVz.)2b#sح4&|VSj.nu҂cdI8XTplD-8옼yl;G:0z T(r.LFkxQE tSkDx#[CPj>ѾGhA_UjN;@n RUf]Ùbfڍ7ڙo}K6\^f'cvYo: Ƽ.8+<q ySmRg_○Z]F-^hc'h"8-wdPFZJdĖPtq`U${5 {m9A8-8;;?~MF3\BWv oaCE┙XeLM2A\]|!ʭ 2K00ߑVwa.7E 4÷3yV_닲U AM0ѹ^;C1Q832΃gvfmB.=˯Wu:ohJ 
ٵWe57brvWߎ/UHv|%zr#櫢e!7=>Mi˭󗫬9>—ԾM"ieW)wp+~{IMyl9 />8; ug'a'zObݿLv?;,Ymo?y~aOg?~ɳp_WiU([4%OprE,3\Sxz5U 'l -B& &6Edk(#.Ecp5qv`L2WQŸ%T@ݸ{Fj| -27O F z0/d@s5Y }`Jvډ:b$v0"bzw N]KE(ڽrj'j$Eǁb^@FpJڭ>oh;aB:H¼}&**{gk`b%DĒs*{ ?u0㶯p3#R1˛eOiǧ@7P@b"_ͅ~{uqv.}6\f,d"כ s>?=罍7o ㇻ+m]sC oK3˼0dn_\^ONMʷ6s`̻?ߜWv|z|eon_7' v |#m9u3]1=8q=-" ` qމ`sld\kF<;~+j_1Ԇݵ};@~}#6t'Ǟ{:B4AJ*jHȬH1c! a`?0 B-%:sA BEOsʈسHNi4]}"x8@]N+a'e肖Hv}F9GTD*NV/S03G{4<÷hӊζ7r|8R9??i9v[~*MB}3. .NxiOn;U fXO_n-Tw:i&~R̿8;|V^\Mx}O*ן&@8n #u2L$1~{* *-f"侔|, z=kXj<̪T`\Ew~;aɶWl5߫dʈ :&4ٶHMӇ~].K7Q[MpRoMumQ'{0랸s`>vt@%( EcB:&O!hvA͒GO6L<Eц} 99b0'7u#/[sօCH~|f"1 JvpVҮ/"ydU}yL%ǨuL?yx[~qtr\r䣛]~nj4ʹYz C&pj㥲; ,'m %}to`{6NX$0Ţ,%`"%$Ij1Llb;fRM-˵`t vYloL0+kaD Xv^҉{I ę% 2aBIꜤݒӸ"[;}2 5Od&@\J0aZaPZ0ֹ-q Ұz׹Z1 #zEWStb;`|L5V׈ܿo|:hn҇ zl8Pא,03,F<E7G>fk`jr]ߝ -'t{߰H|@8; Dkvr~&5GnnǛ⩕Bm'y^m˜Nw)2lkUO8M{x8#72B3yȸN AqFh/ز _WΈB7-gr)n|4P)t^ա |ĈD# HU$Gt1٥rd)҄F9T&CUD;>%7Ai*D݂ID;e.Q.=?Y@FH™Bw9.UIJy͕읇SB}lJh%:>GiRR| 8:LcE~#fO( $>iMۢߛ_v2YNRW}9Oumq_Y)j_Aj~ _ ac~T%b$̮2 Q:g*.nNJDpŃҾ?( !4VGKˀ)NcJ2 ⟋6+/&?v0y 'Yk;~J)Lo ҿߏF?郯]]x8 sԁ% gb!r < ڈp2wH,IlL !&6bYfc3iEbD*iL7"KW>%Sfi3d"P"=[΋OuxDEVˋh,Э8}.nL:E;NNtR)"Iw&⧨v+oW/wcZ+^m(5L[0&7# N5Ѯ*.VEԢf}ӾRYhSl܀C3Ir]{9H/ʛiӼTHg ޼AfbLuv2^mb'H?IL鴪4wޔ/`]NќJqr$jgw6$2 mll< Ih _醇 h}#Z01[{9~ ]wy$eǏVbg^ր(?Wƿ=ƫxX)>0tpzKup {-K h1qsTg??0wwہߵj/^ 97z7 ~g7h9"1<3= TT !I8 JO-6P.PvM1m`á9/x4(0X1XcX"0Ğ17/m 6%*v^B.6D6(eJꈂ±2R$Њ&ޝgHA S,XM^vaqfGގCiw7ܤ?vo^@O[i(Wbֲ)]iX [qJ mjpc$T $ֈ -%~ Ψ09Q9aĒ(87431p n.W>O}͖QeAݫRR:A7%=@)pV,M//:*'0%b<d`6])@~&_Wa"U~"n:C2UGUi"G b8 wUH"ŊX9W]*@//{qrBi,李en5ُ ʋJ+.m0{k,X{{9*~ _׫<{G%NkQfX¥Q"8X@442IDR8x.UUߟM(9:>#Ƅ^ W>y$T`JOTιn]+^U!L*GIq{K*dm 2Ob&+/N(AUvjO}o Lwǣ; 2'"QB!=J\>|eE[ouڮ 4_>-K kڍlL0 C< ?cYJţ5S Of< N: dCO+,i6z?3m-_܅B8R hJ A*7b}v#$ 35 Ĭp0N"| Q, ɭMq^x`OFZ MkYI7!AA3FХV|&pvH,q2m_92|p%(SGqu !v^V[H\^8B$qac!(A>1X" ጋ'$J· T' sG7KFƒy5ۈ3-1G qg]:K(/Q_|f*\R!Ω1%B\#t:8D?r"hGb5 R)aI+H'elRO$t1s GI8>NZ  ᄎ Q] M(Ä4?>*K:4(TРPAB WY:S9^DS=ͧ_/E(`D9@Ե҆xy!!u:z#hciJk=6tD+y4V-d΢i #0UA0Qs/^\*iͻ 8+C j*\r*)T?Veqv{SKJįk}Q"4g 
ZFCмRdl2"1{2-V{YUB-pWsM1V7^/m3R!uqP8uϡv*Z1o#xwbsU s{o/>W*! '[ĻB ѳihyb0}<~jAz8믇ovtOV!THKN'O҇_mZuȥ}OQtz)sL-O쳻-?೏ /o-qeTo :h퍄f&X7-xw.VJep+ 6Zǒ+ We"c0G'*0OdPFˈL"e?|jbwmqJ~Y,Zf"i Cݗ ݇]@5FIv.~HHbOwF8x8dU.dWrτJݯ?~(mjQ7wܲP/_/L")!'9D׎KX۵Kь>r5d/Gf2[չǰROݱgl?#qg=Z\OxI;6]~ex;Μ?~r?I ʚtbMJ@a %6ӭw"O=dJ{{w䭑zw⤿ONlhw`Z ƿٷg)KPEyT%m p &Yf=L{}pfe34<F FM(Rh`fdUCt\)Y1Sw2?x3ӥge1u|nvh',|z5o?ڝK>]|pVJHfs]LwZg*$O&zx} ^WrpBiI6DJZfkQ!g[N2RPH EEME V5N&mS1eULQSNWL ۍi`+k)iH&k˼`Q*lQىJY*Q "o$CH 䳄X.^MU-tM@w_W8=m=a":F!d$A)dYE,Qi"RNM.1zH7t?)+Ewi';IY"PR"2 ~Š]dH;)%x{g%Dhc*/rUu6D3.,"a z{2J (rf.UM#&l݁O;M.mZP1v$ie $vU\/ x)>IiQʧ.)Xh*DJ2" ?-H0b)S=H"CdW:)׃6XVx"L'Z:R NIz V@r,w TܾH m9NBf/@ iaX+c_4ؗoI=k~9(P:/#˦7]~ɢɩ'aGPp卧+}>_ Wz﾿yWgJ ݎx+~{3:ޘko؆Ozygz%NnAFuO߼b?\}>=+GBΩE p}dR'6XjؑĻo}I ݫkU}3o(H֙v& pU@``LIMtFUyׂ*r]H_K l8~!;R zSsہn2bxOyu?yˆo~vaX=Pjb뇷l1+jo(޴~;Nj뷓mb&kFł/KL(r-JT䠧#$LQ:UP@D]Gh\ő/Zԍs,I'p'ul)MpDm[V;u`/Gp:vzpb?}H7A˄kg^UuB̎KEX@ik@i| WHAlj> ֫{/nL 7I8آ(hp<9m h"=g0m7'!ux'V'"vw/'&~8$ 䐬Ce`G |1lz|! sB0t 1tG1n;Gj@vsЪ[yݞ6|$6Ov~v7w3ɝP3U6J4y{L_&z pl&AN?ӰyV< ZVt| =ތZxHm;RO o^?Up9'LI(ASO|qeNkOG?bQz?Wקū_&d׋woxQ:Ox N^~$IM`vͶueߛzz4pTVgk5lyŶI%zG!!0 I(l@ϚK)BRLZ+(FX'oNR%.6Qk9z~..Vj+GnM4e>2f>B9cJ4#!R8Qڙ>^A$DϲM 5S̠*iY%+i]Q ǡEɡR&I~0*DT:eޔt e]-e-48?bLDVL ;)Xg49YK `T-VLLCr.))QPD1 L/C֫nt%ENHu}f+ *;(UAVΖM8TPr'ًM {@f^p% v$I+ZrIdt&#d3 (AL^,o!Fq)GINsUV Que,U46y!8tQۀPxC #*R JW*XiH"X; }vXj#b1uk+p٠Swu]2ZVo7!~_c1mǖkKZ>q$[~ JOn闃=?z9O9-hM"l ޘvxeb[e߬Bc90 e(L]D9F20^g+ aDH<Y*2*)-aWY$ʼQJUT+etNdفbNjcd8F֒3o0XbB[VB-Q^zRgYf(`KrDTx􁃣-ZrEѪ%(UAkh4O) *ak5+`l>:M%sEВ/xOq@ez$ >x!bbW]E*:dGli_Y:aJ-:[б3{g8R]aB:z`5G⍑tKLD`a y[WN[hS#C#>,"1bX]$TĔs(AaC*K 0`J=14X kM9PEHAIGM,X,r(=ӊNc鴃nb2=%vs<Ď&۫/jy&q+ǜVړǚw;^<'y{6 @ҋ)25bM r)ic4~?ܹ@=tقW3ɠ7`{ Ir5F#^Qq3sڊ䬅4؍3UtѧA!O,iu `L9ā3M"͵A֌1M/軧ʘqc ペ- DmI8 q s`3DsS3UO$MMj\ rNӵX3?sm@J’>[T;bzvbrz1QAЋ EP Cg>e܀-j,~Qțt7G߽5q$o^iyQM}}-fQzO.㧣WIg|6wE|QeyO!n`CF=Ě;i-bԝ>v tαQieT@Țy${TyqKrR{g8u j+E%E-%E+EY"J <` ``1 ύ$1I :8%&"ϱkF},1HV yh ղx#YR\4I'ĘZrdf$l:Ec'fl6K42f0je_߅jo2Mac &bc3tqݒױ&f&U 1d6YeY#`AtTeDDpDmIsLZ 5ic:bX2FŖʼnx(6 6Jȶ 9ZGT$I^R.J$' 
ZCd[Ѥ^X'2?Y==y䛏<_ܑ×mhg5j\)<͙u-/vLNL+QvKԫiij;(s!d\{& U Fcarn mp2ukhy=-^;&pv/¬ڷ p\sqc;nژA6A `f`k.9;eMVvnV[NRJȚF%g+pk3%[r9` %%E/C"+5gx]JXb(]k^T&v'` Y+{k&k0\':\bY+&n  Go և9^7pGK9hX嫞Acwr3ў ^Zc4_ELڞ%Xc9C,Z {|5Ơ0{-'t\,42ox@ H|5 %Rs>b0]A䳒:cP%cw ۳=kFX=i~;JQZ~j*IF L%O>Ams*dD>#{+x3}*r+;,\ [OVb\ĸ,jJGJa cާbe,h1c鐊lex* 41f<L![6s>r.uc3Ơr P_'\ر&נ4ѪG3%#rDd ~u=ŸMG^Nx,3{DWzA3ň\D, !7K.Dkԧh?)J^H$'JG"Hs:bSaZ-}lx($<"2+h-E))re(T-BǦaQ$EKݧDLif5 AG/0Ġ ۿ`u>X]'3GU!4a+CIl 7 aaPDPIbÄ6; }Kl\牼0Z/Ngi!t6!aP5ޙeHz~SkN4P}bJNCa3)`c^Y"&ZGJDh倷;_Hd:E^6'TfFD.'4[y"Sʹ2ײjbbKۓ'ZfYƚ+c fks? 0қ$N/?,(O_ loRES4ٕTVȆ" \IХlC@YB\ih"o0t:dIr`Sɵ?}Q$akUOҢ[``%81!#A$̧1cf$QLU@S*6epN(9/ڶY+ΨwkY Ɂkyj,c{[qyF)4vJ!, .6MoK=e*r8<[aw,=iզi-zLh6h G,ďObCx}h o%mcl@:U\*v] A#}|n:<~^&ֲEX Z۪}H]Eb+$ק҃6܎)#Z(}h-:`Ko HXC{=JݭyTծ(ג@}JB-wvW5@1%X(PlV@u{ åjFq*M 2G3T9Dk3A D}a8lNIç;}޼>a4Pr)F>K@ n9My׿61VrH*B{[cOޖw;<ܰ,J,l[@ppLO)э)ål|0a?-)e:qxazeXR*UE;*$>5w;(CkUomֱ[1 ˖Rt*/'Rt):\KsNnh?Efm3*} fc{he:/uPpvX^^A qY9R)~ SbTa':D3A|")8Fm#ʤ1'|_"$B("녂*1UZ U*PVQg&wrʩLh*~ ݔ~z)m-^1I~sBls7 큶 .~-#WRV+GX`k!m^ o{oƽd|mS؋Uŭ?$E k89(]xmx71]6ROJwaTb&Wq fNmyw$?MUXӧS{*!šJduF`rZu/ÑG.MY :~*XESΩji5j ..)di`Gc.]tjԬK/Y純pݥ֒_E}[_vmU7Hbއ3_Z@?9Jǂ+q/J'c8D #&) j5X .Us;DC n2ydusO#ߝZmBs-׆ A2o)'b:|8|GHj$ڍzQ2\ҩg"׈{"BǨ ߏ@[z&-txFwc[b»"'VC6& J͆Pp^*EܾU7 v`;nv82LOMDA}Dā˞ XmI/ \.J&ڸur烏V ~:K%6`z/Qc35 nlac,vWbǔz,d aiiX!.`&lSq@uH*T4ƔObD(KR׳2fP'$< $Bj| 9A@!648:I iK(2(Ƙܧ&1㳯|zT>{%t޻K6LZ 7b6n^ڱ/$LaѸ.k uNFK,K$AmȉL vB-A2) /̫i-Pa^{p-;Ps2}1̳Z,7WRl/[']+9Lf{q۵(%-ފ:RǛe8PGk6!Rb I+OSb< %^ Js]SB(9og4ֶ487n7a']nW݄ `A$f#cOd Y,U{od4'.vO!}[QA-NԘh(Nj#a}Nᢹݯ@,tݰؤouׄ"K"L1* C9iD,@Rhnvc"p+1S2.]xC+vKu谷u( FuoZi-cAGaHTDh[`aNÈ% bx*řZocUmo ,bXGƈum"֤˥`ZP\8BI@xkE%P*S}Jʯb`)R6}B>zTDGT_Pk ###pcp%RD5!1d}m&LM\p,g'ZMts5Gy2yv6^YzEzNM?$'oN'DĽU=z+Ke^+RGZQel2Vjp22VTTچ/mbW6ޯa^ӞXsKPbB]^Qg%q>5-!VOJ2jTC}͘w0^ &;$Psd4YFIC >k)Yc[Kb"yJ`؛SE=,=~?'&6huacqa2_p}e>_޲>VuXũ`;]N"m92`+Ӓ򏹾APp л/| n'Ĥ|r^DzȳY[\]3Ջ>Pů7_#S2<3B9[/__o*?Gz/,ێ**I^}]1҇yB\/3>:ppԜg6Ɨ ΂Rd40<|xKl>eip ΅g8ְ8xjAŨҠluoNE-b[vQMafWҫ٧_&fo*[Gbγ<&աoF`b_>Lbʅ//r/#z/|{$U|eGS5>O^#_hzC쥟ᙹΞ|0_2 
Arʞh6n4٧?@)hP^c/N >O,_zv "ց_AJ607 ?~ѫUzB9bGl\4%Uk,⾌"YlɁBz}N_f: )c`iz xjghKef,aX/U}LY+{~y/X /,u̸ܻzwlsrڴ}ޝm8Y1S /8ZzR9 Ub¥  G},y9,-8C.P5> Nz ܠrSunPmuVyEW(6"U8Q(-0TTI+?1Piv'uOT|_Mȧ|'~wM=FUƓ̨Ы*V&Bz7B+p=vaRDc&ʌ dBaʎJFVBɕg%clr*PiTw܁K^_d%+AdW.(DYcIhNxSj6όKr$(d䱧\y\2F_Ve#eO$G\D*R;yD`Ż4 x&fO ܻ>ꇿ(}ޚI5 `  0r3W!f6j8soe&[:1qh24& #A5%OJpM Q%E2܅L7<)IUWD2l;2L`8vJrV-G:."cRucfQ>:'XQNHqb4I*exɲǤcOiU) N bH i+妭]r61s&X=Hc\,st meR ~9ޛe{e+49C;,kTĂ[ӌba9`ت2Qy3on2'FQ.NI$qT9j93/§6SC-3I$O3j Kz9+Ot)y݁zEC[:o ٬h9tB]"Hv|w&eSO8SIpksgـ=$J(IB$ (1`uS \v|uػ=O'!RI9DM8RIa0GΨP" 91/m HTan@C5G԰T6w}IţP(Rq)3ŵ *^Jroߺh~U^~Ǽ}%JL96-m_H]@g1zz?օ8e֝1K7;7E)QQ}ɥXZO9GB.w_b1M-?H Tg; 5EM8PeǾ],'ޙ-MKܻA8I׃ŽmVս~ofm~T~|<~c4}\^<\/_Q @1a޷w}HM1AP@l_|F$=߹1pzҟG#>Lp܂Lr.ݞRt%@B1/#fx0Pb5eXW+O?UV%۠Z3VJp\$pPbZa Gy@ d{>Bѥe%bM an5|&U 'W awv!6+ߏғ=C7ozobUJY*aml'eS{ɪHSSXZo&#!a%.TݗO1`1?McɁOӷ¡D/Xzt:\\ź`hLi6#P9U'rmBLrYT7c0 ;\ Wt(Wfw_x rI+! BZ9`Aq9喂kXU[gɫm8Z,D ~~utſ{ _|4vhl80{}5(gY_u%>4kmz͞n2ӇʞqHo!)sLy btߣVO_+^cyecQ<򚐓~!޾K}8=mXo̺MJK79pF)MZ{9&rfrqRFe܊ }3GW~T*6n?zKHbfԛރ2ͺ(G&?yWgUq .h: 38齼_oe7WssWDD[ebB[ }iiGiU -EUz.\NQoS1!DHߑҴDޣDw-?RFf230+oB'|ŽfZ%Vc8e,ckWF@yf^2RުNn%%-c|Jɬt{8ד أu]G 7ĨI}!lszPI8Z>{8P!Mp0յ֑6A 5!{&87#؎&$ M'\>e-m ڻT܌/:oh Ƹzj j#W]p!0MZXɴ%3ےYkZT-{%)HYI奉'w򳟽¤sh- 3M#"I"F;B ڰ:ǵ?rt;+@Z|gt[^DHJy4 4~4~4~4~Pr+#֙LUTyI)f8-2bN:HVFuʨ;$H2$E{- )>xf6G,7;T_Iſ^SN8BϮŰB K3PHS$TQB2K 3c}2+MM|K*||bQyG֬~$tX&|~G@P~*a<,ƻ:TUcAmCP`وjljl~M7WԨ鎛0 ?GG5}Gak X"/8^d3YkE2cR@B)bg/W@6}zy- J&w#y{:ٍoh<",Lu(N_H|{QЊyXYy:#0a={E[A"7?EGY|A6\SP?\osg%Q]ҐO\EkлyڄƬ̈́hQm 69xo!Zg׶W:EDDS c.TGY'y3Ԉqqq% S:(=dk=21bxH}-%gl;,U:ٶRcu_Zr@_nUw*dG#:T[ JljDE>1~sK->LA`;}i\%{*$Pw  r 6v"۽YL]UtOqˎcj,@"Npshh*r'[j%H-Cey\[bijpLh)'([:T%ɋڎm8|p)үę7 T[6z7w x҄`Ώ|(jt8ˮkj VTKΔ{T5 F7wN}IP^݌dT}5?KǓ?"vq덗* ifxhz<%! 
{,ռ{@,BPY !$cRa>д1’l}sMx GImn&Kgljk3G](;JbK<[qL_yUp_^nw^n.{ĥ=\.P?Թq[*L1Fzv9hNUcHzvTY0OׯGݑ p؛M7Zi?]}@@$BtKE &;O*"ƅ$ =pޗ/w$Ue;ս;ؔl5cڗ4Fhjv?h h h hq!firZIZłqc5GDXe:."]wZ5zðzԪx1ꢺvܭ Zs 6@K3(]D2# FK̇"~ @8Ϋ){Sx2.mY[kA(T("#IB/0()~Xbwֆ=ySM&i'HEle1AXqd_Lҗ4_E1uMxyA\z3eW_=$ɁX[i'Mb0jʫfA@3wR@q,p{?-u5:_~lQB{ (q!L*.<ʪ!hWC q reNŒō{!|cqLx\XUtBC cf29 vo,`Hvq3(b諭-KYa0E8,fpP}rbtG&PcϤr "p[h[h˧i 8ը{=@o)10 go|Z6:.ϫ.B2U!zw7L7A%{'[/\s &J3`3,u !O beA 0t~@HunK5ErdzeC+U5vu>_]J Z=yoƂbSW56Eo^NI>(o}_i9RwX&&apF!O tJ|y 1K ;"S1O rS:))VSkXz?oal3ۧ9tJH (bZIK04xE0|;|vS>ݹ;Wtj\wV/p}(RFiL, s!]\"'{[ /b FHPۨwF_'~uT:лѪCl@R?7a~?X)#oTpOG՟q h)v hn2G?J= V80JC 1zHq"#\U/HK/COF;4ӝq8Nw;MKI!4TQ c !# lV(E$R$EvT IDOG;}qO>NM:]ϱkGq\ P3 #X# CPiGIH$'䑯 ހ8z?mӆ8mӆaToSB\9=)6Y[!QTd0pջt".`VN/xf-X rnfށD$-!JnCY` :$X}Ij|ekyfͱ4s)b-Ѓ%0, aN,2{!+G |P[FO Lό,m4(@Kvso8$&KH(vAtCIAFxo- £*ZctNJw1:>vʈPrHq")9G9 9j- :}ElvXI|-" c,ݪ(FWv~ن *~;Ԏ퀠b " " nȄ1(r$JʣԂ$ Y`*FE`72Wǚ  id\ϷJZEyZf3ƎGw6 W'~-ĴqF<<._%,F?],wWW]B`~.zW7L7`ʦ{m?T n?8OM>Vz_kA~Gw5Oߦut&e-nj{~Ų9b8 ,k ﵰ S}/f&/唚~kSq!\V|2+k ~4{Vϝ*"&_K'oX~ s,!?#3o.Fq2/V, -#ߙd~Gw- qb/nK!A],_T*]_Fۮ)/dUjQKw_`AU'&|Ja,ٶFL\JtuH) è~ i6f˒yg5B UKW DIvB-#yr\ %QHI! 
\ yaJ]A2IRfH׎&O ͘orU؆P108Y1/#fIf@" 'F ]IXDbD_2E h (}m$\.s:}L\_?xfp2 p:fqcbGýQ `HMjҫLtǸeXFYvS-~5jOmPNӘǼ4?o$Y%ܓ% $ӝkۖ0)ԃ`*)Fl#UHr/C 9=cE:HrNPQ(]oE_`#Ezt;IW(XI4JS++!$: PaY0uRϤqiI 'TB$0MB}f |Ҍr 镇U[]I-@4QHSlYyq -b0AE}^mBHpn[N9`uLT)Z:8pe\H SQ'T;Qx 㰤\Jv(שXFL&nqy`l[v6q#򗭽@]uں˭+$_ra0Ŭ$*޺~!% y!ȩ$Πnn3Ϙ(C-8:@ G+W)H,CU @A=%`v}͘㚖]PL(2鲔pu#pw XF<{ʈJ0"xhrB(Ȩʆ pXvơMݪ&|G Bˊ( FJy c\meg" LBC겗)x 6b(ƣBh-%R.z*bβbGyI ,Q<#Ѱ8=^M[*8P(57.^(ۘ-8h ew/4˧R\zʔ_CN%ʜz SZh3[j3 +xAR ֲ}<~25k"ǶC۫%>gI6_vWkfV4zqe</z-r"1n1g/_:$HO]^.(((FJtY+ir*@q6C-F]ET,jK@ηZto[:R) &Z1%|:Y;vه#N[<>dv3Y=7<9)hALMS;L iZỬ$k^c N*ݡDNxOrܖ^䈣_Ou[zwrkP׸y?r W-Fg!U݋TK1@"F4/OR{TE&1JIιEٖQZ@(h Yʔu"EAFLs1,%r+J/&䯓"kM^UKzFk2ld'D ~*o&r5@t6ߡ5g\q:EyMi%T҅l ^L1pToJQ[ٛٛ)pJwCs*\fTBtJ$7@Íި0{xuX vT*f)EVy]U z8hQżɴgw~03z/Z4xH  [OWi% ]﮽[1iM4W tNޖV7P 9+~8 h<i^_Zw~+Ipvy)U@"ǺoGxHQWF<OH9hbxKhrhAI9NKX8c|Pɍ'!B1\iJ= Q[ 5Rsi|i>g9Z' %J.>kE'v]>n>/ɓt9ȌҌw3Fb鏘CYϡ݌>}s.)5FTdƍn6! ՚҇rTo0p ;Y+BOPX+ ͩ\r'D8HNUN |6k SB4/|T 8FZ&Og@E\8QԚ1Mt"* 9ɵQ jF}4h BSLZ?L־:;`o@ʯ@@ ʌAH8@4#rw0)"`C.U,wgT|iw| WT~!OIqy3* !T'B1R׶|T_j=B[qT͛9zN5GӒM4Q\rcnMQF=Յmd>/![DQEh}~Gxy1/d\N-t|^5.V(ŸFʚsϖJAg;P5e=uFK?ݓ7 K*st*jG}`oG({}[M ?KO єn"7/7N_G4z;t U])ǣb11-Hn`БzW[xu ?=W2c{RPqF&a%Y?E>I@ŋ3"K'aK*cdsS8&S!Wr|y Ù cJOhtB^qu&G:,A4&%0Ǵv̧É:C֑W^,?K O^\LWvw`R_T9u%D #6C?IX/'P٤]pYEſѳ\ɷ/V&n~{QQfKg ~yqVjfAwl2iË)?>$%%ܮ4M/o11d6Msq߬LvKg%Jpᆕ' 21r>}h 0`-w1C3&sd|m}/Stܻ;:)߫I{G 3"(tjOhNh.oQ] w4b8^j{' j0xߠKdcx*O=ety {/f{K>Nm7!ҷ\duLso2/9U mG}_uۮyrM2 !>W`! BA\2U8xf8,#5J;ESʒ(0l./ ˰,}tʬ[1Jm X}G SwEu&z\:Lx 2A}7ca8c γ<c ffILQB@|VZ xkᥱ(d/\ 7:a~$`T[,;_=͠u^P:ޕ+8|qLFL. FEqY1֏ s.t ڠoWc#9ho(83Wn/lkt+¹.GCI5Ԟy2fJ,x)7B4IM{%[3]{DGVc0Wa.~yzA?-']9sTl|1++I4SD|aHža){>xf؉|>f _j@{Q U.d\cy>xq6pgp@jB+ӋimN2` ?߷D.S{}.}}$6"#gr\Zo)H0Y.>[Kz^,Wwo1"w}ӁA&8+\<-_ B}Qژ]MM22=U^].(L-[Y˙" V#XPaʤ€ht#޳PʜOm {3;0bNwկPaAP>'1R#yO^h5Gvوb$DUN^i#H (I$T^\HN0e/-o2E‚)SҚj C=0-4a8U>L[*c 2S4"H[) C դk͐$+$GWSv)p8$,G._Rfq=_xI`{1&L*]й`0qzFWny{蝂?_i*kS՜HnSy]G0q(ES遪l'u ~??Tߩ|$@9%PQW+l0*(2,C15ݢwq]-zwޕw!׈:( LX=aar!H1,[b+WH(:DgR=QW]2BӭxaG"(C! 
Ȃ+`@1F:4vyx% x*{WたvovonRn-_7C;}r?nlWRWqoCTlDŽjnU d i[6T Qd س*%-GZ,e)콖 S&A Z9n*䅔l \$T f(Ր܁L VU."crfI\lP%6xg j/q=$B``|@Li N]O*Oq Qp;SDiR]h8cJpVFז oSQ5sGtE#v4$Ҋ h,}!pH䵢[ $Ŋ?Tb$F8.L,x<2.樼{W%'K㾻sda fr==mD* *0wHsIM/)fUN) X( N1Fp 1a c [J ͥf&y}o}y 5z12 w~ 挃Z E유jVh,I&F>,S=+GAfJYY@pW 9 &R1\[OEQ̅/(5(4B;5x(`֏r# r0K |NmS0 "1 P $ޅԶcݙBh&,(M\ %*=hW@cΧP>}Ф- Rّ6MTS  Kt׽/ޱ6rUݿt!$kr8X^>RA5']FIl:}Zi/0^(aO +Ug율&w=C χX/-اFN_y% |A'.t:7N&iraNOOϟXyYĔxZ~5_s]L j0eroƿ,Zf 1k%QR"0Md\RXdaŠNq0/uuxbG"geCqnJ7\fө1k ,P)u@'E/ TzK,j Flx3u4/!B٥z8@_0>rGjSdN d, [_Ru yEÊSz痪5_)Eܠk+H$VX";j=3R Gvw뀡nY@Һ 9xNje?'岭8pMֺh0og ik5>5Wo>[io͋r0 ƛKe舴7%L{ݐ^6RH;.j9B師zg1id 0۹0%cපؤAROK˰M'w~XOX~q_?=_:f3^W1T0#.FuIbHUZC!p-9:fG[pJ\R}e=%A"-Ma]rno;췁)a6߄Y[X$Ş_˲Ƨs/aaƟNsx%QFf%~7oc͟WoMj_?" 2>MU}N?}0cՏkWk՟&ƿ7m<nxbigWf_Uo ,`Ə[o.ԳCw0!!Z!\uAwDƲx1dNRafbWƝVcB$ت (AxYޖ<̣lzMտ?}7[Z#?fc^-ݨHqe T6豏ղG٭Z5w嫟lc?,n Z~ُ~:*Mzd +Z3^:b5[&g4*~c meb]yibC*6D!+Bb$2 [0[cpApbgZQ/' XA8 oQ0WNM}IDjYYrJLۯzR. ̓LܭjT-:;5`Ղ|=v<:([;kE{V۟=s6* yϜ_`9x,6%Qme8H^!%JYejg'Qpy_֗xG3Ws7gߝzk:ad3*HޖFsE#b[Mx@~Wn|. ?xCN?#$r屹*O|vC@!i=VDltKH@A JEVod(A;qԡ`sp:Ehza Za:|u续|3{ ƍEb%R!#X-nC`C]{x$ wuN!;}S$D_,hz vL%K $KP^[^EE@b z`վu 4n|c#Nbf2x4H A+pRs0&;NJuoׂ@!{`{9Bh:B@"GDu@TKE0eCdJ*w@a8۷7U>rh|/#_E1A` r(dcsV !c\nZzDo$^6Ȫ զTS(Sﵧ1eOCO2ϫsR!'%]S~c&IH~魩F R7YσPG:{} yťgG(D1x3{=pfׇE>2kEij3#3ti#侺^N\ᣟYpʀW` u{iֳ:\~=.O9]~A, Aܚ@kqcaw8e;=!RΕ ~+H|ԓsXKւ"E*醺YzEO(ZrUߓ= j%.AP*(P8Q("PT("PT(v(ʊ ☪̓<9 KşNGYnN4;gV skT.c2mWeTW'')Pnl*ϠB* =yb8=dqPm QD#v*Bnͳzl`K-@Vl{ DHkUwA%+a=)`i+ p=19L†P0azͼ`<1cw>2rc䁑$.~_^%IFKn!j!:QC TI.أ@nA Fp,Vd  [J ͥf&> LZk*y[ 09trYC?$pZ q>m< |yh(F$R:aL ϛu0im gn1 `'. \ٽ΢aOs~vӇ:S3y}No^΢1r? G៓7I,|G3)#[xB)M6n6N_]C r7#!_&'FسnX2p;n<":cԱnn )uK&ukBBrM)JzY7BX/[*Nu[q&`*Ŭ[2U[hLi}&T/[*Nu[Q=[2U[hLQ==FQ{-UD'u:֭}kYdBZ&$+$g!{֍a ʃ*:Fr[2U[2.,_h<|?ͫo0 x76GX!r/RP,Pyԗ B93_wꜙ.ޟw-̷{m#Y!;Ralvwe{E]8 }_EɲM,Ia81%w+ޝ{Bd>rR^"^r6g0kFeK\$ϓC[VAz R YR B*Rybbw3n5eOcs b|wӼxsJ0VS^۷/f߹C?8A,A.  B8*/?E4׃d\׋'`^R7ww5.(^yOfy7ρpK(./;oz69n nbr`.7ɔl }\[? *B";Z ;Hߙί RT$H<(9M-Hfe Kfj%1yc܉D2ǝQ9f6wŏ. 
*tp 8?!A=5WdjR ֚+jvTʀuj⡸GL^!GLUuӉ[L"@jZfkWn}N4mP3ja,Xь!wU`R(3ZD$iD&M,aL8~X$U|ZQ#V(P_+ U+j IOՈIѾ*h;5#1ĄU`D$o@̷JRd;j$yjE$#p}JPy>dۻEW pSXV {OfFHlJ1wYĥpAXmMG<% -h%jX+PD[yN}6y *|'P*[ c&".wZ0D(6~4qpi=K~Ǣi%g+hƟԍפ;]K.]0Du6 /g"\%_FҺjaDD qI9e r!ŕ/!v;MavESȄ#J*ƴ IM(sH$!€@H]+𮎽AD YT˹*VzW^d::UMA_kR7͠iiw{_z4;`·(9əd/y&jGJsp?Μ-aي5TIz_f8x λލW٤/.%َ[SOz{fWQ~/8O=#OG-WT]<vsذG<^`f,P)`Y 1Y<\xNy93z@DZKW5*<Ȓ\%Qb&(묹 í Qd z̞"=9 ϪGq$GuU(+/_*:gLO`A?eFpp`{Ϛ;_}YSX>W54 T-3w>z"#VLdf,>owүW cn%BO$vlK9ֱ):IYZ3T'a?L8[osM^^$1g黸 777М͂RBb-6卵c. 1W.Rhx`<;tQ!UwJotlU:k+}?w3{;@T'{F؀8c N=%ؽڿ љ-(gRC<VUR>սh:;`A/;5c Vg&؜BP7@YI)B#&R-ϫÒ:FDĎh &Xܢ&䓜M>mX"15&I vXc1v0J%J47&$E21&̇47Anj2+g1] R୰N7KoiGH :t/断Eħ&ʷ&kcQ,cP**0֜CD;AO( 6Ki$Me0I &D޼sowHj;05s<`+}*Ubj|fE˯|?]k,xt9J/ Ie5 ԍ oߔ/Ņ(!]'SgFmt3D}agw,!PJk/k+~%)'ajL7/.od<:=*-uH ~]s w]AEi;ڠ"<e'lPQP+Og5(Z']^=N4(qDT;6JVVa# J0~ +ApGYؐ&9!!EՈMk9vfm_j?M q W[_C/jB/| bnbe4FG;|dM|F+ȶ\Jw;k`dA]U}P[*~ޘ|F $7pbf3\nhn^ws D5tNYkl?eu~gfe/4I˨Ǧrwl̋eJt>.1E!gv`Ov9(7Q2ܦWdD~cC8GDa#(u~A_ýUiDCybA[@_nUEQTaWA1! '? Cu}k[T< &dt~^]^=D '^J*SW^XЍMbܰ>Ȕ"NБ j@c NMH (Ǥ{ }R}]/'Q7&&aj9׮Ss&Ѧ<օBl U.pt$8 &x`bmzmGN#?ՈhD.5h ӝevהx&Yg[f+b4(39⛕:K ح^Xv8\h`< r[ `4qZlw?>7@$pv<7G#F>χc0It'+@]33!itOG>(׿J>Bn23&Γj zAR R^6hRV Sn) 9CVô&Pk}ycgZae- VI)gCArtܷY[ ||tW#$e0J!bAMujL6a]u6N2+.F7= P(f6|> Fa+buB=-y"-F@k_=Ίx`Z`:DT#B{`gv3Ș1|^/ד>YIԤɖVO0,NܞUrj2Lm)PR }2R^D 1R0BB?hXhǴvvHxT &S{nƐ<- ֒#տ7z8K't$CROE;4mՊ#5tthąVQꚫj(I$$GIdT2Ɗm{AJJB]ʼ"!xw|{|QHBk#[}/):{:s0d);!sచг!cb-6{ giy*9(`%8'\W>]3'#D0WK d#GܯqõSU ׳޳"(>YE[ euJ Şl]`E|8{޶=b@EYzW3$%%*0b̙9sCҮSD;CI1x{;)(u lM݄FQ91 IhRdX@G7K г,{:+uւ0@'q.p^jj`&b%XuM^'#{2mU3w=`^7%T({_w+gXjB!,wL |qc4 Υu:&cIʰlO 9 9:O5R3c#<% Y֏߾|y0%D 1O`vxFUAO֒:HHTٽ+ l"UUWʟуI,%zxmu=%cR(`qD~]6ijdKawh6/6VhU߬e9e_SA t`D(Nu8PPuإhmh4By5 ݢW)>[S${G͛/GvjC9ٻPeW@̋]^;r7cձ@b5жTM5RR`k>k-1dj _?̑P|x֥|hk鬁ᘲ.* l@}5YC{\"o:7-yjmg?{7k75`u~WɡѸGj!?@-Ph@}^IƷZ2Ւ񭖌oq67Դ@ρ/ )WJIG pE! 
PF] WMm ʞ鋮zӾ:L_\K I \-+kuG˛/hukPXz W;Xڨt}1@1m˨`vbX(g|*oÄ{c:U%S_3RfWwCD'K%@BNL;3π*y,F^޼'{3D;+˓j@)dy&ൣ)%1IRjƋbL Pb3N& #i羛b`y71DZ^WmCv{qhR$!E3x=]ܱcH U)v:sPt`/x㖚PhJ;+şY,ن FKǑzIm3X= WE遀wݻFop{{8pk/iy[R }vUB3~;ֳCb@Q]Ɣw! @@9xI3Ndzq}fI69;WZ^zj[n{K5"G8ރ6\eUW] GY B\gŰR¬%sIϮU{v^|%8VZ=kf9v~Yl_#TGBhTJJ@ ų)ZpL$fOʹ)=aa";(3{1^I \9ņU* 1Jp\]a\$󀹾'$i>PIvVu y4[3m7EP;̦vHuc;YZVDYl;Ji[`_dv/e=7#"In]/3 KI@(d+13 0DpB5y>HxZ?I *̧S5?_av-߲uhY[/=n9NAqxa[=v1wO:m}&xr{220ޤ{ۛ՘C@ 7zu'R|Ɓp<y<ȐW~_W^E.G#%, .$?=wFN%3NW4wˉjw`Xo DHwU: M19,Q !S2vʶVv8qdhTǓQs8#vRvJQd`s~1Bq[i9P? &RkZr h~kJٍ!"I{*BpHeiсqʍ߿rUr=XEU#&Wz&SF=(ٿ0g~=:)G0jVgٌWD].z'm8q{;ZE3%k጖Ir,9Jֶo\0@;ˮ {(RospBl掿 t_nU\,@[NzNyC++,m"~vTxF:i_a^zz,եO]R(ȹLck唒AKa" %mE#{"zRǗ2lyvM*zq4(UTH<4ȃ<:A wG%'*IGK\ yۢX{ˈ>8,@@Lm/RS',t]O]Jne=9#CB9!>2ߓQBy8$!ʨxjL"y} ]jB+qr.^4_W Bk% Qk\,My)+9Ъ8E,`Zrf5VOe#T(r#KNdVKih۹Um}~4A mؕk"|h R͹A`w{Ab `ͼPydQ`^zLž + 2㻛K%!J=)5[6Y`kKu̖jZ( 8}J^;^SJ>j9/4ԅ>@.\ 'ɉ9o-6Εsuyr*QQItfs!Պd<%I*1 qI@ E?Em6}Dfz._~"$"]86d(Ձ"N|k:DRrp"&0CE 1'11`?8P # nkbH3^ܾd)DXc[6;n)3w_f_#)uPGI%i!{o Z66*N;]˰Z"sQK no?s'K;j:KF'ԹjıV^ ^._erfb";}/K%ܷoToR/t .{XXa]HSXT)e&هyC28}5;5'*NA|LY͍a`m:gYln|Ypؘ̻No>މބ3r/}CYk,gףP޼xz ^>7<u&ihCm^JTPYmz5'c ,][Eg%y*ҼGGi $n>;c;ouL9ƫcULZl2FN.8)g. bn49N)% {׾ `6 &PH^)zقv=O 1,WI6 a>ӹj {.~I$At9 ِSj;u@4K.cSJO86sx6K4ER$Z,LKu)!8&JK#wzҥw]ȑ2 _[kb_+;>wmqJz[mw!yZFi8Oqf$43vovlHWbUXŞܽEy1n6Y6Kɇwgx6ho̶-sbڍrؾjҪ<ςfs{ze툽u*.WLf9p[{z}o{w~ڕ."f}^Yʲ> Ys<5g JhDEcQH_= >=ޗ=?[¼xAτ6F?Vs?yS\ vFZMVYQ~)ŅL9#yZg*e!O:a^nm˽;BT㧁*1E#釲PI}(A'\^*/(o vd3*^Ǜ݂œN f{0Y:yoʺ>8ړᛓ?gOoɇ/hO7h0/E{zmpn/NV񈵑9Rɮzgh>}Yr{lYޝ{iv^c!9ִLii)nWԦ"I(kDN%xs{xeș)X3Vzy 4+.c+ W2*Ã%Pڰdc#'64++L7=7PT0_^D.-Z`R( -F'F:[q2̄΄ᒨѲ22x@;^^Q>iZ{V\;/hܴ* ` |K\R]ًOh,;`O#ηv3ι^pw _wڌ3T7.l=Q_yt\|J+rIr2q%Ѥ3d j VզnO!5hQ?^ݱ&hU! 
J$&v~gsЂIEһdT``4zRYXDZ!O0L~T#2oML:Ekm40P2K rט[$IiFVP>fvW#d#5Du (M.H: fet]α(EifTH.b_?ZV𜩎US|Zyڪ;zc T8EJZ–Sh74[-%S.mF yn6vhvBB"Z%SF-n,J-%S.mN/(>vhvBB"[e_S) =?{U^WI3.83b`u o5KDXͨIpphC ԉƨ =fe_ce'@8'Nj& %-AQ8Vӫ9'g*XVU$O !81yP#2v1Eq3iXS@ *q\g0Uo} PP2$yPVH>ag9L\^)C +|& EzCp^ՌZlE5M5vym:#؎ATu5@B`D[G,}yNfuB#POV)o@KCҬCp_&4ERϟj0 ([ɲ.0SBdۛrX4Z.)^etnmHht.jB7'~2 s4SPF du&698Yve' eٕ|¡,;Y8yyF!jiStJnӧ\EWs9 NgvrQIHvc`):JGG@$֐NIR;aN=F˪K 1tzbgR\+Q3ɄXHg'׌'SA5Xz=kF$A4f'y`]2N*3Q+KLXQ0ɲ i.8W4jSgҢ*՟J>Ee 'o L\ӎuW[w3{V: poqN@] M Q.thN!P{Z{mpĀ IrwB8;'t41A}ptQ#&5ýp4 AţQ`Rc=ĠmDk**GH*d DףHd" Ip<#j4h&8b>ઐHyY%1Z %h:C.(0nU#*m sA=PwJ4k)Qbh9|^snՄr,Jg M{j[+ rmhWQ`5q0tFI0/ WnBˏ]dݲUl_ݾٶ4Z(w~^ًe9y$4jau(ფu52^>j'qubon߾\Qgڅ ^>e,x {XҲ2`IAYҺۏ%OaK S''ZZ烯ォ]SV!Sy7 O=Vq@nTMӑ EWKOM*v 'TK%nރ)bI=DŘ[ULi}3}&-FnR@aR41sY)9 aVn%`t D%,6g{8ѷWfoʴ:WO`Zg3P!vpɣ>(:Rs47[o=>B!H1&;9鏜tvI|N{:ۛUο2A7Ͷ/ "P,F.~+sQr-ZScR~>`yn ܭGEKF [wjĨcNB=]mj#)*UY!E9i+-ek[a2)$xZ10K;&CPzM-12?R8 GjM[ZX+xL(%0&RtH}  5ނx$FEj4+ֵ5{ݥ+5d1wFS>Ͼ'a^BcI`P4~__GfL^}U#0jٺ< Ӿ8Aˍ]c(2*Jqط?Fu+s $WD6}IY4)A(X{=\~+S˱#Zv@Aԩ(R4}I.wwzxǂ=XR]"ԅ]"d Jri.9mhA^6rK}p\$풽"]}%nLz~uvK0?EC.6"Egs\ єv*ʋ|n+ hz7nI qs,o˭%MAcT\~3z5JwheOyyO-G2-\xM,m߈ݽbnFOi1G%UCErه&_R78 `%Em -:=02Unϙ`s#Ve啍=6Fjщ[L#8Jԩ{eTL}뗥:WҤcÈc%u?FIF:L=!ջ3cʐ+ b(\v/%WV&Ǖ=6p{OCPFXrpG5g3 ^%BAMVB> ޱdM *&geuB%xF-hɹϿ_W0 $%Y%%[Y%ϵ5[SNJd2k@\E"cɐ .B}ЅX"̊F3 jh{wiT]*m)؄vr?6*D?%_tj:?f\lFOdL(CCwgGSB)ГZ#laғn*9Q_XJeg!h'="$)36j>R%h!"Ih1)'h *$:H9f4>7dm7l#*{(z.bͭlܜ;޼<`yF \"%)yPYs/C/EkWZ"hߢ)hG(u AOZS zҳ ${T]7h @Wn}h4ʻF&(,* lD_54{87Hj֔IӨ7[p|sa7曔f?u_̰V'v`Xo?ݧdR䌐?~^o_F$ 7Η }wzίoVMۋo_OisB7?g. :JEq|&pm5B<<}rę ;w~(;w?M{qE݅y12_0B/̣NU1_2B̓Y/%#$3e[cdeBB\_kyӺ3~XZp7}Jξzsw7F7\%?9seyEZ^2S|z/nkS~ M{O|fˋfsΘX5~NʖYXI]A2'KT 1;ds@bMf` ԛAeW}BROk]k([gb,Vsr=/Z|ƋX՟%ŏYTAK+np쀏3֋g;5bx{{,Yxkݮ`ӳvix{Yxڮ~f 1X>&ӧw}jZ{dA&⯦_ڨ PJ>q*tf/B|jٙ]Wwfi1p7ŀ+xr@]`BΣdh~G fDMd{>Q^]ܹZL"Uީ*\\~< kpʗD!b̌PbE?!t%o^G:=s:Ā&ѼR&y*Gf$^Z1:,.W\M^fz,)gi!zb(LϧQPKԪf炛pcoѪxmUb <#6R‚"O׋{f8h=pEL1z.uD2)=37$]+nIv?.zމVv,hfa2U> ?;pրFʋOڡ􎃪2٪FƘQ^-\<``F@+xmыW݅3dyyTВg|r7@V홱 ۟b3^Ra{Sxz No2Y) ,BLPBUEnTi. 
o2vKh$67m Ypj]JJ6ZDf @@M/ ښ<&0j eCd v4'W!"̈]p[t{؅Hvw]ddwZ r `BdyHX \RKGp ;QIR< ??&y+"kP/9 #G#ަw ;6jM!31OfJf>KQя^4{%N8m{trUaژrJ;',iP>;Jk{Iv~|hx|7~:j cx|AF#$e$inSy,d~6X5rpV_O(.p~1PJ-- ,ڜfzց~ZA^b 93T/_wBIɪ \*|kܬ[Βg|xiE*89fLj)={u*;q*۴NJJ}Yzx_\ޤ #-hԂ]|akGrɚȒc2Hx>BH-:Rr Oѯkn9\-p`[R]$Uz"|RZ EO޳da_jsw];fpi;KLfv%Hp2΢5y|v ;Pfښ)k\E]yS`F35 &s'Oy(W547W#5{BA+tml{'a`%ߗ5Ƿld{V0~W}a-L(GbX6.ALeLpӺ}}_?V.m4Vu0&73)%UGp3w>s|ʦ!i]'DkRP3 +r|UEe]{)S7+2%XhN)Z8)+3 n%xV'|3h~X-٣U3Wͯ~=x׉Cȸ+e+)vN*X4άtfIGd:gWy"d]4}Qt8x!IP0: KT4ę Q:/ W^'kēkѢnf!VKv&Jk5J[).J!Fڹz9EϜZO_lG2 c}:8QtV).Thjq)΂BgO k+]B F,, Ri͕e2Lm)Y vV H>KF4%Bp9Š`6GIK/PSS$F߶ÐPNri0s鿜ͯw]X?Գ8Dv1ۦwͭm4ȤO5r:cr&׷}/mGaC1gH隣տ&>9L`Š<%9_ȶLg;`~ΎA00"՝ysyICe $'HJ xt1dhL52tmcHM½9V ׄ:z A6R*d@;(#:IK%c6 -UL֑fG&YS":# )S@>%ԮL[O0)%ohGU'=c<9.J2qIqvh{rD?73rQFa=. eZrd0(@[r 9u$QT ).t U'#j:Z!ODy4Eiu>ZE=(J{ L4NFEA4@ar8R2\j Sj`Wl\p0~zl(j(v宲ƲG)Zh%(v Di,Gp)2h IeO$0֭_Q`+` khZ[^5Gk?PV|\А&xIُ|}]1-d Ɍ.8j36 P[F{!8 M1e)R8_{T"sី5d(u@rDg Q::YmH~ب$xNXkwv\lPK.0c,K7E |jNٓPpO%%jc#c$(j>jaSFgW.,'.%zs]&ts˓uDLWз$HFjIy%4&+&8%co _8F悂mmYt;;yq7drY yRU_/']ܾx9+?6ʉBN֗{,ԅ"{fMqqI:VYMt˻'G G#qvsNH!s h6B ȇq1sJ\S W$S/S^1i%B5pWLƆ|5ck]oډp&o*]p׫hJN'y7ymnu2;]RnM来+Ȁ#]?~)T:hVIt>wX-y)Uз$ $MD864/4h<%R1Oe7>gGE!8y0padJx B# ņr&LZ.g pk  ~E!PyR3e; )N(Ie%DP32ayVyjrMDq4v始9@|w{M?\y(lZg'c:ŲĘNޝݞ.YʋLo>OרrsUb8߬`IZRùiꝎ/~~M)&dL#ؙK\uTʼnKyQ7M.҇t\/'C{7s] qz oO=2_z4,(ftI-y|  )35sYWjDcyZn%IJrDb%IȮ딑'͙i~}{5_OcwH8Mg U܁A /ɏt/?˳yhee7\5\] H/%; Ku!^9r 蒐K2yܟ_{OuZ]&;Ӻm-E Q.ٴYu:OUYfa01+ gJsA T0Q ^E_rn,i6Ct8Y{q8;˫zr:tKG]< KλWgvLgXpbnIŃgqkպѝ+v OQ_)"`MX׽C`p Up:`m  ÎwV xCf{ P@&0%(0 4#10:$,7 0xD.I5ه0FqЈ3N΅3 gqʸVJa!֏ :ج{KMDW`rp] Z ?<.ԅɢ"?~1M!b}=˛ևg񵽼=s͙T$!<tg7gKCx8 *}bh9!=O:Mz`K}}Z}fxl'ܗh:7-U6F?)!шw喕B+JH11"7q7O]AcGCG'muw?^sPBJ(tR󖇍^kcO]u?InA1~/ ܀}|~G^,4%bo˅ J˗ 킲=EgpB̆r+86(/՚66Ah-brNs= wAMzF#~Q?)ZŷSG^<QJ!8S`$G2Œk ٓBD r%7&h%B)'C2p['[ AL4^`8fH4:kPsJ:vWzA^/CN}J{ f;j^o\|pF1eйwt,NK*ueæek8g.E iڻA#i~WrִR_̀J_쬜:y<1@0/ EiTtWZQRD ZޜoX 즔2(8R.,\ r XJy;#,S1xz@nF?L|!\"q40xС/T`QbQbbeSU,8i !௮Xk,;\ hr Ny4 Aꄐ`̑2*F=OA 
8+5yR3>MӝV"%K!@!JNSJDF{Ma4PFETLTTRSD;ue'ΰ@%КpJ18U pA 28$8#wLJ +ێB3]3؜?y׊KpB#9sU؎zm?wcvRǫ!ǡɶgc8Bj{f۷YNlvNWlKDeWڡ8!T*O+-Jř\$a I9mTLkgH3E:"RqTE+nfrp(/o'H[aKHDJ[=H5mV?M 6 ېU7UMUǑ5H]ǤH5 )S84x)Z$@@ZWmGEGVA(UE8GD:[~7*o5'Є" t{e׆Un#ɧD7]6Ф656uez}6eeQ7ߩ8{I')Ξ|vY~%pzmTrːl"E~Abwr&ͷ+" ~odfWw R qB&+^U\ xs;xho{W\*B7 aZh1ŘX0y@b\ƜV>`J-ȹA  f:+ 2vKB}ex?{6ZUi/go2By^IB{0\LxQ vk3!8@XQIU_v)rPbAV"_‚ (ۊ2"NI.#)1?[%>Vӯ$p &H5|(L2XApKT <ރ2VG0<,`:aMt|3@M+yGY̫+h1y u|)X'LF3WgѢoAd`d  @m2}'7{ ,Tb7EYf&~n4W(( '=KQ[oLuFQtc.̰;픥یJ)>RGB8Ӻ׍25nN_|:t=q`\"bJj0Y8ɈX)dABsgMc{p^cU"?Ea٠<"g&Tĝ~ ݸ`4frZR !@#/ `3n.[B2,>IV%p\%96ӀC4S`P8y> ^+g RI[tA. N0>$..*pYq6Xֹy+xK@Wh;U;<R zXp0x&[$H/3XRz {PޙE|_\6@-d:ݛ,b CO+M>.+j,Fvc0 C @c5jV;r{kr=S֥ /ug]8}]eUgyRdP؈̂"@m;-XIit >C:$M+,"LRH_I*bQW/$L  40f 1&sQ|SNǬln2z:]DmD`\kLVZ iBE1Ub`t~mB@1n Ч" u֚|^U H*PaӊAZ _Eָ =f4 !!}#2.*yB._)_8O4:wKuncKK?m2~nU~9~>Y~GCg_/>wkϬl$[ABcΔF`Ařp<V ލp_aOj%1$ W JSYltS3 85[b-WK#Pw1R&̚KZ3\­ |*Jލ qVR)rgmUXNx8;/ vj ;2&p:A$cq살lTJ"MakY9}ۄ[;m&?B QH06H#|^TJ^T9[RD!`RN[Rn*fs27T"kT"1iR! }R!i'Փk?ITᨭ% I!ך D ![qH o+> 4 }k˿fQ%?)W\vVVMUfKtr7/9IKhĆvAlZLȗAZ^Prبy SՅCKgV[^&EmAb.$^fE)0وcO^aw>{Wf: ay8w i2♌2a.L ÔkM/n #7vXM(9 tBy\\=to.j{'R 3F!/j< 3X돧]w`P}< u+.,b.cT-0cU6٧:ܴUǝjxV; È}wbRc1|UrŎ; .׉}Lmdj`Rv0Dt{ W{B_|~2`n1[RpWB{vɽ%sq<ޜI\ޏe=x[fF h&d o$p\Ȼ+膰!{dyBAo\veCV^xlи<܎0w5&zDݐ fcsI)u*xy k5貰̞0t}yx~ܸOב gz), ވ} BEKt#i$)4i&'eH;8GHH:*rԢ9J&*CGΑ@JtR avϑqi<=K4)%ǖe7, byɸ*a7a{4ůIB: E5<7uBX|y'Y/hR+"njEQ P jPtwVXx:BteKBPn{Pm+N,^+Z%Q{!]-ivpt8]+DW#Mְ M CVuka\SG]Ѻ{W;j6+dke*%֍s4##*R4VUZi.z7ְڹ1*_a&=q2.N5QRZUi"Hj"¥Uy"0U%ԫ>xp8I ]S;m/|!i\41cQy)v )0E.Jg'+^1kjR50ϹW闄PRBi.d&XTm.qoy?X6)T9DQ E2Ό0.2䏁P2Km|>/2 ou-Q\E0Lz33vg,LΌ{687,gPV"sc!+0 |}dg3wz!?~r^^qPj6s.L; {?k>|z6mOWp20#iB(1pI*\Qɜ`0XD3d1u^N@l8C57i^@zrN1/р)gٛ A?}'sS& pMLב 3['--E!E8G,? OY*<ѵB -Raz?a BL_ #hv@Jj],R}M0fm"M|yR+Bm`5˯~֏:Z^4|M'*Jg3? BQ8ʨ fXgj+=uHK_ ^x4.7Ôf?,Wx%77WF +CM 0xYL`y3 $@? 
*,Z&Kפr{PYfz V@zO0 ^" ЄIH͔RmlHk )|4ۡ|+5zx+ݘLu&$^eͣv9B&h1~m?NL^gOm幡҈?WF baaL 2?$%]tBMcL5yþ`CS%; g^eat7:*fs<ͼh}ᖟ=֞i@ D3^Hi02.쬶/XRNkجEx%Fs P2āAa 3+~Eʋ;,djS,sk$'ƚ'Z0"e`ᙽp*6 N Fj! 8"i  hĩ'*$p3jP*br,vK^޵Y,khkCB^fɔM`Qɜ8v EtrDq mY?lVLtHo_| cӅ?aJEv]|GIIXcC &(GҼa?J2Z*ф#f`6íb-3)̼I&gf,/{WFr$ZΩŀ l'`L40j4)_o5II͵ I`[b9u,@##oZmez|!xj񴼔ʵoEJj_B`KtXlEupԕ6'}.{0BW8.ѠKuc/uNn=IY\0ZV|/ޙD]^6||6;Ho8x(v+Ud&I[24j:ѐ)B rl ">PG1% n$&]0@d' ,u Ƈo pn o{ætcgwa[ d<\QƕmFrFAWhl&Y)HVLޛ"k(zJh)|(,wrS]UP.ۀ(Sk33TD+qM{PӞ8;гp㝷FJmNBNKAKjTڴ=<5ZU00 bْ͛7u>|1-?RHi %]A13T1e>>1|k(y }75_PoJ:on7jR7yc^ûf7ՇۛYf^T4Wy*MUvnE-6!QڅVtTf/}xCi;>j-7TwTG2G Mlb.-nJ~dMNj7xb rxPvx7<Q,NFmDE"8 \UֳܭMK·#ˎ^߽~O+GELEu1p/:}PI\.܃4EMita%#q-x݅d%7QxaI"K6Qa19rֱQE]SP>z I`EדkBE0ηcN;F}ܑkJdtN¸ꭁS-jw Ɇ TƏ(0_y%%: 5} !{WlX* ˤWLt1O8Uv+&~!GIz9.A}R 'Z~IMhdz܋MT df,y[W5P?5 +y*]lX?|Qui2d-p"nZ121;'̭HP=ZR9*MiGHeS@ZJ~Y;ںR_#zWu4JaޡKyp? []bv|G9 6#ڼѪ,K,|?UmQtEX=2^{rV=ąi!%d ^t̺#F?$Np(.y 5#p5z`Md -TTZ:Db3oa,r}IC]OͺԨN~|y }B&C !DjZ" IJb3HZ3&okDpWK،ڕQX3$$cP6E #4)=Xb%2K>r7OMf)ғ}{87>L}_!SYB7wJiuy5mtݐeysJnW(EF?>iӶbe-Y e11(G-7ҥuZE'KU^*›ɥ3y^puϷU-7iRKk`i.oZ0MμԾoH?~ܯd_dAPͅǝ^2E!ћN!c}z~*%@E%"XVJ$.VYuU`J)b`ڧ)%Xd@CZyz3D訣rj `cwtZkIN!CO if 0LI~!ucLCΌyzȈչn'g|4"&B ǚ22d3~RTxS*-$- ņg@mϏF$|D)0*I_>8\\rZ$CgkOg|4R6Tvd79guJ &hs|zxB:k0sp)$SZR"I3Mp%yuy>_&լW=ۛVZ@@tNB m Dc *yV~x\nn^'oo/5X+-hHqΆsa  翵 "|@ .;w)^HEMӅupM8`3GHL# xCɅF2*ELWkUq\cO=e!!LHCJ㣘9)%7D;Ԧ%TlVДx?Ӊˍ[}1OIJZ\}sK!Jxo]4P)DDiw.U0}ϫ9#LO/HIs?_ūtwMߑU`ID@IsdJ;M`:x-NQXJDԦ:^ ހSkr3/_<`0 3I*>(|U\-@(Wи< mC݂а%{EviM {C%J& $mhߩ$a.ȹm|ʹTLQ%ifѤH"HLpT ( @B-f"QN4q/Dww,\Z|tU[$pvU+c@-D>֤6[^e"B=Kk AQBGb CnK{u^ 3ݸhQe,.烗?5fʡb3NT={djf'<,kBi>dmÚ kOE&]lFqU|y72՛ ʆg=UN Բw2z@P6GwgU}Gߐ:5z-[0g-XK'#Cԝ>K x$w6atf -9+ZNLY  ,5jME2o̭-BxYS4a3b\mO6+~,lLCLLR///v=Y!َaYZ$0etGp"AQfN+X0\#Q1}}pI~{yn/U}˳wS3}j>$Ega4T7=lzVkd]{Sch/3B*vFBҚɃtn $gqԇ͗և=hnlX#5m-4kVL(o+(>E :HiNrk͠@IXTZ  t@pʐ\2(GE2*1C oӒ6P"T ?b2#M&Eyi9(ɢAaWIZ]+Щ%MK)Vam=ެa^?wN܆WͰl#3~b2K:V͓))t)e, ,*o\Z' 5BFcdW!TR4C1*AP9~%+(FVc9|a˹ l42&)x Yn5ZFJ\Cd;eDM2v9բs崭c+mHEˬC)?3pˮ!ٖ:,J$"UYU,zem[ʪ"22"2#aM1N$zI:(9(xNkanL%AIaſ 
*չAl~0$U#)%(Xuw7<ۅPևw'`OKpj؋M&0Q=^!U镚#JQe5S%pe` #g){XjGcC4BQ !ePL(`ԕͥ>S/̜hH.nX3"Hc,'K)JFZjՈwYI`uYFrBdip-!n@.9G;^ۤUL&r-,BԴÇ_1.!e:ĸLCjh9V!#pᴲ 1X >*H(aW\/;7\r[/6Q w4ϊk&t Ԅ[S+0UJήSA4Kk^JRP 0ꥍ(yO O.Ug'9o('^ãTɊ-ty`LKDƲDE!b/E O"i&+R)@жc Ww*X9-h L 8jDbaX*SFNМXBx8&(%2eGG!lwE@9эXOmG,O1#9>OVTU|JB`[)ŵ`z) cneGe%` %NJkҍHN{r *;Ȏ&Nd'+m9J01^h 󨣕.j dFv͎8Q $h+<msV nXAAk<$rXXbRzJ,~C0T!fa"8xY%J+ x3M`+(#+?`iX)MpAnV:@/Q*bLKIPJ:5a*=FXN$nG)02)yu!9Xm[IdOr,*pcWd]]ҝHّG{ezU:z[SywL5s][_uCrbAI9OUuRW헽뻇'>+ &7@M30ﻕwepUo('$ݗ,|c̐cDz|gxz`Y~Y| ŧ;YW3}%"ˑq+[5<KʞΡb҂Ў9$d}s |+Gib+*r#{>`+!4w1J`t{5CdV2u$l5(Qzkm \6f0a !:<;x>,Z1(v{srsf \+>G]!JrsE t("^,İE16 )~zb{HS2-SE1ߖߦn cdpಎ[T-J QfNRS}x6rs 0dH1RƊJh6FNWEÊ RC0g+Lre,(\񏁈߫\#KK&͉R8RLx8T=7)$UC%:w9MsZ*V_|}*'a)Z/TθsUlb;Ɣ V"/Zj^vBx>g$@0_٧IR\ztEg&7xfOhƚ>fmIVﵴwؤ/'FSF>D@w͇]gRTUmFUa[Şjk+?{o7BzfSƫH1nAor؄' =QБlm)hn/K&q~|-J=_aҮ!Ƈh.oe13B T[#5%' xzcW:fͩ;mұr8SSRYp,iÍ /s^yA/y7o7BJ@/6lNJ_ ̵ )LN'ROΧ==#8H|'o<ԺL`vK0J2H6 ?HyGV^=KF/ FT|~|x8ʬ;ytB%#}3؅q4h°‰t4JFEF@<^:ox! ~t֠*hD FÄ0t -x<B>`%NSCI|c> ty.c*#Y:e:t(3~}lg[1MSR)m{@8T  ,+ Xoœ0kY(%Å[ZaUj4+qg|Ys9A8Q;9=­!8qy]9=T?ptB)4)D@M)0i|燿$< x{wyo~`nWi^N ai-XXH`Hpʀ#S@T( ߌrZR{"+LkKK0}YکoNnnBqrulIyV骜z_Sמ8-TG{$ `]xW:Q'ԠSN)yTF#HTxNlHTCRI!XP"ZHzb XbwUJ'qdp$=tHbƤѕO]HsgpL,'[S$Vv딁ZukWHKv0I`=2'ԈLY#ZcQ@.8G{*kq2[lU κڸOaqA K@h%T@l"E, Jz DXNY|Y^U[1uEiyTHB{ͫ]Rr41~ @6R:(žDQ-W")*_}Hy1uID.f,z̿ Ň~ޅOapeo9?'Lj-y~_pybN$h~&Qd%e8`H`s^ $tQGE3`+b'z*׍T{'{# d$iH}$\  DlL HVQ(8GV7i ;ܘ/a;U|z0CxK~rue |ʟynjק|g/W94p=ז(iˋ *?~@fW_naqU*uT5%pz++~%#X+BUk;`Q^2jw2FUncA aW%m^;ʭkQ1:`FpkGBMvJkؕwd0vJFF2RnrrVñ%6YmBYLX\+ qYcm($;BbcQ5Xb!z_j 0lͰΎYe$ZVF"A!i8+$)BDiMF0i3hx098&x 4S{Dv^cФ W78. 
P?OP{L>O'&*'Ks4UܝlfOh\2mӭZn?m%xb_=TՃ/Ȝ8CLR.Y9?+,#0`\iZ\hGl#Y<4&hD!yA{-{{]`6wї-c|ϳ_2 :(G[n2%`zsɡL/7/R:|;_*?w8ae i YF)È&1{nשׁ)cwRUm^) c2[5XU>a{wRrO$i'%g"#(čk`?j+TYq6W#%Ξ 8MJLΕ 2A?JNwY_@]P,)CҔ_VcicJBIw @A9_Q=fo:c8 mΏyc.sM.M"i1[otcUO+ *ATLEeZ6澔$Ȋ4m0(cZSO$\$BQC+$ɺjGtEΏc%$tfe#F,.YqؑrVUn`+@)fI |Ueo >PaX7X\K⛑Sq6-z]s\V/ֿlLsHף^u}5uJ ׫<ܫ%5o[XCL9.F\9sHo'=-8r7OLJ9xmp,g(};(VpX8o2Z XLJ]M5KAYscK+."- lx|W/;(kN]L>-Y/ ڭ;lbʱ~lb0j6KXrm۫b)=Kz96ܒjÕ(N\xumϖ㮗aRrİ:NǝOu:Þ:(/jz]xamz EC,@:iYJa"kՐ,ՍUhٮ5?_\ś#/=Vr J+-9kaaXqȥvVkmBBho.^*}F"INAB  G3Vs0НUBK ю˨+M5AzA}糼71s{ ?^!XVXI~UP|㏷WS+~gfW=;T?/~$pm1d ~ }iX$FWKϯwɍ>Ma@0 :TA)ގ:+ٺW}8D1#UkmD]UvohMh17PC0D;]xw[o'U>`4ju3~7.yFµy{e ~y?-uqtTc]&6n'n? flޙ Xjn,k* nnGSL$Ry-?9~t+b5I+J2EHW-Fy [U RDje鬕S`kڭ3Q!!\D))=^NiaGL9wU`DP@\ kP|^]y& a9E9V - ;<ș@$L*0 B U@@ǜ7Mn'tT2l 3W(l3Xrm y 3 % |m/=tR 0DJ00ߑlO>)m|^.9%{,I`I%m _]:J¸ne7T&ePHElyb;]B]VĶ\f$T2KvS\DcS싺ՊvYb(O?v뙾~c吞v rHOzyCПF^[N%IK $>S{EvF""+b];ݥwodYKِ2^>~:;%b)CQd8$ϫ7 mCʪWovG/xѷn7yn$8 TV0.\;k 3nO7f\wO0,9֝- pҜBZSPxP ax`L Z]!84c£dUx)NPI"AUIZ3Ly݉,X >MV C`83 +^@' 3RhPti5@!!c bN`Bi+G&ة|* )*D[BZ偱B(D Hb$:1$ `b, %7sow54R3A!9 s54O6tu }b\n&Dܻr7?e4 ?*s`Dur4OC?4qBJ졅PDZfW1ygOC)N9w`v`昀4:V4uکƲITLzN>a)ĵȗfX-WcCWE6A(h@dL.#sA0QHNlC qj47VSFBlKԕITє sZdЏY>`&>3Ya 秱˨ꉎٚ)\8?R) SQJķ+_#w`č {$z2 : I-Ri `U9@-_n:_☦QʷCn:DmИT\L(2Abh[{vIR|8|Vrv,>?& .E`/9pY)d!~{; lQu{3y7|*{U;r@eUn{|h+w~J.9GyK9SZؓ> 1:it#GSZCi W?;'L-]r73gɼ1Rg9EdFl&4dJP~\YM,Cl슈l1//AMEx na0f!>&^>_[55j1Y̸ojԛV#yqXx~_Z̜ wgJmĴ}t͌eՃ۫tO/GVbpSUuVn:Z#`MQx'7hEtˬfߟme(Ţ،ő0 'leq'/YYhpY^J8Nb~?fFJ\aa{;$e3ihC3JUX-0pWJ7;7^M$I^ U$m\)fhCnDO?YJbӞ>1(8m¡8$쌋k"+ EZWpљWzNEHғ;:)Tk.bCINQP 9DZhSIZ1 Dj#jczOkG/GP%=@ ١KI1b' NAqR/Vp)%o=Y8 "%y-)ax/vT钾jG& G%|MwGy;g]~;8\܇R JSaKcD v>l&Z ^$; 8@ ~KSA+YcAI0eN\$bH1kmt(kt`a8O1>pvo퇥N̫ΩˡFmi,V RYR-Qgd>I|V +%0(u*funyVLm +gᱡ>?=~6ڻA{-zWK2)Kg7/RW wO_\ t6_ O(lv~/q|ƃ7PgEA{#5?gw/G39l=V6(42Y!b<mŁj/9 n5'Bbx=_6h2' L/LD{8gnD>Sڥ\E9 ]w(牔>d VBlJX ̏yPÚex&\=?QaMpm负d$Luil?.S>[ *EK@K=?(;r0gUuJU]1knT+ԉ*?\LǗȊ#CŴZ^1ڶ T@pK! 
+ko^q<|ṯcP)eF]oGW}Y䰤=w ֗'>A@ УN 0I}2:U;0?<_PAP>(|d3=3缒:`#.j`=$|[J5ieͣ+FSZ<W\SVauT߻>Փ@l&]/!tFcr˹z;ztз{KGNNp'v"?E|6=V^l>F4i-3h]y?$A(Q0"uuC{;,@t~%`=# ĄqmGNgDrfU9*ə(s%6s-Ja,N<w@Xpf(@j-]2z5Sr5`RI%#҃k0n <#>L+pR5:I%RoIԂFPi a0ez5lMD*hwn4D(Kp"͐z:q { -fFn868~q| ~^&:v*[G9B\o/wOl/ݎZN7]k-iͧA֮% dOQ>kqG!O1ӽ+^XUX*G5ͣ=HEzL\؟ .P!*BE/΢Mqx,RNaܐ&~ +7 D8YX͈&v.P>SDCTTPL \2KTPT TLR3RL%ir#]@0ek)%~J/ŐD8^H, {`RK"wBL[0FOx~X'||Ǥ,[J5ꍜwݎ|~P^7=~BxL-Ju2T_jĆW]@ H/VNUXU-vYf j)~*|T<+ RRCяU@:#=jj(ZTX_?&EU1!ߋ6t51l u8LzKcE6'독d7l)F^R0]fzXeaJG!&gVR9Q"ؚKCZ-=b`RYz:T.af%a. A)Mo>žPw+X^t pAChqڊ~d>1`2͒Y\_ؔä$ރzL)7Ͽۯ!; rXLMa 9\ Ͼk}^baJ)+*Ŋ äдǽs4w1g=P!Ugci9!G9:f-g0ym統k>3R R-$vINH5>Wy"|PLhݾ}P%9SǠ.>iў8Hf #GS ^-a\Q:)3AIre0K0GYiʴHa#e?^2JK)E!i S8.%KRQ `|$-R)U"JhJ"-\L\B>juQ=?\_KB2QHRWA.?B b#)oq1jDͶ.aJuS y5=k۵ vc#kuh۟U1i(JIL?R [=Si1yi :B!MqTυ}Wb'eڬ;EA+^#:qee:;"R%ʙHRKYYb:) (J)sH0qztxŹ t} Xo'OKY.VA \Mʹ3(GKt^ r;5ޅtq2:pIBQjl:(C`"Qo#óFBXZ|{(Vݻ<_6EWN OtԂ&-TbSR5FܵtC O ZR(gT+iSa#_R\E6au'M% 봴߾ȕ&"&Su+GR9M* h^&Ŵ[z {OFjbOPCᤏڻ 8*9|t.UN /I&Fjfsx|dN_/P E8ӂtOh@&$C(GŌa}!\lʉeYHVf'%KiuJJ)YVh& F<8EuJh RRݤIGd|c"g :$ɦJSĤEvyC-/W a@aFr4kcEO14%-DYi`4L]J.wi9.3^od7J<=A(MG3Ac68&p ł_ξ`{)L(GQ3g2յό~6k.baW*=J&7`{8GG˜nu-et􇪟oA:vdxh&4QBųf.}O`nȤ'C^ܸLnMu [`x7gU1Q=<XyV'-8=&֔ ְn]juJx&$֊!*pZPŮ U6#Q3[mPT>kɟ W2@B2U C X%;vZ+]#6=V#㴯;Jl5 ScT\PGscĖmspzc)?-ͬqhieѨkktiR`76`{b45=`,&58DJCf]=ٿ֎D[Z?.;W!H9 vרBU=bNiD:ё.jS ӷbV0㱵Xks}$NMamdx.$}"h/%㘠xL76`6ɴ,8R!DFsQD]KC^1mW3囋}fXۼ}[U"/V?[<stws0?_\~n㘌>mcWUʼ<_6Wќ)mƊ!fiai*EߏU>_7G)٘MnT!ωd!/DClʮ"?փ#WLz@鄶݆;6:Ju³[ y&dSVXE- mǻ x: u/[ y&bS MX+[(.چwDQI[Ӓ8zMM 쯶gMrgZ8B!ewbIza"QH9=j"G̐CA`5ꮪGTY(EM!EB,UFJzyՑZxlc]M{9en`-2ò P_e *K{4hn~g#;\E_-1P|h'Р/S֗,&*.К}` {db0]5[{Uʮ}=ּ/gJQ(S+Z7*#FCǨ*Ċ!~' ʽ,N<: lt@QB!jE*y\hÇ.q#&}IA/hySߔ3pcQ,D|Y_ uuuuU-G@qYqq&h] UXgڙo@Ye΅y?yII؂6hm'z9Z"Dqx٧0̷|륬%F(9*,CnȒmYV'-R˖y7ay//džqcA-Z/6+)a4+Eb%*H1j-N;iMTdfӼ U/'] 8 bGoJgSԌWW(hK3A_(^L_uB׉y_͎؞|΂#++=+*-*Iuަ 9I:y,hLVLͪƆK"AiVEqxavDAJ֕&h"D8XzYu&Zzl"LV-TIuiJP(: < JNY X H -ܒS,F4:Y4 $eD BXbVhCKhIVКe(Z:iih@Ws.Tc⾜ήq4LJty Y>=7 M,zOxZqo˲g `vx*1Fw7 ^lx0:j~8+FZ,A?eyA 1ꞑ1 XMm&BrXMp} 
+sºO=rR9?FT+‹"tgŃ'dUxB'9d\W7W'o'1Ԙ4̋O>$fG3w5Z}bnL}+7ݹ&chX˫V WW-0}g} XsoEiQ8Qc]!@ bAP<Eq%|HwBΜ5mxa8FqVLox%+-OT*c]js.[ͣਏrVOBQpBkyI=qh9K9GїVɐXw.սI2,{+џg!IÅ)ӆgZFݑhM\u6bv4gܰjiZ!(Z/˿+EH>/ެƏ}MX)sdŮY@#+ M+=R)D=4*ac_&M G=rQ3mfμǿ*细ywӽ-C{v˜Ls=]m0Zvsׯ٘݃(@T~AUp VmNK3'1<dz3P9~gW{~OnQA뉣[ʜZ&V9몓?ưKw-Jtnz(aڣ{ f"垨T:{|X݇Oɂ:3[; PؽjZcW7p$RsMDQSd7N"u+9#g"tY*XSOy hCqjs06{E tP&\ItYh ڶ[z{%ٺ~jX0.bm1iv&])뻋X{E̗kJq!jBw7?yW"=sYi9S@^B__z@&T Jtj !7+ {IW N^ŏN>N.Ъq&<*ϸxvϲ9}9iLf$Er-U-Wx!HMk8g:~?wi, 2{:O^|lC3(?2pclA,cuuERB0xg=QVh[yW+ :4mQk3+I6|{*J ;g?6^$e­˙̨=hdʓ}kw9lCּuˏqBLs:%{גg!P=78JG8J9}zh&tLZv\?Xuj7xyHҝi^D`WjrNf;M4mf6H,=Fy*eQDFpr\Yre,D!2Ze.&U?X,s*SEܼvLkoA- [I uI! e̗kp[7 }zm ՔKgGr6N6՟_?fXqN jeI՘iTJUq7i4xf2 1h\cT4&P3fPB- i>s K#(k=wFւ &y@)nׇ)E47`Q Y .y,p;#DSZy‘;2Į\dkRаMkCGp@xaՊ.ς}$Cn7=hv*r.ᑳ,9n9cTy5Ez|Ģ(˦=yG@rg a; ~4ǵ3_Jڣ28jG"8ZQ+GQ=4f\?V)[?jzF"/ɘY] Hf$}؞,IDRxS:U)u\LV!;N*#%뭺+Co{^؏cLҗDɕ)t1Ғ[$  Qǝ|fWY S,Fn F2uRXFYXbVԖ.5\\♛GW"ѸRX- #@s$ R*W*XW"V[i `A{;źr~XO?wjPBT4AʕG\Z9BK*9eױ"@z{T߬ޒ#?M/Slz[_*JMER1{>/wsFgwa_ٲǦ/Gu-cNcNcNcNf)!+R޶#ۻA7<$>O!J r‡U4ߊ6 O@N“n wC@ .BJbVJ,]DOҵ^7_@Az}ws3DT2#XbRE<̨6 K _,xkefҳg|yLe|P)HkzWVc%zŅ7g+t\SMLW;}N YOQE#\-+S4f-vTWp?5\ɧt{6ӕtn.0  (F Kf"ݐjaUqd w$BJ 2uyD, J=Yo%/dIJ90LUB;(z^^&\mA*tyHz04*= J[J" .)UֱS`d&(1jDTRʓJ*+%RXpӶ2?9WMhds:9i:/W9>fZhڣU݄/~~&tb>[{١> ,/^x0&@">!Qy.+mM(Ğa~IX})u! 1!;Q2uZeݺo8ٟ[WOW Ь5_3ǀW?\$34)"7R{Ϲ@vWRw-lPt$"l\Q5I? [SkRxGb'GCޱ*B{m 9*$+ОeD*٠B;MBnF{Oc'^ W8*fu!ڍL/F(IFGbn(ۓ)?cB(TS( 3`A1.P:E+B"*9gz܉dAEH\~Oqb!pkP2PϢ`geckŽ96f]o9W [`,EȧIv1|&sV23-ew-L0ql,AY!YsXGq Ӟ}Z }Y8܁HoFeĞOhM۟gy>!JQ;+7+ț۾ه܏z| Cķ@ 6\͗W WPaz #C$? 
*st?/(V1u uW48m5Ҡ|B_^eNd{+ FT#чO|/AKA?Y3p« 1J5;@g[ i`5WW }n|o @!,L.:]Y&q1qY9a!?Sbu0O'?}mpӐ ' W ^l\_gW܊+åw`o/}J08|h2ǑJ!zmbloGty#~:n{:Kr@OuOO,CXg 1艗~!%;SbāOklhOW ZA r {Oqtv݈vjK^XY :H%J ?CK)U5벐 :d!jA:*0\T 7D!ז=Yi$D}(%}vW]#OgptZTTs.?Jla 9v2|O&{EEEEVnp3XLh̝\" M2刜q4(B g/q[۳77׋^iug?.-I&:J2lI1_y6*<:Γ,$v!q抆-W#RiIiIVGkcLntQBrhńoBsL$a7VuR.Ƃ,_alJ *!XWi؜i؜V/6J");k "*ֈ!Y>YAFeIlT]'T]'"!q0-O0-OCU>0cRYʠ3d"rO*\RlKu&ޑ-l,]'$]'ݴIeAAմ߽;e/6j9x"~AkLF?y_ei&+@:"t4Y[O۸s<*:5":.M$at4Bɀ=|9|Di:`cAir gfv\{LF \' r֫5)i\W p!znIbդH|#%$H~0%F聣hK4BD֩ Y9cOwj[5>; 1N(a\UwNd 83JvEǺAiX߭=7*X':į'4.^.`nyl#qm]}u>wuꋳ.~]-e?[YJ-ظ [sqCpKY%~+w6O CYP.Vz ')v>/GjjzfMջfl@L0;b$ȵ Ԛ5g `RY_n7>SA¾T(pmW:ɧtJJBgaF%o< X4KXw %a+etHJӉc,K-Mr NHs_fcz:@d;Ҫ4hS`Ϫ:AxLXB!yXĐiIHV;xt,2;BU ZyK|-=PBYyE3#͖Ƨ216ET.(hN%k3dGjVε]Wk 9 $LeE&t& iHD*41T&5՟R>YeUIAv.`]Mvl]гEl6e~ʮήHUM)>}o!@ 0 \ !EvFq 1];lnw/jh̓%/n{sƛ &A4J!@|rDE#~I3Na@i%xy%3-cV6t?2˂b(?%&!&lj}:_NQF"R]`y@9sN'Œ6&ZdxS-$P=E>Ãß><νrdq lK9ˇFcOov:Mi%]r }[_ܔffl/(ž;_[F0tXeO-N?SE}_L3w{;x9]B~ +:8+*uu=YX&Dԉ9=2ms6&vZύSR0u-5L<Iw x1m}q3&o~r~ZÅG&,;F-^V/̩f}mY{'ׇS}ű 09D龪 o/׆R+ C|R!wvE"s-t00[J7R;K 3zcbRI'&2(117{Joؖ&U*IM͸r%&wjY_Qwxl|]vh.Gw hx]hT/o'eԎ?z({,$c@#\KI~w~y)vB=_DN~}wiZK'XЦ,P}Z#=^Ei h7ڄRIlveǚAiHc`x^J"XPeX/<[ PgJ*l+<8 *KܯAxTty2].wɕ6&$&7_>-.sRe77wrMD6}s7Uū J8 wi@j`돋#ar@Cّ~KGekR/+wgs=C9?teHcmfuR?"&1huFf $dIcu֑[\\$I NXᰗ^CMIrt踎.H e'tHȽTt S9%yNR'ؚ˖Ii4Y;DȲDJA$_d )f: 42s.H$[$Y{Z@` 4:*5IlW Sk6[~gm_0-[igPJ2x!b]7Ap׭x*tE&m} K.J?[7(mOwIe5SKR#nR&*oD1\4NZFވ+Ql%o%ȎV+o R7I(%jDCiClkynPMZ 9"i忭" UER`1Af;fZЀNoMTdIff&ܜw[7(II "/bk:S&|rFJK^N`t6:S&!Gت6řfP+gYJJ2tHXc k-g  ^pRP CmlvȨAw& P.xh=˥^߷#wZdn<{Dnc6LH f1h( f.eBmC,gxThCqM9#ϖF!r& eNFFf-9ODh]0Qcz.2|m]Jv RZYkIXCIB޵c"eS2Hg032,b}JrVvYRI,nNn$vT*~ew5uմ_ٽ,p]]O~Sc9apc.=fQ71a9"7+qX^:04T:GHRSn0ӛϷ7 ?~YqCȃmw?g٦ݧ˛g/mt;x2TIh[ í1n(j[1o52V0SM'Ƽ=ǪF{vp6#b^r"/?3x >)T=Ȧǃsi3LҨGFۛMц2m~wxI@H0W8X1L4Q20SzG/M#%xQa+`riV 5 C^y5\yй~JTA,vVZ+@;1C5oŃCOi@7J?gT<2KZWEcۻ~ Z',hzBJ/kX](3esKy>}p{sYVjo$1%SƺrXISTÊNm HȗySm>Z&HeGJzG2DCɅpWӪe7yd)H*})L%!-SCuKAИDynƼ@8!&1, ҠõSUdP͘3yGnNIC.p kN"sE}٠ ME\aKg!cIwP1nj 
e>64EjGZ2EI,G!(RBK1 }rT<eV"p\QZpTP5 Q-IP1[j l=7Lpu2y6T QsiX7c); <`%~O%~\DSdJ)+T! MEd EdT!sgrXXW=cVW>D_|}xJ5`ʤ,@,~Q 71eE; ^g2]̂JB"1އK8]DmTOo"*9l"Mۨ>mCp{9zgTMSIvUցGf!lj,|؏ċWό"?6(.15z/: c/[In(zNEFX^`'PZ`/P06z"BSyb6 _Jڱ&.erYg@9#;n5X'(Չ$Gr-Y@*-%mOj] T`UqʧO#}A=t $Rˉ(vq[!5;=)bK9ED0Ɣ!*q*p3UP=1DmLaQǺ *2rƒmVA`8u*t9u*o} YFZg|N8f[Seӱͯ|z-m(^S!4I!u\Ǟxni8# N&Z3$vz1W[w 0xwM[n +grٶg$gV "g" e B`k$ǟz,,:67[ᑦ"iRRz6^Js,r. 6XږփPKĆ6O襗@B"$sAqA\%9LL!sBnURXC)KN"Eٝ<9R!/YO?]L*||qw~mņJ1>Uc Gegj:)H NfSf>уX.?\J]VZ XhYC4h4Pț1糇Y n|NcFc,tڼK[SIV5pdBi<|*˯unPp'{(-ɴ.%OmDq Ϳ΍]!>U/_[\|s.;id,Bɧ_ciJom0 gOt9/.uc4 jU?"Cn[PhӂZ| |ue6>($, 1y"qH:B !>d?X5Wz̫KWzߢ \kp)P/' Ǔm~X80cN;?:j/s[w:ɞZrBޖv 9GBbvu-ʘp]~:+;U@M c_{-kc^E$bfevTL?BCyn1!\cHuȭ 0Zdr-sQFH[0'jڝTכoB@c=kLJMpQ./g F})`r~[M~rw뷳'xO)T#fx˭ʇܞݭgϞ->--~OyxabPVB ca^kSB ޣwW vR@1*V(@ l)㹑XY[rصWwpá_DO}wR$U-.M|ob 7j6(n8+!M5|'qF`Rā Q(̍g^JCgk8=n5jB{VDx91 ' q %2עk cLrNDJ =\?]=>>銌[pTra媵] mrՆc]jmE`''Rɻ~rYxB gɇ0\AOC}#!?sKKt$ q4mgV^ oϛt,g,?dfފhVVNEH,c9GFIB WX)^h;j0-5Z)(Y-)I0jL~C5]AL1twqO[nݭ Et!2`-Dt۸K8\u~"]ZSZ]m8e85x:sᵣnzT?H`7gw./{-+8]BA1^~bF;\K1a'3yB/|g c<ݭWļ9j`^@_+& L)F2)`s%]%%SˉƕM]d}s,FOᕵ#'4~*C+$UϙuBjY 嚌QE!Ml )Sִk6A'y3'C濞I5Bgߏ.*}xf6*KX|uu_3Wk_B\7J<(T΃R9+mk!gQ9Fjι"ya "V\BpLٿ|>菷Qz5.Nl! 
N/%~p61%.!QoQ (9$QJv@+Ĥl}*ZB K)L YH*^ P"- j%  *G!n -YXJiTKTHdP~9zs+;Na?UxM5#XN!6+1f܁oEfu!!߹&ɔNn*#jT bD'u6m}իEEj.$;ѓe!*-VzY)^fJNw;@%z(Ƙ2T/Bycx2ʞDB"҄~mLN/?ܤJ7Q7J@vQ$wb.rs\53f[S/WCbs!H}¼aI3IG*IC`Z7huuQ1P}WżA\ɤ[Z3u/R)Fh*.$B}ۿjۀfJѫt|SZVs!v*^WJ>wWބX߻v(s<8 1-+.7gҞm> VgS&/ O>xh1oh<(.EiuH V7pRŴO_bk+7Ȃ @`v F}6QQ `U(/hnAq\\Sm EQ8icXjJsnDqB T$u- BK:Ths@(CVjŅp1JK<A^h=*GiZȝ Vp;},lE0g4= Ji{z-s)G;DKlR#bѰ9Ls.#kc zfmeJBv30f[FAZ{1^1`Ɲۈ~7,G岼 -OZ?Vf4.ц7.v$,idr"F@qCvO5 0)'eZ8S& <Vߺ!V->_PMgNq=[/7|[ m]+Xb|M:C'>&*E%1@@$-;-H{")nB.Q+J$` m5JojcJ;xPokIV` +8͑b(q )a]CƐ TANE3."T T` ԅ|&ZbS0y&{jAj([!YSFm8[bkлw+}lց|&ڦ$dI>I~o%C"qn$#$tati'ֻ=(`]viXUr=^]יSSҒְ*y>LU[iޤFArl P7"d)} )dwmN>$e3:׍ӳ1 'G@2=I$Uwj", kهնݫj N }!00puٖ|q;"̭mi$joֲĪqa>b9$7&@Pv¿x 4ǥgܗg:>Y8xcZ8Vpwh1R!!RG+sP(Z:dYSjv%w'⫞h:ӽqޗO,;O^eSKIUz7P_ܴna`\X\R4@"*y:(ҹ*zø|iq08)OWT0cB`uVC) bHh)iJH&ԴC2Iq:o"uX7ND}~OS/INM9z9t`?{3(ap>F%|4D*Šk*<ƴ`-0ꅇlH&fΓ]#f)h%jc0΃^!N~hr63#2p\f< :/@LNŤB5ެ` Y<1PG5FeˑBU1|ekHy^>/L;Jʴ tw>L_՚UƁ $I:* ݖÞnR !M!M E7M?"i0YӥDNXȦX ??Ûo;&u; s+shZxO!Ɔ0Yr f]f͘>dsFa|Pnijڂ}p?NΗJ%sN6fM].LQ+w^i.b!` "@)zk,&Pܦ%8 0~m0ER̬(LLy=sa hD H`d:3"ָC,L+KEVr Ca4!gXbL.CLP[G"8D \N \-!]Ȭ?={f1 ghjȕWOKoR9#ߡw?=}~\0_ď{"{ 쵖 ed7/6Lo§Zf'έ/:?a /c(`l;d zk!I| gtcM9 ag/DABovfMX  ´bRJM鬧Sfð8.a &Q }Ay`Q E [`@(3E1c53o4V]XwS T8i70,0&͂nm0˹Psskély1Yp^L{{?S¿5\3C}zwDFwzb|QK n%a)&SLݛ?h KZNC!AEB]` c冃36͓_@2hyTF{{j(JK9iAk`TA¸=03]GV01 Ay  :/ߊgU) JgmSW( YiXE5Ao77}@%%1U#0XL(Um%>\cGB D",:?`k2XPah"5q:l>XYf%K`&[gƄTlTR$W[uex)(+O./+peQ054ĀfE$A榐@irP>:]V wt5w0b_C09){x*S@\:@6b_g7e%͂FX9B{=__ & 5Lj6γ:1b_LSJiԟEcys1c*qۘ`+ݎݎ.>/59޼>Ǎ%1Ϗn; 7 >k'/@$_\iyݍW u} L=i%A_=xIT(B*Uzn5D$C|Tڍ_jᵶ'cqL%;ڋb+B)]\ŷ_X`ň Y9@`q9~4'Kez2D<.nH^5'P v ˽ [`:9To0Қ䆙{+َdHp vai׸eѱ,bо瓐g8.4 HKqq@NS-1 Ty JZX?*  XRz_gJRs `.N[o/5B E`Zxpbq,8{1 :w2aIO7 `燅-B83lh7wn䧈~??\eߥFIJzt.IB);N^)4Nj K=`)z}uTJ.O2j}Mm<+c#p]TE%-On]r)=V1fG F0Oj`,l-KyMwbyf%ĵs~5\c*2n: &HN'V~|&Wu9d/)෿L>ݫukgi¯󧫭o":\4K3% vxeX8+L#dPJ6e~iHcsZn満4;i̖LYT4Z_,kJ 8!rNq6ڢt$ J>O^ӹ?+%ŗyNE" ,t %[k,DeϮ)}^^@ _[ozo=_i] ̐}5nLP3ԯl [=X5[WW=]Sz)uV!^W=]/vd 8T`%9]!7" Η6Ēe=\tlf)zn|L[$#5e!G@&vAgͭZ;cD0 
!4%d2(G^I1J9rM1B˓YhYOdDYH[ˑt/he7A o\iHf  tO&-l D  `"GS`OM]qS8 {nH@lF?ƿ(d7ܲ[g\bt:F_ZQtL@[Rfomeٻ6ndWXz쎄EUz=qٓT<+QZJ忟Ɛ")a83PR*e2_7@pNӴ7:r[OӾÑ."G{NQpy t '(R3V 6$*嶮H X3܌o?'2pvjf/NξОͻ>KIq^<J c^j*84T,V{J?\,R#o 4fXi&W3աcBy0^> $YhƎ(0BS b,H9 \ ONsDk(ridSdydyåe˂=ᑮ6kI浯Z}ds婘xZeAw va_JG$T9.S|4eT^d yN~9YDBޤ~{n+EҰ/eusނo őqH{l^x9/& i/i<Kџw1api;5 &Aӎ˭C nnq7gF}CӯCob˴`/C?D~|q&KΟ:G"24Ww)Uw;)av-+7)w+ ry|@ e*&Zɍ8 +5JKu8Zy)!5d~N߀޹@ PHl W |-ٓy@;IZ~saO&9d6J PXEKVDq$q:8M,/0 (@Myf1ۇp}%,n{J4hy4R=i\ V1Tq<~v.E1j"J Ɖ8*mw$'v'Vs#”r(A+0 K s̑y&Y@$唵<(5^[EOt(9exD!bK'$9'[7 +LH Iݕec\26דuogz0xMQt7aR~>d3,傲Klo`c^ZLzb)L+ $!/\DTewvcsq@iGv@<Z5hvkCB^6)k)n͕|E0Zh]_WiOVkn-ыX@u6gKxȇp=ۊyo#L3 ?ax70C0#]#fhtPf4Wu*?(gdG9 4J(quVe%"Yǒ;{5t>֡(I|C')h֠D{ǘ)4KX WW89rߚL"BYy@^hfRAJ)f f-tp) ^qD =TJlN崡t{N_(вzL۸tՎ}~tTg&H^#p|O/!ƈ9=ibp!h?'E.u'a}]"bjF޺9-l+Qg*=6)ګў!t^tor_>s<о{5bov hѕxXٽ8ˆjW{&X<سrbLe_-W`98*XUwzOҪ"x>_DtߦxGO$#[id<]0jua7&4nXvbNȇ0΂>1L)} S)Թ;ijD8)L\Hц߽u& 0 W3,c>x}3?Ss]P:tP/kD1B*3^^&zeRUZ5 j5"Ts!jjX:nhЄFIP:}3iGnf7|gQp5p\lV IqT, -a4X11}G8*jFyK9l#$Zd M*P "$n5|F]ASvi]"R /p+ hWBB)gό#1uF~\M FH,RpqP?X$V2D `1,H4`nbMX(eu}eVJE_+jgqA#Ufs.QrH=]T밁oyٶu~< ){yVp{QPeUXq%{p5ĒKtIMKuEf1-9> U)#o 1h ~ʝE: kbV"/Cp"8n+$ O$a%1aI F@H(`:@(aTL@HX4^jQ"6J1YBFHQ Y\x0*ȃJ)hH i/ӡ}MT=HD~):避Ge *??A R"ߑwo?<0)x3W)0gk_SB~~s2d:2ع#n4hKBf8ů\S0),F8njT ̉5/k"}Jo/5L2Y˪5dX. 
,>"\n2VZHe=R:aeFQsω쀌5 hDbv30P5y1\kI,}R ;rR 0%`1$w&hΛZmj뱕kH'ǀ<_:I4Cc+4Ԉt+fa"|).ؕ{mR"jnXXt= )(%B=p.)'qpVI=Wp(dJ9`\~46NZḏ` cb*=(b1Q<Ȥ9O~̦;gK\wY0nޣ/Jz*` ޳`vFz]iz-1qSAZt`1%%E٤ RkiG@)BГޗChg[S0lS #ݞ%}\V idOi Mmn:o" o30"3N'8qkm]erkpQ_&ln%RAh)R?uǿAIS?ȑbK'9mT(Pۼt%l|8.&)gaxA3$|q-hϰVQ;2-ݯ8-&tu!=nNpgpϰkF{ *FTku6<7?=מkzx[dq,+U[4z\{& v Gypt`M8aSԄZqh5"KE)yڄN۳s݇ٽ=/oFQqmHe-ߛg׆=/;Cv NQ10/e;n>nfw CUmИJŽ+9P >da1"Da(iLTb:x-Ɇض:h1$ڊy4ri&UTh!\B9b"T)6GlL%5"[Oӌ(nzJQAЖhAT(*q=Bّ^aqa{QcQ/ҋ.ԨI0o"dRm:AJ-dd?odL2I%-[=V"'-%r8%Q%2 )I8=1]mjμ4{{ZkGvNmCUE i).۠IuչR4wwLM.XCf8|N`7C2[p4Wnz>(_A;U:XB C&3n?-x"8#> V'J#V0M++z !){eEOXQZ[ՂKzyrwy 傅"t2wp9*6oSnO`)llqG jy*Id-o_:w+*i**ʢ~dw *ĤR׵Q\c=0OE\#f `[`uwVVV~K(EtDZ]XM!d$='ۢV3Z)_ז턉F@,rQch;"baoP B//, 6+5Ȩ$yFr`BAG%mcEK Ι999EJN&&j'WYIѫz._cֵ_uN[ 6n$2>,jceYYJLLO7% .Y2":]s8WTEGa*^mdm#[IN&5lQiIюS lRꌋ2ha pR0vCxzt(Hy>ဝ貽_+&$cң9u8u:ؕс1ӣRc sEM+zCi2OBKgV9!P0J#4'a129`.R uHLbk&^AY|Two~p!(S dW#41d5H4_UN沾t@%n*e͟y ,MvL؇D .䬔f4;㩨|ޥf| G7w+ߨd}ʬ#Juҳ%? +5:o\DSdJ0zMv9;MawK DtR:lu!!߸)UيLRPo'!k0e߽ߧ..tՍU}vgc?9Tk,U._|+Wث{/ɻ~^>Y?eKV5;vIAFJs_iAv䟲jk|EV"\saF2*I2;CO--XxSLƛ.f«@UaJjxgQ9NkR[Gt7sUgrNW8Xl<`{O7v |bn ?'|$1!N?/ۃÅcٞ‹cha $NO?nЫ9D.<昂..HޤnPg0Ʈn<[6)$Ka9%OcW?^-#mk$FsJ3"42"؄/fU&>|.p;Q*XPp I YO{P,3AKIm=uA)!0|BO6&mDkό}6v7~}\qǕ_Wf*Q%y5Ų 5DM Ч1̉*DL~XF|efa G9Оeoq,O~y|X>(l`8`ׁ{` + \qOQA w˩ĂhDZì , U5ͦ\\z'sWK`2X# %Ʒe<j(@ YX_+E"6.ƄF*'Q%Qۚj3kAHCV72@{B}x;9x"`ݜ_*݂nxǨ!'7ȗwYcec(Lj4bD!9!R6\`n( uZzxvC1aGE#1pi)ii[OFW"8 Cu\ݣc ǺDp;+rd/֐/V Iν&-a B YXFd,9VH X#Rqூss.5=nt>S@%tS1)Vz;tv𠌞g5Q0*ͅN()#OLw4Z7p$Pto`,IMRxvN)pBQULOBS+\j VG+*W-; H 3:B4/w)̐HAa53OEL%q㺆=$S e=R9NAxwMM: =׍~\t7+ɖƈ X,yY+LF3)-̐aX^G4PUBlN`L0(Ds;̈́!?  ѡ-3(sDƆnZ6JD.!n}cQÞa",T{IzA 1[5Q|U WdV==GΗ+ `XjqD;٫(IUse"s1W/!-֑ʧBT=]m.ޙRz7Ǩ-'S{p#{jsZ]grKt{wɇ5~l!#t1yzA|铇ˏU! )ZS !9Kfe!@V%ܾ; &#P@l>CԲIjpp7U )uXN 9R )~kBXmk5)@n?sX($ 0']DZUXH  N00PTBn)Np J1LH4#kvx"@*rkTBDFʇݚWCbW ϼ)ӓnk9lkSoQI D@ʖ齍Ygb35ܟOxw1)ffw [')V?3W͟wcC?{XG?A*WAq}8 I } Lu5 )o0o^&(c$DERƄ#H9$O%CvC\ ! 
* @N ˡ,:Y^.vKJ+H& W'" +ܹc'V\1g(X{js3s4F*g=,.AzNrs&9[\:EDs1ΟLY r \ v"0D+xd^q4)8j!D6ba'LNaGQnP.فaQϹ훞u=+6XKuYdfr\->;IZV0ֿ>T."H`'1H%xW^M-k_\V~+55M.(>@z|'S|>믿$.vyaZ,6( &ĢfAbg~4BN h)n2Z}%6kxShVnB^w|+4D ʣv8Gل&d{ ׉=8O0m2AvؒI'( StĔSsЩxs%D .hjK8 P-Խ;a7ISg49FNçB CA;wV7_=r=L {,gCNݙV= `PbIːu21tɑI[3`H0"[b8C*MIj_C^sx/adۈ{#GoVN,6;g\o6KK_YU _Y\W3÷x79b/ݐߙ_ Eʢn_ɝrc7;j}×@D1z9͊&Tg@PqDKt&ޔǙ`fBqF?2Bm< 2GPX=Z~KB;emB(E3 gj x soNlȔs.Hq[ghi8 S@:΋2{.LoU3vdSDy*p5'*L)(?s(|-֔9>L`*%>joX&(Qѷ/8coٗꟚ:nT^ύ^n}-0`L|[{` '36_>S,'O l9*W[{-8[^_&h%?ز X dD(g:\ (IȥÑ ^;r19kx} 5pɺkMRZ67Q}t}$XSUmιF q&:/Ӷhgeq|Z-`^3;EAt!e2#HL`H2F Dt;+:$R$+m V&'HIݲМHZj$cRr2o \#|oLjO~=*͌!(,ndY[$eʀBF ƑBha}DQ>CHC&=)Lُk1}|7qF(q`>: NNN(#)#78ݥAJNOF"{k?}w4@ٞjbEw\gFt x@ȐW!? AX2e>zɘAfKˡG 0`lk2<)Ql7kBQQh .rx |Z*LH#[poFT>za%=ɶ+F\?hR_;k 3cgq0}* mY&ɬ3иGo@|8Ҩٻ6rlWy$=IQ +ĖZ ?$KTicZj..r%/OB7ַ<.ksKXڜ>VֶSlf+E"8(G#v"uLŤrxP|yaSl-0U _:W fTT%\QXjgSPssy{#1p+{a7rÃA+Nq~R"4x.o~W琡[Y-IœnYg厏g?S;}+!OnLr-wIIxAD;5[rL66gKPd^@(ԶW _U[Ppn9o+'($Om1*$eA=6`l`ʆ?۶{4HDpkh`?S{`@^EPW٘4SmgAoY;?|>zLKǠXd 3bd1Ҍ1B0!)VB%8J)S"lv^*w'[Mi6VW]c(c|z*+ͳNk-eR"FH9(KTH"T  TsI; QJ̅J  Q= EOt`6XD en٩ asζCӶ\O'AIo>}?լ~I`֥!PRbՐI~>󋊣Umjn 4@BD(عhP #ࢳ+NsqbƉu`!lT(JiDpR#+t`f.`Ȓ|rCȅS V? NJ&g߹M?<FNC T1ڽ2Tr4,b,P/`L8*4:ZѼg˪>t]`Z4霄\B iY ([{e+y"3U4h.l9ڴp\7wq߭_}gJ%jh<W|2 Id5"(SImnjckٚO_ϙ.#x.a sANm$(Lj:us6Nk¤: 9{85FS.7Y9vH:!|0-Fx9-,׋vZ t@oҧ6DÖח;)6,ڦ[[wui ˧o.uzNAԒ1YE+di W:5m= 5m3mR9A58PZ >Z\9..8 ׸U/<\00"( &YHOMgӵy،0ʁh+H7T ]6 kP| j}Du BE8P>BhS"e嚃E()(R Jy= pm-~޼±!`3lB(/D @䷶ذ@Z/3 IMn/ڇ (1 C5fD0ZIh &U* {C)":YD 86 y᩻`6 gf3 uv>GI^`_W(nh{i/J#5̦b23g,~0w2Y>Y#|# wKh{??. h}-!:ؘL:;:.˹W_\Zk}e}Nr׀n͔\'$KLrQ5t5$fIi8[|p#rRቚ-E};bm;W !9G hȖܥQa! 8262T~waj]>uUB@EwD}sN#IvY'Zܤ 2 ]Ё/7NHt[ 6@i#x8F *$QQ"rJH֑wtZQDe`ry'NNt YVr~ayQg8uABa+[4 J{:;8l?FlP%W>ۇۡs|5o> >nw6:}Z4A>{/}pI܃-}*$`yhï%myˊ{℈~]=7BuI28!rq*@mԙYjD~YS iۢ>}M-d0u޿-C`V<\w P9T%9  ו/:JPYk @~fF*ݭ@K"ׯ=ϟC惛'CWWTF$ I=Lg*|=Ti8FO>rly H zB ϴω{"C1Lc !ai:c† S_&ShYˇnC@oӯ|c˘`@ },v$t`QFH4p YbXd]! 
~بhkH>UeT_^AރKݷQ6˙nC{9\`=epك}ll4 cSI 8D""Y$&*2G'HpSS5r_?^ \eGaWL czdUQfz+%Vs<%̨~Tir:̦X\?8HG51Ls4{wWߙ.$|~wqb[@伡YqfrjPfb`|<^>1GD&$%X)e #8bLh,=$S0gz$%ũ}}ݘqI![唰ΫԷZ8MhxT}8,!խT qe E(NaDxzVe<)'KA`׊ySnr7^>ƞ vOj,0&ㄦ8Nf $%zc330%,OIi#Ngك8BFn9Nw8%źCB;5)#)H͘^Ab#K*^[KҘrr1<(A;i7*VT Wݬ(o26cb,ʞD_!bHhFҳ/ҮuDJ #IRbp1 H7 ȇlh; y~`9}0%!baRY߭[f ]C%~8Ï^>Va*-z7#^g~H9i ڱޟo/o_C5=?Ww⇵?;5V?/c1m$!b oJqK ^oV;xCX¨Ppaƶ|.{ GㅚiNL{z lPO|u?5l>t&]xr4?H5!^n|ds35,^;ǛϋDV7\*z\|6J(z,`dBwASf7I6'{}Nl J}޼qfy ;E6c !Q2$C)c54Tg{?ymj|\4537uPS1 DMU|XpJ1(1 Ƿ%8^I-N㸘)g&eԷhLF\0DOzR9ghpG.0Ru(@)O8GB\\'YADX :q2AQYLXAʘ̄JSm{7@ފDP9g' t(~a¤K+WwB Nb~$Kꔄ@@ܲ;8Av. q:P48Ua21Cs&R(&iϾ0MbW$nԩYheo;s ّuAzC5SM^D]BA;1qljwBrhC1 _^3w3L{̸5׹k>`0)~1?;-:[@+ͯs6|вjk\ЎYOP9rdmV/ *;&u"Ti{(H Pny6[x')K8_X̅s.ָ`Lsj r& <:~  9:5c䳷{8!}i#܌p !è69e*-*ĹeCR'p!i+AEߢG6NE{}NyG{!W  |fo&_LJ] jT-jzJVunvh13ns3P`iEы\;g"iЊIvG;H:=eгwȈ$PF.7Ù~q*)U(D/עVu-K^*R[Z_Avs5,f.b#E`Cg`b zL tV!8`֡JZ&,ē-&I%BaY WP\xDZVҜ-TzJb=fCu0+ udCd"E^% -0cPݚ*18FmnTN"ăzjx;AmRdM:^rs rr6H9b$T/C)*zkjza^hhڗ,yKR 6X;\2 q}PVH`R$ 8L&Iրp O.u#VK6AfXq#n(ʺȷ;aSvH<~@G`zO)sh R72]8'Sƈ9jRw7}L,/WܙV׵Y'˺vfnxԘ2W5۩\vQ&Fx91i8&6$ XEF I;A"ʝLcaě(Y-Gx?&;ڭV/VK߰_% v0a74m r?K#Oyp+؈m0YŽ[~&bky~@mV>\?rNyQx?I(| Τ'%o&eOڒ|"$SRC: t jPT1hK"yk<Ѫڭ EL֮R79-gU8}W w&D Er *aZbe̮ c->rr,nMvˇ{37Yih8X | @篊r7>K~O6z(9^Oo瞩]-_ ̯z"{&E1C(0""x\xjk`/"?/VhPJ؁Nx 3a< Vp\MFvG4ʗKfnGT^yg\>̭_U|,s"n"W97q&f柂7p;Nll6f9Cxli4sn\M?kƿCuJw -2\MnǷr>'dÑʫu} ( g k2^|gZO~d>S8^ڽ7Ƶ&"4cübvaqpuZ Fat:SҖS^@X`s֌#؞fM ݡbPO*|dsim;\~ 艊 "PQKps%$q^RFDG>)-]4iW3D-C4mgkwR 㳕r(%?/uzZѩ>%̨Vo4&j4fXmIB_jAJ[OWb'cl H?FY,lHGY;Wx:CeF j5ڜV\nnZʽ[K[G.gXOgVUγǬeƳjm^|zw7n罡&BG1C85BZQc s̘E,$Jx쒸Ku[8Ō':kc8allb)BxHD9O6'quX)\[7cu'̿/B]vGUtap=@; {i7^xvuM^0T$2X$:,r(qsC,F Iw1pr s^]QK[RUv E\1 kX@#9O:bNE\ND-#g%w8t$}B bl YcA+J;%{/p-QVv#0 &%2t12(l3j%" 1b"#GuP;y֝ l'T0Nv.unތyi- ;s[Z{wXov:ү {qim ̀Gh+7ZH~z/<&1qc(%` "E]h?!L%f!l\Vczc˴:(̅Aǥ:{*=9T-V}bD*-48QZO%j(X|P@x0CU$ IK\z9uuU74e~ 6Ղp솰Y*=@KIg,1`9H|BpPY^1/oq_6nnl-g+_=8v~v:Ͼ?2F{0x2-߆HtV(rOb9}JD+ްQ^} c gBp 8!\;c8׌g/P ߀mEMwYQ9VN0TK_lL5ffhKz`h5cK l`.HX* 
ds4VDH¹#|uJPͩƂnjIGw18GܳX{7pϊBhRRmq)!{6oKd}M0m{/rb\O0f1Ƶܳ-{'%fHfҎ: `Yk'6az,9ձ/~QvU6T|}_`N h᧜bnZ\DdJ[ڍ _P脎QG]$[w#y+j6$;"HB>nB脎QG]jћv !TQ!!߹6) 6?4|k 5Ҋ 7LdJ~"NvA־?LEɩjy/|Hܽ͠dV׳it_{Urm_o.LU<#5AOu%ʾb TcվӵmcY&)!x˶֙im߻ZCg*)Ɲuhݜn.Ђy,;U2>&B pm0u5SlL]!-B*FF(wxOs Ub1 '('zn7t81r.uS;]d1eFX"ѫ)shx0Nty.u8GR}sm7|MIN_k4MZ.sESB&ɷx(d\ߒߙM%ӪR{7EAv.@|g=/mŴVj: +&8tnEG޾@>M emTlru0E6¬IsW藆1>N%A?HU} D^םăGrVt_^qN'{a|O]fQt(&5ޥ*v ViIٌ133w ESf`~פ[0 GCbVv~Z).C Kj|b&MkYjј$@ G6R OPRM[9m"]{ַ,?[ƨEd޹U±$GM jŋ[DRG>U9U)tMH9&Ϫ%A';GO`t6IJrhie~Y@{_;=V!?{VrEH )Qނ v諭ز<'$eR$JJf̌s]]U_ɡW5"DG~jsDW$XI:Asl\ Pђjۭ7$ERpDQ$.8O>?~RFq$`UJx#bZHJ Q)֦(h(2M" A! /'cXyk3N( ]1 \2(}ZsYjYCe,b~>XQepe) n{iѠQ3Ȕ3&hG&=Զ[X)tmSƘbh`4^Kc; `;s9M"sD%A^yw HIX3xڲ^H wu5ʾR4GNq|00# KgG$ "zMAQ)YgZgaPEZj- 31朡f|"^e2- U[uW+\3:b\JPRϥ&Z?z|rU$L1T}.^W@tU}.ﳅFLjn0Z L``l]Ǥ0inXzRG uCFUEJZOph3Pn.n)N#FE7]/={%ThRc6J<'s0ADdBp "H [ U}73WH$pZ(jM o1uP;fT#,8vČR^##vY93uDYhK qZk5&I}>]lF,콠i1o̯@f哔f>͇%n 9?۬VMxȪ'ѝ1.GJQd&_dҢ?~II 4UiIsX v_Qf,E8-i#lLQ>0PySȏZ* Xor5"h=6t)Zh;^;>gqj9ڌM=G%2s\-WD ^A)V2ԍof|gPfZ.`c$J>/7Χ sB~(-sZ $S(65 X\~9 yê;sPQ@Wӭ_ofv{L$0ۻVRy̱J.z0Vx4HM# |y;E9lk|嗏}'=1|/m4̰F_2v9;\oLB5sۙ.9^<΄Q;7O1WxȤ<'Fڤs[vA*Gb';騛#g"Om/Bo6`>cU M2*x*rE@] +$ []̆=hֳy^]Ǯez]-6aE\uϾ<q-3=o/;O{{5ysq;dO >ߊR7JpJ^*JO܇HoHz]AR*/.l.*i'Gi7))ʪ#5%FF!0X]Kl3bJE5`ǚrh[Q 3\^:/ID/4R?@ AkDIrR'ݩR߂1k15`Cݞ F Q$3yo#iUA;ZZY) )l&K}b1%zYBM>60 wZcltqj d9N8)=0~ђɰI0(ENz4hxTr$.a'+ d] s2pi ܴg7]>=1i+2ؑ SMM_hȏBG4p-,-"*"+aV^j{%{[HItļq֎a^@Ni-îOUߝjő:Rݻ%;Րw{8tiS zuXur)?ݩ6MKy /RG/7FL 8&`jh;[q>$g[gfqxP岌u MMu-p~t3%w@W5mN_E&n[+h xc=!?n)n}ePc:}ƻ.:E!&nVz!,'7 6%,FWB,OˇՃg%wԲtӕX}Uowg TL*ƍjPdu<FjHIlLFϣ 4o onjbV=D,]`- VU)D5/  @kE]D"yt+݅X]|zUPW%U,p^=W!?biJ<{)58w"{sBƟc;g%0 3tLUIq5\f#`]O:I<9&΀)EzfOUst ;^y#ŜL~3`~/7hO;ּ<FT.QRVMQqqmҗ/C!TU%@8YKRdH(*}),ws/Jò\v̙` ^sRфZeڡ\rI'pl3փjdFt l+UT-RVV*}jl;YKR:+X qܤ\+Up\0MPl׏S6P{A-gQv|}wm?K?ُS}|M%_O&o t^*xzc 1E-@r,[rtƧzwzZؕ`E%]Kujd#џԊE+JY0}X$8"+C;8`L_= m Z-0ym\70ƒMxTDʥ3fǥ􌤓,3v|9i6ʸk>% Ƥ hIGtTǞwM'݀0 ?2 $!e(6:'3%R$掃Α 'mul&sޙj  >q򎼖P4)'TƬb2*!!8fS=$aq7LԬ) : ytD, PM"O,p- ۇ-Y L\(r'e9 "s9dGI 볭s(` 
B۳hc7ޛܴ#U)D58C},(?]& = s۳5=\1Dg;&#ݞ.ݞc)!_Î3ɥ Ѹݞ}lx<)Ts%3SoO¸K@sXI0@NמK96LEfƨ8#L3ݞ5o홙╁ |D8ngܤZ4kSNH/ITQj$2uV*JTםE[uV*iQXuVxc$J_Be^Ӎвϕ6F s6S" / _G-]魵i!vaJnloٍT' IKÐ]- w4նSA^"xezgB+OGG1SGGwmqH~YΞ^H) X $yفXȺ8,odI[D<_]YU$u<V# Mw?06ϙ 4ڮG+W*qpt|,a8cr|rstvvg+ު}Έ?Xp}&W}LIGgC^~r;QUq_;\ <<|lquR^_2wQIN)h\2G \ZUw5yܯ¶u&YIq]kMrmZ cI48!Ƌݻ4ړ4_\D'%G[XD5J^,'Ӥ%ٳ^O\^rzKRZhӬe S%`=wPLO0p-*bSׯ^mkӲ067L-]j)2mN$<9=GGV}%Ss'Id`%ҽ:8g8zxVuI~`ΗKnG~|clrV@Sj=F HW=ClFRU1_{e^C +A =F?한 ;YPYꯪFWO?/s0VNzatVs vҠbNC2PbX?2[O mgtA M0P9E;m5`y7-# :䆚pZh ښm,>& ķ B .rȨ8hO<-S<8rw7-# EqP,ɁU AdDĿ w2B99 5'c>A{m_H'{.7xXTcRӨ)VX^Rx X $TC*3Iѽ//y]G7mZ6NϺNd<=$cYs6!`'E'*malLpRZv,o_NE #ޔ6 Ian*8yvyN(T@}hRb?Uɉ{۸m}YS^ f|gf?%iZpK[>jƖh P6TNw+MƟ3#aD'gs@,hv 5NI _>}0ZI>(V]N>A5X>7&p`F$sM40w,6ۍ.q>nNfqA6T/2MFht{n"d3qF>Fr1f?_%hD=]Zȯo31KKM!L,VqM&%3>ѡR"g\.rILٖIT"/jۖHBEHEAI0^;uIW;VNUPp yP&"Uhϲ,Hla=6dvGAVYwTdf2A\Fcޢ˒#rďg0qjzr`.F L)TK1jfɔц$Դy(ڀRvǝdIWK-zG Xt񒒵uq&JQeZ6z֔hjֻQ:5"$ 1c$bcVm2(ɚR&g_ i[fB)cD2`.CEd{̹1kIxdGge 58uaSYZa-WZxzHF p9X v>${cyc%#'JiCPY/kgX) H @5ivy$<L "<{r<}JM1"ڱчS^ia1Z -@ZMdg9"ˤ8%ч(LAC$Ma]׎L.Lk9GyCqJ1e,SY>DIQCHB +(n,\OQ8~FbQrvz-|$,M5sPglRx6`∅P`g&VAj9Zbcdnsx.8sQʘiϑ $$VQHQwGMC;`+0@6*yjqdMb f_5GӉ͈yɖ]캞[rAXVyI֦[c-a˦[gF)W{SY<+65VNc[3<#Ͷ[dl[غkEӭKXwF FS$$އA$+Ϡ ԉ !3= -#lZl'XXVc07U쯫|`_]/JFǩݘq:_Ri;Sw/;M#M5uUΗΨb#)u{Tv>/}^ۊ8ٌKbgS7839iөQ^ [Z>۰pbw24nzOˏ`Em)qYnx.wI3ŏٟ|˯K{`"s|z9^oVn{_\ı w˫gpտFaotO.Qxo/󅭤Ag{{NGs99z{7QDEZAW7Pa0Y=FؔH;n׸yv/^hl[[Z]즶|viv,PZ^n1h^CN# B=.\-[@s;?UXmXmuSc.{wnR{>chWO_m!E>pk>)Fid'6իՊܾr2bzl΃.0]۸B{Ue4 ۮ*%s͜L =O#42Ӽ雷۲Ձlf79YvƪW]ƚI)k|rsd[yw$ GV;ش_K?3^''5W䖛k;d)9R0zMB|1\moXX2eeCx=߁[[W$lo45d7A=*͘>^UFhG<3'g 9=_Gx[ثnr 3`'G:Neim uOe}ŽC^~W[W>~yz~#kr]Isd6:6)N%:Z0OL*-o3ݔpY?,MmZ Jee&aox[ti- *P!:.I/qUڛp_ctѨL_= #Ɇ>^WIG`/S# 鱥t| gȲ_BR?#ۡO=2ǩ§AoPa XWJqg=؂)(\7Z?Ʃ-Nc!ӳ\Xg?ߗFjoQ!WFVz7$lT&g PdD)%Wz~EZrLA]*9?=Bs {v~ROBmr6}t>1U<-#doԧa[UԳQT@ ޳q$W}9 q> ^u0z!3(-I_5IY#>Ğ!eHԫQ1xeUA_@2Rwd/sä=> KaV+%r'paK*b%rQ;2-a/#px݁zP__/`TJ ` b r ƹE`]k j]1Fp `#p@xb<DŽ. |2ruF{tC5e_DʫҔ1y[r# 9j)1#/VU:W>;G9PB%.Qҳ:\lgu 28xG/o=K!.Dhv0c$tp)D^ {,y SN|:C͂'Bb(T(֧P! 
Mar 18 12:09:25 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 18 12:09:25 crc restorecon[4692]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 12:09:25 crc
restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh
not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 12:09:25 crc 
restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:25 crc 
restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 12:09:25 crc restorecon[4692]:
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:25 crc 
restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 
12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc 
restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:25 crc restorecon[4692]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 
crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc 
restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:25 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc 
restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:26 crc restorecon[4692]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc 
restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:09:26 crc restorecon[4692]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:09:26 crc restorecon[4692]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 18 12:09:26 crc kubenswrapper[4843]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 12:09:26 crc kubenswrapper[4843]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 18 12:09:26 crc kubenswrapper[4843]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 12:09:26 crc kubenswrapper[4843]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 18 12:09:26 crc kubenswrapper[4843]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 18 12:09:26 crc kubenswrapper[4843]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.720582 4843 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727272 4843 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727299 4843 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727306 4843 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727313 4843 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727319 4843 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727324 4843 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727330 4843 feature_gate.go:330] unrecognized feature gate: Example Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727335 4843 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727343 4843 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727349 4843 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727356 4843 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727363 4843 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727369 4843 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727374 4843 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727379 4843 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727392 4843 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727397 4843 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727403 4843 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727408 4843 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727413 4843 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727418 4843 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727423 4843 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727428 4843 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 12:09:26 crc kubenswrapper[4843]: 
W0318 12:09:26.727433 4843 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727440 4843 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727447 4843 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727454 4843 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727460 4843 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727466 4843 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727471 4843 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727477 4843 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727482 4843 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727487 4843 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727492 4843 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727497 4843 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727502 4843 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727507 4843 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727512 4843 
feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727517 4843 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727523 4843 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727528 4843 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727533 4843 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727540 4843 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727545 4843 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727550 4843 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727558 4843 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727564 4843 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727570 4843 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727576 4843 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727581 4843 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727586 4843 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727591 4843 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727596 4843 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727602 4843 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727607 4843 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727612 4843 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727618 4843 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727623 4843 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727628 4843 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727633 4843 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727638 4843 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727644 4843 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727653 4843 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727658 4843 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727679 4843 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727684 4843 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727691 4843 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727697 4843 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727702 4843 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727708 4843 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.727713 4843 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727819 4843 flags.go:64] FLAG: --address="0.0.0.0"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727830 4843 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727839 4843 flags.go:64] FLAG: --anonymous-auth="true"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727848 4843 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727857 4843 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727863 4843 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727872 4843 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727879 4843 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727886 4843 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727892 4843 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727898 4843 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727905 4843 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727911 4843 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727919 4843 flags.go:64] FLAG: --cgroup-root=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727925 4843 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727931 4843 flags.go:64] FLAG: --client-ca-file=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727937 4843 flags.go:64] FLAG: --cloud-config=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727943 4843 flags.go:64] FLAG: --cloud-provider=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727949 4843 flags.go:64] FLAG: --cluster-dns="[]"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727957 4843 flags.go:64] FLAG: --cluster-domain=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727963 4843 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727969 4843 flags.go:64] FLAG: --config-dir=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727975 4843 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727981 4843 flags.go:64] FLAG: --container-log-max-files="5"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727989 4843 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.727995 4843 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728002 4843 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728008 4843 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728014 4843 flags.go:64] FLAG: --contention-profiling="false"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728020 4843 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728026 4843 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728032 4843 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728043 4843 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728051 4843 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728057 4843 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728063 4843 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728069 4843 flags.go:64] FLAG: --enable-load-reader="false"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728075 4843 flags.go:64] FLAG: --enable-server="true"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728081 4843 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728089 4843 flags.go:64] FLAG: --event-burst="100"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728095 4843 flags.go:64] FLAG: --event-qps="50"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728101 4843 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728107 4843 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728113 4843 flags.go:64] FLAG: --eviction-hard=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728121 4843 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728127 4843 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728133 4843 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728139 4843 flags.go:64] FLAG: --eviction-soft=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728145 4843 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728151 4843 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728157 4843 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728163 4843 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728169 4843 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728175 4843 flags.go:64] FLAG: --fail-swap-on="true"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728181 4843 flags.go:64] FLAG: --feature-gates=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728188 4843 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728194 4843 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728201 4843 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728207 4843 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728213 4843 flags.go:64] FLAG: --healthz-port="10248"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728219 4843 flags.go:64] FLAG: --help="false"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728225 4843 flags.go:64] FLAG: --hostname-override=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728231 4843 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728237 4843 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728243 4843 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728250 4843 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728255 4843 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728261 4843 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728268 4843 flags.go:64] FLAG: --image-service-endpoint=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728274 4843 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728280 4843 flags.go:64] FLAG: --kube-api-burst="100"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728286 4843 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728293 4843 flags.go:64] FLAG: --kube-api-qps="50"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728299 4843 flags.go:64] FLAG: --kube-reserved=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728305 4843 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728310 4843 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728317 4843 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728322 4843 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728328 4843 flags.go:64] FLAG: --lock-file=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728334 4843 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728340 4843 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728347 4843 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728356 4843 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728361 4843 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728368 4843 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728374 4843 flags.go:64] FLAG: --logging-format="text"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728380 4843 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728386 4843 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728393 4843 flags.go:64] FLAG: --manifest-url=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728400 4843 flags.go:64] FLAG: --manifest-url-header=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728407 4843 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728414 4843 flags.go:64] FLAG: --max-open-files="1000000"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728422 4843 flags.go:64] FLAG: --max-pods="110"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728428 4843 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728434 4843 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728440 4843 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728446 4843 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728452 4843 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728458 4843 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728464 4843 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728478 4843 flags.go:64] FLAG: --node-status-max-images="50"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728484 4843 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728491 4843 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728497 4843 flags.go:64] FLAG: --pod-cidr=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728509 4843 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728518 4843 flags.go:64] FLAG: --pod-manifest-path=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728525 4843 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728531 4843 flags.go:64] FLAG: --pods-per-core="0"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728537 4843 flags.go:64] FLAG: --port="10250"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728543 4843 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728548 4843 flags.go:64] FLAG: --provider-id=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728554 4843 flags.go:64] FLAG: --qos-reserved=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728561 4843 flags.go:64] FLAG: --read-only-port="10255"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728567 4843 flags.go:64] FLAG: --register-node="true"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728574 4843 flags.go:64] FLAG: --register-schedulable="true"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728580 4843 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728595 4843 flags.go:64] FLAG: --registry-burst="10"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728602 4843 flags.go:64] FLAG: --registry-qps="5"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728608 4843 flags.go:64] FLAG: --reserved-cpus=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728615 4843 flags.go:64] FLAG: --reserved-memory=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728622 4843 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728628 4843 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728635 4843 flags.go:64] FLAG: --rotate-certificates="false"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728641 4843 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728647 4843 flags.go:64] FLAG: --runonce="false"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728658 4843 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728682 4843 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728690 4843 flags.go:64] FLAG: --seccomp-default="false"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728696 4843 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728702 4843 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728708 4843 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728714 4843 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728720 4843 flags.go:64] FLAG: --storage-driver-password="root"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728726 4843 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728733 4843 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728739 4843 flags.go:64] FLAG: --storage-driver-user="root"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728745 4843 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728751 4843 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728757 4843 flags.go:64] FLAG: --system-cgroups=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728763 4843 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728774 4843 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728780 4843 flags.go:64] FLAG: --tls-cert-file=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728785 4843 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728793 4843 flags.go:64] FLAG: --tls-min-version=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728799 4843 flags.go:64] FLAG: --tls-private-key-file=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728805 4843 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728811 4843 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728818 4843 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728825 4843 flags.go:64] FLAG: --v="2"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728833 4843 flags.go:64] FLAG: --version="false"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728840 4843 flags.go:64] FLAG: --vmodule=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728847 4843 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.728854 4843 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729016 4843 feature_gate.go:330] unrecognized feature gate: Example
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729024 4843 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729030 4843 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729036 4843 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729042 4843 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729047 4843 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729054 4843 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729059 4843 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729065 4843 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729072 4843 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729079 4843 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729085 4843 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729091 4843 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729096 4843 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729102 4843 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729108 4843 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729114 4843 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729120 4843 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729125 4843 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729131 4843 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729136 4843 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729142 4843 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729147 4843 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729153 4843 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729158 4843 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729164 4843 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729171 4843 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729177 4843 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729184 4843 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729190 4843 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729195 4843 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729205 4843 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729211 4843 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729216 4843 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729223 4843 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729229 4843 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729235 4843 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729241 4843 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729246 4843 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729251 4843 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729256 4843 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729261 4843 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729267 4843 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729272 4843 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729277 4843 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729283 4843 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729289 4843 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729295 4843 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729301 4843 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729306 4843 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729311 4843 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729316 4843 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729321 4843 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729326 4843 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729331 4843 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729336 4843 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729341 4843 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729347 4843 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729353 4843 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729358 4843 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729363 4843 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729368 4843 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729374 4843 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729381 4843 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729386 4843 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729391 4843 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729396 4843 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729401 4843 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729407 4843 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729412 4843 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.729417 4843 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.729425 4843 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.738786 4843 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.738813 4843 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.738887 4843 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.738896 4843 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.738903 4843 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.738908 4843 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.738914 4843 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.738919 4843 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.738924 4843 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.738929 4843 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.738935 4843 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.738940 4843 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.738946 4843 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.738953 4843 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.738961 4843 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.738969 4843 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.738976 4843 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.738983 4843 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.738990 4843 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.738997 4843 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739004 4843 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739010 4843 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739017 4843 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739023 4843 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739030 4843 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739037 4843 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739043 4843 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739048 4843 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739054 4843 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739060 4843 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739067 4843 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739074 4843 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739083 4843 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739090 4843 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739098 4843 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739105 4843 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739114 4843 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739120 4843 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739125 4843 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739131 4843 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739136 4843 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739141 4843 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739146 4843 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739152 4843 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739157 4843 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739162 4843 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739168 4843 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739173 4843 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739178 4843 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739183 4843 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739188 4843 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739193 4843 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739198 4843 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739203 4843 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739208 4843 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739213 4843 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739218 4843 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 
12:09:26.739223 4843 feature_gate.go:330] unrecognized feature gate: Example Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739229 4843 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739235 4843 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739242 4843 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739248 4843 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739255 4843 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739262 4843 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739268 4843 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739273 4843 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739279 4843 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739285 4843 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739290 4843 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739295 4843 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739301 4843 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 
12:09:26.739308 4843 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739315 4843 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.739331 4843 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739470 4843 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739479 4843 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739484 4843 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739490 4843 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739495 4843 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739500 4843 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739505 4843 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739511 4843 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739516 4843 
feature_gate.go:330] unrecognized feature gate: Example Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739521 4843 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739527 4843 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739532 4843 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739537 4843 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739545 4843 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739553 4843 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739559 4843 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739564 4843 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739570 4843 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739576 4843 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739581 4843 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739586 4843 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739592 4843 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739597 
4843 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739602 4843 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739607 4843 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739612 4843 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739617 4843 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739623 4843 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739628 4843 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739633 4843 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739638 4843 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739645 4843 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739655 4843 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739662 4843 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739685 4843 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739691 4843 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739696 4843 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739701 4843 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739706 4843 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739711 4843 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739717 4843 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739722 4843 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739728 4843 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739733 4843 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739738 4843 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739743 4843 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739748 
4843 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739753 4843 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739758 4843 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739764 4843 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739769 4843 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739774 4843 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739780 4843 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739784 4843 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739790 4843 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739795 4843 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739802 4843 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739808 4843 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739816 4843 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739825 4843 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739833 4843 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739840 4843 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739847 4843 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739853 4843 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739862 4843 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739870 4843 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739877 4843 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739882 4843 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739887 4843 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739892 4843 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.739899 4843 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.739907 4843 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false 
RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.740096 4843 server.go:940] "Client rotation is on, will bootstrap in background" Mar 18 12:09:26 crc kubenswrapper[4843]: E0318 12:09:26.744643 4843 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.747993 4843 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.748121 4843 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.750144 4843 server.go:997] "Starting client certificate rotation" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.750179 4843 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.750397 4843 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.775062 4843 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 12:09:26 crc kubenswrapper[4843]: E0318 12:09:26.779956 4843 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.205:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.782922 4843 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.803919 4843 log.go:25] "Validated CRI v1 runtime API" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.846864 4843 log.go:25] "Validated CRI v1 image API" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.849768 4843 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.856109 4843 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-18-12-04-19-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.856327 4843 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.882443 4843 manager.go:217] Machine: {Timestamp:2026-03-18 12:09:26.878160019 +0000 UTC m=+0.593985634 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} 
HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e BootID:0efd8cb9-5707-44e3-a74f-91b5a38b13a0 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:1f:17:1d Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:1f:17:1d Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:f7:a1:33 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:78:e6:51 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:47:7d:d6 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:1b:6d:9a Speed:-1 Mtu:1496} {Name:eth10 MacAddress:1e:12:2c:98:53:0b Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:8a:92:10:b8:2a:fe Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 
BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} 
{Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.883500 4843 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.883874 4843 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.884643 4843 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.885159 4843 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.885345 4843 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.886077 4843 topology_manager.go:138] "Creating topology manager with none policy" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.886222 4843 container_manager_linux.go:303] "Creating device plugin manager" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.887090 4843 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.888573 4843 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.889645 4843 state_mem.go:36] "Initialized new in-memory state store" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.890084 4843 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.898786 4843 kubelet.go:418] "Attempting to sync node with API server" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.899137 4843 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.899356 4843 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.899556 4843 kubelet.go:324] "Adding apiserver pod source" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.899779 4843 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.909152 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.205:6443: connect: connection refused Mar 18 12:09:26 crc kubenswrapper[4843]: E0318 12:09:26.909321 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.205:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.909906 4843 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.205:6443: connect: connection refused
Mar 18 12:09:26 crc kubenswrapper[4843]: E0318 12:09:26.909991 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.205:6443: connect: connection refused" logger="UnhandledError"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.911911 4843 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.912854 4843 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.914446 4843 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.916119 4843 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.916149 4843 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.916159 4843 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.916168 4843 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.916184 4843 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.916194 4843 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.916204 4843 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.916219 4843 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.916232 4843 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.916242 4843 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.916256 4843 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.916502 4843 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.918030 4843 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.918437 4843 server.go:1280] "Started kubelet"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.919478 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.205:6443: connect: connection refused
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.919705 4843 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.920136 4843 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.920468 4843 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.920503 4843 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.920652 4843 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.920792 4843 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.920806 4843 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 18 12:09:26 crc kubenswrapper[4843]: E0318 12:09:26.920798 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 12:09:26 crc systemd[1]: Started Kubernetes Kubelet.
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.921219 4843 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 18 12:09:26 crc kubenswrapper[4843]: E0318 12:09:26.921272 4843 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.205:6443: connect: connection refused" interval="200ms"
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.921586 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.205:6443: connect: connection refused
Mar 18 12:09:26 crc kubenswrapper[4843]: E0318 12:09:26.921639 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.205:6443: connect: connection refused" logger="UnhandledError"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.922180 4843 factory.go:55] Registering systemd factory
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.922193 4843 factory.go:221] Registration of the systemd container factory successfully
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.922706 4843 factory.go:153] Registering CRI-O factory
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.922734 4843 factory.go:221] Registration of the crio container factory successfully
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.922795 4843 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.922819 4843 factory.go:103] Registering Raw factory
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.922836 4843 manager.go:1196] Started watching for new ooms in manager
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.923425 4843 manager.go:319] Starting recovery of all containers
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.923472 4843 server.go:460] "Adding debug handlers to kubelet server"
Mar 18 12:09:26 crc kubenswrapper[4843]: E0318 12:09:26.928608 4843 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.205:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189dee3b590df559 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.918411609 +0000 UTC m=+0.634237133,LastTimestamp:2026-03-18 12:09:26.918411609 +0000 UTC m=+0.634237133,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.938172 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.939146 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.939164 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.939177 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.939726 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.940093 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.940109 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.940126 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.940184 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.940200 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.940213 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.940226 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.940239 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.940256 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.940271 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.940283 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.940294 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.940307 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.940319 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.940333 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.940348 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.940365 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.940379 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.940395 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942407 4843 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942438 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942454 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942475 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942489 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942503 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942516 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942530 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942542 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942556 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942569 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942583 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942595 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942607 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942619 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942632 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942647 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942680 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942698 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942712 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942725 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942780 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942794 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942806 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942819 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942832 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942844 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942857 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942869 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942886 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942901 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942915 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942944 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.942961 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943007 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943024 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943038 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943051 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943067 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943079 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943091 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943105 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943119 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943133 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943147 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943163 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943177 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943191 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943205 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943219 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943232 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943248 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943263 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943277 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943290 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943303 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943317 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943330 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943344 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943359 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943373 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943387 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943401 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943416 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943430 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943445 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943458 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943474 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943489 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943502 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943517 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943531 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943547 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943563 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943577 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943590 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943612 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943627 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943642 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.943661 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944032 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944057 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944072 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944088 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944103 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944119 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944134 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944149 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944163 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944176 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944192 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944235 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944248 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944262 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944274 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944288 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944301 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944319 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944334 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944350 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944365 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944379 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944396 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944411 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944453 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944468 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944481 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944496 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944511 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944524 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944537 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944552 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944567 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" 
seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944583 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944596 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944611 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944624 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944640 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944655 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944704 4843 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944716 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944729 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944744 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944756 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944768 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944782 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944795 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944808 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944820 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944833 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944847 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944862 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944876 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944890 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944904 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944917 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944930 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944943 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944956 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944968 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944981 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.944995 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945009 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945022 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945035 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945049 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945062 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945085 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945099 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945110 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945123 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945136 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945148 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945160 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945171 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945187 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945199 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945212 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945224 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945237 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945254 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945267 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945280 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945295 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945308 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945322 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945337 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945349 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945360 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945373 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945386 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945398 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945410 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945425 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945438 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945451 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945463 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945476 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945487 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945501 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945512 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945525 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945538 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945552 4843 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945563 4843 reconstruct.go:97] "Volume reconstruction finished"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.945572 4843 reconciler.go:26] "Reconciler: start to sync state"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.947964 4843 manager.go:324] Recovery completed
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.957421 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.958874 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.958907 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.958918 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.963570 4843 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.963599 4843 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.963771 4843 state_mem.go:36] "Initialized new in-memory state store"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.979854 4843 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.980200 4843 policy_none.go:49] "None policy: Start"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.981294 4843 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.981324 4843 state_mem.go:35] "Initializing new in-memory state store"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.982303 4843 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.982375 4843 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 18 12:09:26 crc kubenswrapper[4843]: I0318 12:09:26.982460 4843 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 18 12:09:26 crc kubenswrapper[4843]: E0318 12:09:26.982543 4843 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 18 12:09:26 crc kubenswrapper[4843]: W0318 12:09:26.983860 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.205:6443: connect: connection refused
Mar 18 12:09:26 crc kubenswrapper[4843]: E0318 12:09:26.983918 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.205:6443: connect: connection refused" logger="UnhandledError"
Mar 18 12:09:27 crc kubenswrapper[4843]: E0318 12:09:27.021603 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.027542 4843 manager.go:334] "Starting Device Plugin manager"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.027601 4843 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.027614 4843 server.go:79] "Starting device plugin registration server"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.028011 4843 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.028027 4843 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.028612 4843 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.028737 4843 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.028747 4843 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 18 12:09:27 crc kubenswrapper[4843]: E0318 12:09:27.037124 4843 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.082993 4843 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.083122 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.084337 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.084391 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.084401 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.084538 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.085078 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.085142 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.085287 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.085307 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.085317 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.085433 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.085766 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.085836 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.086078 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.086113 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.086127 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.086157 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.086174 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.086183 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.086243 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.086713 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.086844 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.087037 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.087071 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.087074 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.087110 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.087123 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.087085 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.087343 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.087384 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.087352 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.088329 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.088359 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.088368 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.088404 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.088433 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.088447 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.088458 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.088476 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.088487 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.088659 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.088717 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.089331 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.089367 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.089379 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:09:27 crc kubenswrapper[4843]: E0318 12:09:27.122493 4843 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.205:6443: connect: connection refused" interval="400ms"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.129084 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.130273 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.130311 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.130324 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.130351 4843 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: E0318 12:09:27.131040 4843 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.205:6443: connect: connection refused" node="crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.147088 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.147381 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.147623 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.147766 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.147875 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.147990 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.148107 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.148207 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.148309 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.148399 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.148551 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.148649 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.148769 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.148852 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.148942 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.250577 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.250684 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.250717 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.250745 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.250776 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.250804 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.250834 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.250862 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.250868 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.250893 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.250882 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.250952 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.250971 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.250978 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.251020 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.251022 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.250941 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.250924 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.250989 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.250979 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.251086 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.251119 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.251154 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.251252 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.251247 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.251327 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.251373 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.251391 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.251330 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.251408 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.331786 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.332770 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.332797 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.332809 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.332842 4843 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 12:09:27 crc kubenswrapper[4843]:
E0318 12:09:27.333255 4843 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.205:6443: connect: connection refused" node="crc" Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.414259 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.423507 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.443842 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.462104 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.467350 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:09:27 crc kubenswrapper[4843]: E0318 12:09:27.523456 4843 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.205:6443: connect: connection refused" interval="800ms" Mar 18 12:09:27 crc kubenswrapper[4843]: W0318 12:09:27.543901 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-199a7f35f9a124f2657b8b23160753af4d99c23bc4c1dd6b7df9790c8a2005fa WatchSource:0}: Error finding container 199a7f35f9a124f2657b8b23160753af4d99c23bc4c1dd6b7df9790c8a2005fa: Status 404 returned error can't find the container with id 199a7f35f9a124f2657b8b23160753af4d99c23bc4c1dd6b7df9790c8a2005fa Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.734328 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.736255 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.736284 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.736297 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.736321 4843 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:09:27 crc kubenswrapper[4843]: E0318 12:09:27.736964 4843 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.205:6443: 
connect: connection refused" node="crc" Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.920365 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.205:6443: connect: connection refused Mar 18 12:09:27 crc kubenswrapper[4843]: W0318 12:09:27.972242 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.205:6443: connect: connection refused Mar 18 12:09:27 crc kubenswrapper[4843]: E0318 12:09:27.972322 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.205:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.988929 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6a0eacfb65893e399f5c668164da48a52e92676790b80bd5c54c917e50360234"} Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.990640 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"580e1627b34f68031ec26b4112d034a42b115c4fdeebf49a4ab5cac7d5d5ebef"} Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.991589 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"199a7f35f9a124f2657b8b23160753af4d99c23bc4c1dd6b7df9790c8a2005fa"} Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.993001 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"57c1399d88a34fbf963684549add496f4c1a655552e9529f4449bfcf51c4bc0e"} Mar 18 12:09:27 crc kubenswrapper[4843]: I0318 12:09:27.993821 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a40f46e95faf0c58a20b5f36daae0f0641ad15ed06454dc866b50644c776a832"} Mar 18 12:09:28 crc kubenswrapper[4843]: W0318 12:09:28.040398 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.205:6443: connect: connection refused Mar 18 12:09:28 crc kubenswrapper[4843]: E0318 12:09:28.040483 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.205:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:28 crc kubenswrapper[4843]: W0318 12:09:28.226094 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.205:6443: connect: connection refused Mar 18 12:09:28 crc kubenswrapper[4843]: E0318 12:09:28.226263 4843 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.205:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:28 crc kubenswrapper[4843]: W0318 12:09:28.255594 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.205:6443: connect: connection refused Mar 18 12:09:28 crc kubenswrapper[4843]: E0318 12:09:28.255743 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.205:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:28 crc kubenswrapper[4843]: E0318 12:09:28.325310 4843 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.205:6443: connect: connection refused" interval="1.6s" Mar 18 12:09:28 crc kubenswrapper[4843]: I0318 12:09:28.537362 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:28 crc kubenswrapper[4843]: I0318 12:09:28.538598 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:28 crc kubenswrapper[4843]: I0318 12:09:28.538641 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:28 crc kubenswrapper[4843]: I0318 12:09:28.538675 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 18 12:09:28 crc kubenswrapper[4843]: I0318 12:09:28.538707 4843 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:09:28 crc kubenswrapper[4843]: E0318 12:09:28.539494 4843 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.205:6443: connect: connection refused" node="crc" Mar 18 12:09:28 crc kubenswrapper[4843]: I0318 12:09:28.796694 4843 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 12:09:28 crc kubenswrapper[4843]: E0318 12:09:28.797953 4843 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.205:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:28 crc kubenswrapper[4843]: I0318 12:09:28.921060 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.205:6443: connect: connection refused Mar 18 12:09:28 crc kubenswrapper[4843]: E0318 12:09:28.979856 4843 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.205:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189dee3b590df559 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.918411609 +0000 UTC 
m=+0.634237133,LastTimestamp:2026-03-18 12:09:26.918411609 +0000 UTC m=+0.634237133,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:09:28 crc kubenswrapper[4843]: I0318 12:09:28.998392 4843 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5" exitCode=0 Mar 18 12:09:28 crc kubenswrapper[4843]: I0318 12:09:28.998472 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5"} Mar 18 12:09:28 crc kubenswrapper[4843]: I0318 12:09:28.998552 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:28 crc kubenswrapper[4843]: I0318 12:09:28.999750 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:28 crc kubenswrapper[4843]: I0318 12:09:28.999785 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:28 crc kubenswrapper[4843]: I0318 12:09:28.999798 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.000893 4843 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03" exitCode=0 Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.000974 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03"} Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.000988 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.001953 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.001989 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.002000 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.003408 4843 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c" exitCode=0 Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.003466 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c"} Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.003525 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.004599 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.004624 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.004632 4843 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.007572 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5619a8ca389c080d33e727a262f73509b406e3fadd6794735ac7d709b38a52e8"} Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.007598 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"530d48dac592d94c55bd319d7710b45b9b7b0c0503a95fdbc2617047899996a0"} Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.007608 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3ed3869ae559a4cee81da335653397b082fe3822bbdbe675a2d5890fed3a97fc"} Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.007616 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"07fbcf3bceb61d275b47d4d252105653eb996aa8158391b964025a10ba90a75e"} Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.007682 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.008812 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.008919 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 
12:09:29.008941 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.009836 4843 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8" exitCode=0 Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.009883 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8"} Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.009935 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.010879 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.010907 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.010918 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.016483 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.017948 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.017974 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.017986 4843 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:29 crc kubenswrapper[4843]: I0318 12:09:29.920565 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.205:6443: connect: connection refused Mar 18 12:09:29 crc kubenswrapper[4843]: E0318 12:09:29.926672 4843 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.205:6443: connect: connection refused" interval="3.2s" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.017256 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e"} Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.017315 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889"} Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.017331 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619"} Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.019478 4843 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede" exitCode=0 Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.019634 4843 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede"} Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.019706 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.020683 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.020709 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.020719 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.022572 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a66d775ddd85f583bb387bee13d76b0cbbf7e4e045db6976fbade6d2e66bfdc6"} Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.022591 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.023874 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.023939 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.023953 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.025959 4843 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad"} Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.025985 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96"} Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.025998 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d"} Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.026030 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.026041 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.026857 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.026889 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.026899 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.026926 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.026946 4843 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.026954 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.140205 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.141366 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.141412 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.141427 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.141451 4843 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:09:30 crc kubenswrapper[4843]: E0318 12:09:30.141992 4843 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.205:6443: connect: connection refused" node="crc" Mar 18 12:09:30 crc kubenswrapper[4843]: W0318 12:09:30.229897 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.205:6443: connect: connection refused Mar 18 12:09:30 crc kubenswrapper[4843]: E0318 12:09:30.229968 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.205:6443: connect: 
connection refused" logger="UnhandledError" Mar 18 12:09:30 crc kubenswrapper[4843]: W0318 12:09:30.455789 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.205:6443: connect: connection refused Mar 18 12:09:30 crc kubenswrapper[4843]: E0318 12:09:30.455863 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.205:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:30 crc kubenswrapper[4843]: W0318 12:09:30.811376 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.205:6443: connect: connection refused Mar 18 12:09:30 crc kubenswrapper[4843]: E0318 12:09:30.811556 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.205:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:30 crc kubenswrapper[4843]: I0318 12:09:30.920735 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.205:6443: connect: connection refused Mar 18 12:09:31 crc kubenswrapper[4843]: W0318 12:09:31.029567 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.205:6443: connect: connection refused Mar 18 12:09:31 crc kubenswrapper[4843]: E0318 12:09:31.029712 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.205:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.030808 4843 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30" exitCode=0 Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.030870 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30"} Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.030948 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.031754 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.031791 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.031804 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.034224 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1acc8a0324bd5325864fda579f8dedb74a0cdc6339aa756d1818b38a6489e6fb"} Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.034285 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05"} Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.034251 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.034308 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.034313 4843 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.034359 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.037884 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.037952 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.037973 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.037970 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.038032 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:31 crc 
kubenswrapper[4843]: I0318 12:09:31.038054 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.038272 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.038321 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.038336 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.310701 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.549329 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.858500 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.858860 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.860350 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.860400 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:31 crc kubenswrapper[4843]: I0318 12:09:31.860411 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:32 crc kubenswrapper[4843]: I0318 12:09:32.019004 4843 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:32 crc kubenswrapper[4843]: I0318 12:09:32.041041 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:32 crc kubenswrapper[4843]: I0318 12:09:32.041297 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59"} Mar 18 12:09:32 crc kubenswrapper[4843]: I0318 12:09:32.041335 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8"} Mar 18 12:09:32 crc kubenswrapper[4843]: I0318 12:09:32.041352 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87"} Mar 18 12:09:32 crc kubenswrapper[4843]: I0318 12:09:32.041363 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6"} Mar 18 12:09:32 crc kubenswrapper[4843]: I0318 12:09:32.041449 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:32 crc kubenswrapper[4843]: I0318 12:09:32.041970 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:32 crc kubenswrapper[4843]: I0318 12:09:32.041988 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:32 crc 
kubenswrapper[4843]: I0318 12:09:32.041996 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:32 crc kubenswrapper[4843]: I0318 12:09:32.042538 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:32 crc kubenswrapper[4843]: I0318 12:09:32.042558 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:32 crc kubenswrapper[4843]: I0318 12:09:32.042568 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:32 crc kubenswrapper[4843]: I0318 12:09:32.652597 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:33 crc kubenswrapper[4843]: I0318 12:09:33.014716 4843 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 12:09:33 crc kubenswrapper[4843]: I0318 12:09:33.048337 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5"} Mar 18 12:09:33 crc kubenswrapper[4843]: I0318 12:09:33.048417 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:33 crc kubenswrapper[4843]: I0318 12:09:33.048558 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:33 crc kubenswrapper[4843]: I0318 12:09:33.048978 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:33 crc kubenswrapper[4843]: I0318 12:09:33.050023 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 18 12:09:33 crc kubenswrapper[4843]: I0318 12:09:33.050051 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:33 crc kubenswrapper[4843]: I0318 12:09:33.050061 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:33 crc kubenswrapper[4843]: I0318 12:09:33.050202 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:33 crc kubenswrapper[4843]: I0318 12:09:33.050226 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:33 crc kubenswrapper[4843]: I0318 12:09:33.050239 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:33 crc kubenswrapper[4843]: I0318 12:09:33.050377 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:33 crc kubenswrapper[4843]: I0318 12:09:33.050420 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:33 crc kubenswrapper[4843]: I0318 12:09:33.050442 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:33 crc kubenswrapper[4843]: I0318 12:09:33.343058 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:33 crc kubenswrapper[4843]: I0318 12:09:33.344687 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:33 crc kubenswrapper[4843]: I0318 12:09:33.344738 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:33 crc kubenswrapper[4843]: I0318 12:09:33.344748 4843 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:33 crc kubenswrapper[4843]: I0318 12:09:33.344775 4843 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:09:34 crc kubenswrapper[4843]: I0318 12:09:34.051905 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:34 crc kubenswrapper[4843]: I0318 12:09:34.052803 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:34 crc kubenswrapper[4843]: I0318 12:09:34.052832 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:34 crc kubenswrapper[4843]: I0318 12:09:34.052841 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:35 crc kubenswrapper[4843]: I0318 12:09:35.489766 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 18 12:09:35 crc kubenswrapper[4843]: I0318 12:09:35.489977 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:35 crc kubenswrapper[4843]: I0318 12:09:35.491350 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:35 crc kubenswrapper[4843]: I0318 12:09:35.491387 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:35 crc kubenswrapper[4843]: I0318 12:09:35.491399 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:35 crc kubenswrapper[4843]: I0318 12:09:35.653475 4843 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 12:09:35 crc kubenswrapper[4843]: I0318 12:09:35.653572 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 12:09:35 crc kubenswrapper[4843]: I0318 12:09:35.896078 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:35 crc kubenswrapper[4843]: I0318 12:09:35.896284 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:35 crc kubenswrapper[4843]: I0318 12:09:35.897511 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:35 crc kubenswrapper[4843]: I0318 12:09:35.897564 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:35 crc kubenswrapper[4843]: I0318 12:09:35.897575 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:36 crc kubenswrapper[4843]: I0318 12:09:36.184875 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:36 crc kubenswrapper[4843]: I0318 12:09:36.185781 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:36 crc kubenswrapper[4843]: I0318 12:09:36.186895 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:36 crc 
kubenswrapper[4843]: I0318 12:09:36.187000 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:36 crc kubenswrapper[4843]: I0318 12:09:36.187071 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:36 crc kubenswrapper[4843]: I0318 12:09:36.191197 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:36 crc kubenswrapper[4843]: I0318 12:09:36.210643 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 18 12:09:36 crc kubenswrapper[4843]: I0318 12:09:36.210953 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:36 crc kubenswrapper[4843]: I0318 12:09:36.212205 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:36 crc kubenswrapper[4843]: I0318 12:09:36.212246 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:36 crc kubenswrapper[4843]: I0318 12:09:36.212269 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:37 crc kubenswrapper[4843]: E0318 12:09:37.037207 4843 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:09:37 crc kubenswrapper[4843]: I0318 12:09:37.060132 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:37 crc kubenswrapper[4843]: I0318 12:09:37.061288 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:37 crc kubenswrapper[4843]: I0318 12:09:37.061391 4843 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:37 crc kubenswrapper[4843]: I0318 12:09:37.061407 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:37 crc kubenswrapper[4843]: I0318 12:09:37.432589 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:09:37 crc kubenswrapper[4843]: I0318 12:09:37.432906 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:37 crc kubenswrapper[4843]: I0318 12:09:37.434409 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:37 crc kubenswrapper[4843]: I0318 12:09:37.434472 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:37 crc kubenswrapper[4843]: I0318 12:09:37.434486 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:41 crc kubenswrapper[4843]: I0318 12:09:41.866577 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:41 crc kubenswrapper[4843]: I0318 12:09:41.866815 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:41 crc kubenswrapper[4843]: I0318 12:09:41.868252 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:41 crc kubenswrapper[4843]: I0318 12:09:41.868343 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:41 crc kubenswrapper[4843]: I0318 12:09:41.868364 4843 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 18 12:09:41 crc kubenswrapper[4843]: I0318 12:09:41.932124 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 18 12:09:41 crc kubenswrapper[4843]: I0318 12:09:41.957915 4843 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:55910->192.168.126.11:17697: read: connection reset by peer" start-of-body= Mar 18 12:09:41 crc kubenswrapper[4843]: I0318 12:09:41.958019 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:55910->192.168.126.11:17697: read: connection reset by peer" Mar 18 12:09:42 crc kubenswrapper[4843]: I0318 12:09:42.074044 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 12:09:42 crc kubenswrapper[4843]: I0318 12:09:42.076287 4843 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1acc8a0324bd5325864fda579f8dedb74a0cdc6339aa756d1818b38a6489e6fb" exitCode=255 Mar 18 12:09:42 crc kubenswrapper[4843]: I0318 12:09:42.076342 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1acc8a0324bd5325864fda579f8dedb74a0cdc6339aa756d1818b38a6489e6fb"} Mar 18 12:09:42 crc kubenswrapper[4843]: I0318 12:09:42.076522 
4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:42 crc kubenswrapper[4843]: I0318 12:09:42.077472 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:42 crc kubenswrapper[4843]: I0318 12:09:42.077514 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:42 crc kubenswrapper[4843]: I0318 12:09:42.077526 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:42 crc kubenswrapper[4843]: I0318 12:09:42.078332 4843 scope.go:117] "RemoveContainer" containerID="1acc8a0324bd5325864fda579f8dedb74a0cdc6339aa756d1818b38a6489e6fb" Mar 18 12:09:43 crc kubenswrapper[4843]: E0318 12:09:43.016175 4843 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 12:09:43 crc kubenswrapper[4843]: I0318 12:09:43.081508 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 12:09:43 crc kubenswrapper[4843]: I0318 12:09:43.083507 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b2bc79eb3c4fe9fa984376dc198372b7f867f67827992c450e062398d73875ef"} Mar 18 12:09:43 crc kubenswrapper[4843]: I0318 12:09:43.083684 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:43 crc kubenswrapper[4843]: I0318 
12:09:43.084400 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:43 crc kubenswrapper[4843]: I0318 12:09:43.084432 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:43 crc kubenswrapper[4843]: I0318 12:09:43.084441 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:43 crc kubenswrapper[4843]: E0318 12:09:43.128499 4843 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Mar 18 12:09:43 crc kubenswrapper[4843]: E0318 12:09:43.359019 4843 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Mar 18 12:09:43 crc kubenswrapper[4843]: W0318 12:09:43.776939 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:43Z is after 2026-02-23T05:33:13Z Mar 18 12:09:43 crc kubenswrapper[4843]: E0318 12:09:43.777024 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:43Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Mar 18 12:09:43 crc kubenswrapper[4843]: W0318 12:09:43.779118 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:43Z is after 2026-02-23T05:33:13Z Mar 18 12:09:43 crc kubenswrapper[4843]: E0318 12:09:43.779191 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:43Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:09:43 crc kubenswrapper[4843]: E0318 12:09:43.782650 4843 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:43Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189dee3b590df559 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.918411609 +0000 UTC m=+0.634237133,LastTimestamp:2026-03-18 12:09:26.918411609 +0000 UTC m=+0.634237133,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:09:43 crc kubenswrapper[4843]: W0318 12:09:43.783733 4843 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:43Z is after 2026-02-23T05:33:13Z Mar 18 12:09:43 crc kubenswrapper[4843]: E0318 12:09:43.783776 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:43Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:09:43 crc kubenswrapper[4843]: W0318 12:09:43.886283 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:43Z is after 2026-02-23T05:33:13Z Mar 18 12:09:43 crc kubenswrapper[4843]: E0318 12:09:43.886351 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:43Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:09:43 crc kubenswrapper[4843]: I0318 12:09:43.892042 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:43Z is after 2026-02-23T05:33:13Z Mar 18 12:09:43 crc kubenswrapper[4843]: I0318 12:09:43.896898 4843 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 12:09:43 crc kubenswrapper[4843]: I0318 12:09:43.896954 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 12:09:43 crc kubenswrapper[4843]: I0318 12:09:43.902142 4843 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 12:09:43 crc kubenswrapper[4843]: I0318 12:09:43.902194 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 12:09:43 crc kubenswrapper[4843]: I0318 12:09:43.923041 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:43Z is after 2026-02-23T05:33:13Z Mar 18 12:09:45 crc kubenswrapper[4843]: I0318 12:09:45.055926 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:45Z is after 2026-02-23T05:33:13Z Mar 18 12:09:45 crc kubenswrapper[4843]: I0318 12:09:45.089783 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 12:09:45 crc kubenswrapper[4843]: I0318 12:09:45.090283 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 12:09:45 crc kubenswrapper[4843]: I0318 12:09:45.092210 4843 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b2bc79eb3c4fe9fa984376dc198372b7f867f67827992c450e062398d73875ef" exitCode=255 Mar 18 12:09:45 crc kubenswrapper[4843]: I0318 12:09:45.092272 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b2bc79eb3c4fe9fa984376dc198372b7f867f67827992c450e062398d73875ef"} Mar 18 12:09:45 crc kubenswrapper[4843]: I0318 12:09:45.092328 4843 scope.go:117] "RemoveContainer" containerID="1acc8a0324bd5325864fda579f8dedb74a0cdc6339aa756d1818b38a6489e6fb" Mar 18 12:09:45 crc kubenswrapper[4843]: I0318 12:09:45.092481 4843 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 18 12:09:45 crc kubenswrapper[4843]: I0318 12:09:45.093630 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:45 crc kubenswrapper[4843]: I0318 12:09:45.093666 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:45 crc kubenswrapper[4843]: I0318 12:09:45.093675 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:45 crc kubenswrapper[4843]: I0318 12:09:45.094197 4843 scope.go:117] "RemoveContainer" containerID="b2bc79eb3c4fe9fa984376dc198372b7f867f67827992c450e062398d73875ef" Mar 18 12:09:45 crc kubenswrapper[4843]: E0318 12:09:45.094352 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:09:45 crc kubenswrapper[4843]: I0318 12:09:45.654076 4843 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 12:09:45 crc kubenswrapper[4843]: I0318 12:09:45.654198 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting 
for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 12:09:45 crc kubenswrapper[4843]: I0318 12:09:45.903903 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:45 crc kubenswrapper[4843]: I0318 12:09:45.922741 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:45Z is after 2026-02-23T05:33:13Z Mar 18 12:09:46 crc kubenswrapper[4843]: I0318 12:09:46.096829 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 12:09:46 crc kubenswrapper[4843]: I0318 12:09:46.099068 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:46 crc kubenswrapper[4843]: I0318 12:09:46.100076 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:46 crc kubenswrapper[4843]: I0318 12:09:46.100127 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:46 crc kubenswrapper[4843]: I0318 12:09:46.100140 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:46 crc kubenswrapper[4843]: I0318 12:09:46.102942 4843 scope.go:117] "RemoveContainer" containerID="b2bc79eb3c4fe9fa984376dc198372b7f867f67827992c450e062398d73875ef" Mar 18 12:09:46 crc kubenswrapper[4843]: E0318 12:09:46.103157 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:09:46 crc kubenswrapper[4843]: I0318 12:09:46.104279 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:46 crc kubenswrapper[4843]: I0318 12:09:46.242537 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 18 12:09:46 crc kubenswrapper[4843]: I0318 12:09:46.242752 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:46 crc kubenswrapper[4843]: I0318 12:09:46.243829 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:46 crc kubenswrapper[4843]: I0318 12:09:46.243908 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:46 crc kubenswrapper[4843]: I0318 12:09:46.243926 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:46 crc kubenswrapper[4843]: I0318 12:09:46.259299 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 18 12:09:46 crc kubenswrapper[4843]: I0318 12:09:46.922443 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:46Z is after 2026-02-23T05:33:13Z Mar 18 12:09:47 crc kubenswrapper[4843]: E0318 12:09:47.037287 4843 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node 
\"crc\" not found" Mar 18 12:09:47 crc kubenswrapper[4843]: I0318 12:09:47.101312 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:47 crc kubenswrapper[4843]: I0318 12:09:47.101363 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:47 crc kubenswrapper[4843]: I0318 12:09:47.102348 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:47 crc kubenswrapper[4843]: I0318 12:09:47.102435 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:47 crc kubenswrapper[4843]: I0318 12:09:47.102498 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:47 crc kubenswrapper[4843]: I0318 12:09:47.102361 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:47 crc kubenswrapper[4843]: I0318 12:09:47.102726 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:47 crc kubenswrapper[4843]: I0318 12:09:47.102762 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:47 crc kubenswrapper[4843]: I0318 12:09:47.103478 4843 scope.go:117] "RemoveContainer" containerID="b2bc79eb3c4fe9fa984376dc198372b7f867f67827992c450e062398d73875ef" Mar 18 12:09:47 crc kubenswrapper[4843]: E0318 12:09:47.103670 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:09:47 crc kubenswrapper[4843]: I0318 12:09:47.923737 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:47Z is after 2026-02-23T05:33:13Z Mar 18 12:09:48 crc kubenswrapper[4843]: I0318 12:09:48.062330 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:48 crc kubenswrapper[4843]: I0318 12:09:48.103287 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:48 crc kubenswrapper[4843]: I0318 12:09:48.104150 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:48 crc kubenswrapper[4843]: I0318 12:09:48.104248 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:48 crc kubenswrapper[4843]: I0318 12:09:48.104351 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:48 crc kubenswrapper[4843]: I0318 12:09:48.104950 4843 scope.go:117] "RemoveContainer" containerID="b2bc79eb3c4fe9fa984376dc198372b7f867f67827992c450e062398d73875ef" Mar 18 12:09:48 crc kubenswrapper[4843]: E0318 12:09:48.105188 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:09:48 crc kubenswrapper[4843]: 
I0318 12:09:48.927058 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:48Z is after 2026-02-23T05:33:13Z Mar 18 12:09:49 crc kubenswrapper[4843]: E0318 12:09:49.531596 4843 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:49Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 12:09:49 crc kubenswrapper[4843]: I0318 12:09:49.759458 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:49 crc kubenswrapper[4843]: I0318 12:09:49.760730 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:49 crc kubenswrapper[4843]: I0318 12:09:49.760921 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:49 crc kubenswrapper[4843]: I0318 12:09:49.761109 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:49 crc kubenswrapper[4843]: I0318 12:09:49.761292 4843 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:09:49 crc kubenswrapper[4843]: E0318 12:09:49.764611 4843 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:49Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 12:09:49 crc kubenswrapper[4843]: 
I0318 12:09:49.922648 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:49Z is after 2026-02-23T05:33:13Z Mar 18 12:09:50 crc kubenswrapper[4843]: W0318 12:09:50.286237 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:50Z is after 2026-02-23T05:33:13Z Mar 18 12:09:50 crc kubenswrapper[4843]: E0318 12:09:50.286344 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:09:50 crc kubenswrapper[4843]: W0318 12:09:50.445007 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:50Z is after 2026-02-23T05:33:13Z Mar 18 12:09:50 crc kubenswrapper[4843]: E0318 12:09:50.445123 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:09:50 crc kubenswrapper[4843]: I0318 12:09:50.924965 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:50Z is after 2026-02-23T05:33:13Z Mar 18 12:09:51 crc kubenswrapper[4843]: I0318 12:09:51.311694 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:51 crc kubenswrapper[4843]: I0318 12:09:51.312349 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:51 crc kubenswrapper[4843]: I0318 12:09:51.314109 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:51 crc kubenswrapper[4843]: I0318 12:09:51.314206 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:51 crc kubenswrapper[4843]: I0318 12:09:51.314232 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:51 crc kubenswrapper[4843]: I0318 12:09:51.315377 4843 scope.go:117] "RemoveContainer" containerID="b2bc79eb3c4fe9fa984376dc198372b7f867f67827992c450e062398d73875ef" Mar 18 12:09:51 crc kubenswrapper[4843]: E0318 12:09:51.315825 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:09:51 crc kubenswrapper[4843]: I0318 12:09:51.509571 4843 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 12:09:51 crc kubenswrapper[4843]: E0318 12:09:51.513135 4843 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:51Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:09:51 crc kubenswrapper[4843]: I0318 12:09:51.923784 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:51Z is after 2026-02-23T05:33:13Z Mar 18 12:09:52 crc kubenswrapper[4843]: W0318 12:09:52.583036 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:52Z is after 2026-02-23T05:33:13Z Mar 18 12:09:52 crc kubenswrapper[4843]: E0318 12:09:52.583163 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:52Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:09:52 crc kubenswrapper[4843]: I0318 12:09:52.923966 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:52Z is after 2026-02-23T05:33:13Z Mar 18 12:09:53 crc kubenswrapper[4843]: E0318 12:09:53.786559 4843 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:53Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189dee3b590df559 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.918411609 +0000 UTC m=+0.634237133,LastTimestamp:2026-03-18 12:09:26.918411609 +0000 UTC m=+0.634237133,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:09:53 crc kubenswrapper[4843]: I0318 12:09:53.922691 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:53Z is 
after 2026-02-23T05:33:13Z Mar 18 12:09:54 crc kubenswrapper[4843]: W0318 12:09:54.739136 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:54Z is after 2026-02-23T05:33:13Z Mar 18 12:09:54 crc kubenswrapper[4843]: E0318 12:09:54.739208 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:09:54 crc kubenswrapper[4843]: I0318 12:09:54.923996 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:54Z is after 2026-02-23T05:33:13Z Mar 18 12:09:55 crc kubenswrapper[4843]: I0318 12:09:55.653169 4843 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 12:09:55 crc kubenswrapper[4843]: I0318 12:09:55.653275 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 12:09:55 crc kubenswrapper[4843]: I0318 12:09:55.653371 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:55 crc kubenswrapper[4843]: I0318 12:09:55.653788 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:55 crc kubenswrapper[4843]: I0318 12:09:55.659604 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:55 crc kubenswrapper[4843]: I0318 12:09:55.659736 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:55 crc kubenswrapper[4843]: I0318 12:09:55.659761 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:55 crc kubenswrapper[4843]: I0318 12:09:55.660839 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"3ed3869ae559a4cee81da335653397b082fe3822bbdbe675a2d5890fed3a97fc"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 18 12:09:55 crc kubenswrapper[4843]: I0318 12:09:55.661155 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://3ed3869ae559a4cee81da335653397b082fe3822bbdbe675a2d5890fed3a97fc" gracePeriod=30 Mar 18 12:09:55 crc kubenswrapper[4843]: I0318 
12:09:55.923499 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:55Z is after 2026-02-23T05:33:13Z Mar 18 12:09:56 crc kubenswrapper[4843]: I0318 12:09:56.126614 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 12:09:56 crc kubenswrapper[4843]: I0318 12:09:56.127039 4843 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="3ed3869ae559a4cee81da335653397b082fe3822bbdbe675a2d5890fed3a97fc" exitCode=255 Mar 18 12:09:56 crc kubenswrapper[4843]: I0318 12:09:56.127078 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"3ed3869ae559a4cee81da335653397b082fe3822bbdbe675a2d5890fed3a97fc"} Mar 18 12:09:56 crc kubenswrapper[4843]: I0318 12:09:56.127110 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4594141b60aa4fc27e9141ee0e03ce44500fc87853300a481b7538bfe47775e9"} Mar 18 12:09:56 crc kubenswrapper[4843]: I0318 12:09:56.127222 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:56 crc kubenswrapper[4843]: I0318 12:09:56.128068 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:56 crc kubenswrapper[4843]: I0318 12:09:56.128097 4843 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:56 crc kubenswrapper[4843]: I0318 12:09:56.128107 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:56 crc kubenswrapper[4843]: E0318 12:09:56.536552 4843 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:56Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 12:09:56 crc kubenswrapper[4843]: I0318 12:09:56.765374 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:56 crc kubenswrapper[4843]: I0318 12:09:56.766942 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:56 crc kubenswrapper[4843]: I0318 12:09:56.766988 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:56 crc kubenswrapper[4843]: I0318 12:09:56.767002 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:56 crc kubenswrapper[4843]: I0318 12:09:56.767030 4843 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:09:56 crc kubenswrapper[4843]: E0318 12:09:56.770379 4843 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:56Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 12:09:56 crc kubenswrapper[4843]: I0318 12:09:56.923313 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:56Z is after 2026-02-23T05:33:13Z Mar 18 12:09:57 crc kubenswrapper[4843]: E0318 12:09:57.037769 4843 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:09:57 crc kubenswrapper[4843]: I0318 12:09:57.129561 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:57 crc kubenswrapper[4843]: I0318 12:09:57.130336 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:57 crc kubenswrapper[4843]: I0318 12:09:57.130378 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:57 crc kubenswrapper[4843]: I0318 12:09:57.130395 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:57 crc kubenswrapper[4843]: I0318 12:09:57.922746 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:57Z is after 2026-02-23T05:33:13Z Mar 18 12:09:58 crc kubenswrapper[4843]: I0318 12:09:58.924829 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:09:59 crc kubenswrapper[4843]: I0318 12:09:59.926711 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" 
is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:00 crc kubenswrapper[4843]: I0318 12:10:00.924455 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:01 crc kubenswrapper[4843]: I0318 12:10:01.924623 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:01 crc kubenswrapper[4843]: I0318 12:10:01.983232 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:01 crc kubenswrapper[4843]: I0318 12:10:01.984306 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:01 crc kubenswrapper[4843]: I0318 12:10:01.984362 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:01 crc kubenswrapper[4843]: I0318 12:10:01.984377 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:01 crc kubenswrapper[4843]: I0318 12:10:01.984862 4843 scope.go:117] "RemoveContainer" containerID="b2bc79eb3c4fe9fa984376dc198372b7f867f67827992c450e062398d73875ef" Mar 18 12:10:02 crc kubenswrapper[4843]: I0318 12:10:02.019224 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:02 crc kubenswrapper[4843]: I0318 12:10:02.019396 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:02 crc kubenswrapper[4843]: I0318 
12:10:02.020416 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:02 crc kubenswrapper[4843]: I0318 12:10:02.020450 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:02 crc kubenswrapper[4843]: I0318 12:10:02.020461 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:02 crc kubenswrapper[4843]: I0318 12:10:02.653384 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:02 crc kubenswrapper[4843]: I0318 12:10:02.653632 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:02 crc kubenswrapper[4843]: I0318 12:10:02.655045 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:02 crc kubenswrapper[4843]: I0318 12:10:02.655161 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:02 crc kubenswrapper[4843]: I0318 12:10:02.655233 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:02 crc kubenswrapper[4843]: I0318 12:10:02.925365 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:03 crc kubenswrapper[4843]: I0318 12:10:03.148679 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 12:10:03 crc kubenswrapper[4843]: I0318 12:10:03.149363 4843 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 12:10:03 crc kubenswrapper[4843]: I0318 12:10:03.151429 4843 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fc4a50c97ac7bb07b9ef9913807c2e886819464af1674eb59ec2d21c0127160d" exitCode=255 Mar 18 12:10:03 crc kubenswrapper[4843]: I0318 12:10:03.151471 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fc4a50c97ac7bb07b9ef9913807c2e886819464af1674eb59ec2d21c0127160d"} Mar 18 12:10:03 crc kubenswrapper[4843]: I0318 12:10:03.151518 4843 scope.go:117] "RemoveContainer" containerID="b2bc79eb3c4fe9fa984376dc198372b7f867f67827992c450e062398d73875ef" Mar 18 12:10:03 crc kubenswrapper[4843]: I0318 12:10:03.151638 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:03 crc kubenswrapper[4843]: I0318 12:10:03.152597 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:03 crc kubenswrapper[4843]: I0318 12:10:03.152728 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:03 crc kubenswrapper[4843]: I0318 12:10:03.152826 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:03 crc kubenswrapper[4843]: I0318 12:10:03.154097 4843 scope.go:117] "RemoveContainer" containerID="fc4a50c97ac7bb07b9ef9913807c2e886819464af1674eb59ec2d21c0127160d" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.154401 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.543145 4843 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 12:10:03 crc kubenswrapper[4843]: I0318 12:10:03.771234 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:03 crc kubenswrapper[4843]: I0318 12:10:03.773074 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:03 crc kubenswrapper[4843]: I0318 12:10:03.773110 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:03 crc kubenswrapper[4843]: I0318 12:10:03.773122 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:03 crc kubenswrapper[4843]: I0318 12:10:03.773156 4843 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.778936 4843 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.791880 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b590df559 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] 
[] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.918411609 +0000 UTC m=+0.634237133,LastTimestamp:2026-03-18 12:09:26.918411609 +0000 UTC m=+0.634237133,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.792928 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b77bfff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.958899199 +0000 UTC m=+0.674724723,LastTimestamp:2026-03-18 12:09:26.958899199 +0000 UTC m=+0.674724723,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.796486 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b77f7e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.958913509 +0000 UTC m=+0.674739033,LastTimestamp:2026-03-18 12:09:26.958913509 +0000 UTC m=+0.674739033,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.800747 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b781e4c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.95892334 +0000 UTC m=+0.674748864,LastTimestamp:2026-03-18 12:09:26.95892334 +0000 UTC m=+0.674748864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.805388 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5fbb80a4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:27.030448292 +0000 UTC m=+0.746273806,LastTimestamp:2026-03-18 12:09:27.030448292 +0000 UTC 
m=+0.746273806,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.809775 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3b5b77bfff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b77bfff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.958899199 +0000 UTC m=+0.674724723,LastTimestamp:2026-03-18 12:09:27.084380174 +0000 UTC m=+0.800205698,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.815851 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3b5b77f7e5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b77f7e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.958913509 +0000 UTC m=+0.674739033,LastTimestamp:2026-03-18 12:09:27.084397925 +0000 UTC m=+0.800223449,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.822458 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3b5b781e4c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b781e4c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.95892334 +0000 UTC m=+0.674748864,LastTimestamp:2026-03-18 12:09:27.084405615 +0000 UTC m=+0.800231139,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.827685 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3b5b77bfff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b77bfff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.958899199 +0000 UTC m=+0.674724723,LastTimestamp:2026-03-18 12:09:27.085300147 +0000 UTC m=+0.801125671,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.832637 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3b5b77f7e5\" is 
forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b77f7e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.958913509 +0000 UTC m=+0.674739033,LastTimestamp:2026-03-18 12:09:27.085313548 +0000 UTC m=+0.801139072,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.837518 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3b5b781e4c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b781e4c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.95892334 +0000 UTC m=+0.674748864,LastTimestamp:2026-03-18 12:09:27.085323868 +0000 UTC m=+0.801149392,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.841837 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3b5b77bfff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b77bfff 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.958899199 +0000 UTC m=+0.674724723,LastTimestamp:2026-03-18 12:09:27.086091265 +0000 UTC m=+0.801916789,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.846816 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3b5b77f7e5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b77f7e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.958913509 +0000 UTC m=+0.674739033,LastTimestamp:2026-03-18 12:09:27.086120296 +0000 UTC m=+0.801945820,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.852476 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3b5b781e4c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b781e4c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.95892334 +0000 UTC m=+0.674748864,LastTimestamp:2026-03-18 12:09:27.086133067 +0000 UTC m=+0.801958591,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.857931 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3b5b77bfff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b77bfff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.958899199 +0000 UTC m=+0.674724723,LastTimestamp:2026-03-18 12:09:27.086168338 +0000 UTC m=+0.801993862,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.862399 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3b5b77f7e5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b77f7e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.958913509 +0000 UTC m=+0.674739033,LastTimestamp:2026-03-18 12:09:27.086179729 +0000 UTC m=+0.802005253,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.867377 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3b5b781e4c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b781e4c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.95892334 +0000 UTC m=+0.674748864,LastTimestamp:2026-03-18 12:09:27.086188169 +0000 UTC m=+0.802013693,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.872723 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3b5b77bfff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b77bfff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.958899199 +0000 UTC 
m=+0.674724723,LastTimestamp:2026-03-18 12:09:27.08705798 +0000 UTC m=+0.802883504,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.878527 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3b5b77f7e5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b77f7e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.958913509 +0000 UTC m=+0.674739033,LastTimestamp:2026-03-18 12:09:27.087080221 +0000 UTC m=+0.802905745,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.883439 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3b5b77bfff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b77bfff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.958899199 +0000 UTC m=+0.674724723,LastTimestamp:2026-03-18 12:09:27.087101702 +0000 UTC m=+0.802927236,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.887381 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3b5b77f7e5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b77f7e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.958913509 +0000 UTC m=+0.674739033,LastTimestamp:2026-03-18 12:09:27.087118503 +0000 UTC m=+0.802944027,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.890929 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3b5b781e4c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b781e4c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.95892334 +0000 UTC m=+0.674748864,LastTimestamp:2026-03-18 12:09:27.087130514 +0000 UTC m=+0.802956048,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.896042 4843 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189dee3b5b781e4c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b781e4c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.95892334 +0000 UTC m=+0.674748864,LastTimestamp:2026-03-18 12:09:27.087173586 +0000 UTC m=+0.802999120,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.900984 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3b5b77bfff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b77bfff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.958899199 +0000 UTC m=+0.674724723,LastTimestamp:2026-03-18 12:09:27.088353191 +0000 UTC m=+0.804178715,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.905455 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3b5b77f7e5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group 
\"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3b5b77f7e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:26.958913509 +0000 UTC m=+0.674739033,LastTimestamp:2026-03-18 12:09:27.088365542 +0000 UTC m=+0.804191076,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.913212 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3b7e59a243 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:27.544128067 +0000 UTC m=+1.259953631,LastTimestamp:2026-03-18 12:09:27.544128067 +0000 UTC m=+1.259953631,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.917548 4843 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3b7e59f956 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:27.544150358 +0000 UTC m=+1.259975882,LastTimestamp:2026-03-18 12:09:27.544150358 +0000 UTC m=+1.259975882,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: I0318 12:10:03.921235 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.921695 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3b7e5a9d36 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:27.54419231 +0000 UTC m=+1.260017834,LastTimestamp:2026-03-18 12:09:27.54419231 +0000 UTC m=+1.260017834,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.926265 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189dee3b7e64e676 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:27.544866422 +0000 UTC m=+1.260691946,LastTimestamp:2026-03-18 12:09:27.544866422 +0000 UTC m=+1.260691946,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.930082 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3b7e9139a3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:27.547771299 +0000 UTC m=+1.263596823,LastTimestamp:2026-03-18 12:09:27.547771299 +0000 UTC m=+1.263596823,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.934382 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3ba1bce8fb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.137836795 +0000 UTC m=+1.853662319,LastTimestamp:2026-03-18 12:09:28.137836795 +0000 UTC m=+1.853662319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc 
kubenswrapper[4843]: E0318 12:10:03.938141 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189dee3ba1c84200 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.13858048 +0000 UTC m=+1.854406004,LastTimestamp:2026-03-18 12:09:28.13858048 +0000 UTC m=+1.854406004,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.941588 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3ba1cc142c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.138830892 +0000 UTC m=+1.854656416,LastTimestamp:2026-03-18 12:09:28.138830892 +0000 UTC m=+1.854656416,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.971005 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3ba1cc1e18 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.138833432 +0000 UTC m=+1.854658956,LastTimestamp:2026-03-18 12:09:28.138833432 +0000 UTC m=+1.854658956,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.975869 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3ba1cc52e9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.138846953 +0000 UTC m=+1.854672477,LastTimestamp:2026-03-18 12:09:28.138846953 +0000 UTC m=+1.854672477,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.981718 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3ba24ee09a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.147402906 +0000 UTC m=+1.863228430,LastTimestamp:2026-03-18 12:09:28.147402906 +0000 UTC m=+1.863228430,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.987457 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3ba25e6040 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.148418624 +0000 UTC 
m=+1.864244158,LastTimestamp:2026-03-18 12:09:28.148418624 +0000 UTC m=+1.864244158,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.991288 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3ba2789d71 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.150138225 +0000 UTC m=+1.865963749,LastTimestamp:2026-03-18 12:09:28.150138225 +0000 UTC m=+1.865963749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:03 crc kubenswrapper[4843]: E0318 12:10:03.995733 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3ba27a923f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.150266431 +0000 UTC m=+1.866091955,LastTimestamp:2026-03-18 12:09:28.150266431 +0000 UTC m=+1.866091955,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.000963 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189dee3ba28b1e62 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.151350882 +0000 UTC m=+1.867176406,LastTimestamp:2026-03-18 12:09:28.151350882 +0000 UTC m=+1.867176406,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.004993 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3ba2f388a1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.158193825 +0000 UTC m=+1.874019339,LastTimestamp:2026-03-18 12:09:28.158193825 +0000 UTC m=+1.874019339,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.009148 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3bb30fe212 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.428487186 +0000 UTC m=+2.144312710,LastTimestamp:2026-03-18 12:09:28.428487186 +0000 UTC m=+2.144312710,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.012941 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3bb3b57e5d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.439340637 +0000 UTC m=+2.155166161,LastTimestamp:2026-03-18 12:09:28.439340637 +0000 UTC m=+2.155166161,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.016695 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3bb3c81f89 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.440561545 +0000 UTC m=+2.156387069,LastTimestamp:2026-03-18 12:09:28.440561545 +0000 UTC m=+2.156387069,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.020312 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3bbe6105f6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.618354166 +0000 UTC m=+2.334179700,LastTimestamp:2026-03-18 12:09:28.618354166 +0000 UTC m=+2.334179700,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.024205 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3bbf2b7df0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container 
kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.631623152 +0000 UTC m=+2.347448676,LastTimestamp:2026-03-18 12:09:28.631623152 +0000 UTC m=+2.347448676,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.027810 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3bbf46d78e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.633415566 +0000 UTC m=+2.349241080,LastTimestamp:2026-03-18 12:09:28.633415566 +0000 UTC m=+2.349241080,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.031479 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3bc98173ee openshift-kube-controller-manager 
0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.805028846 +0000 UTC m=+2.520854370,LastTimestamp:2026-03-18 12:09:28.805028846 +0000 UTC m=+2.520854370,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.036407 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3bca9d4ca5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.823631013 +0000 UTC m=+2.539456537,LastTimestamp:2026-03-18 12:09:28.823631013 +0000 UTC m=+2.539456537,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.040589 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3bd52ec3af openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.000936367 +0000 UTC m=+2.716761901,LastTimestamp:2026-03-18 12:09:29.000936367 +0000 UTC m=+2.716761901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.045239 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189dee3bd564d2d9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.004479193 +0000 UTC m=+2.720304727,LastTimestamp:2026-03-18 12:09:29.004479193 +0000 UTC 
m=+2.720304727,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.049855 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3bd5a63b0d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.008765709 +0000 UTC m=+2.724591253,LastTimestamp:2026-03-18 12:09:29.008765709 +0000 UTC m=+2.724591253,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.054493 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3bd61806a4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.016223396 +0000 UTC m=+2.732048920,LastTimestamp:2026-03-18 12:09:29.016223396 +0000 UTC m=+2.732048920,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.058713 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3be329e5de openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.235498462 +0000 UTC m=+2.951323976,LastTimestamp:2026-03-18 12:09:29.235498462 +0000 UTC m=+2.951323976,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.063047 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3be431b30f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.252786959 +0000 UTC m=+2.968612503,LastTimestamp:2026-03-18 12:09:29.252786959 +0000 UTC m=+2.968612503,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.066857 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189dee3be725424c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.302303308 +0000 UTC m=+3.018128832,LastTimestamp:2026-03-18 12:09:29.302303308 +0000 UTC m=+3.018128832,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.071006 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3be7265e87 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.302376071 +0000 UTC m=+3.018201595,LastTimestamp:2026-03-18 12:09:29.302376071 +0000 UTC m=+3.018201595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.074691 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3be7cc4207 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.313247751 +0000 UTC m=+3.029073275,LastTimestamp:2026-03-18 12:09:29.313247751 +0000 UTC m=+3.029073275,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.080646 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3bf1228d76 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.469898102 +0000 UTC m=+3.185723626,LastTimestamp:2026-03-18 12:09:29.469898102 +0000 UTC m=+3.185723626,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.086611 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3bf13dd64c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.47168622 +0000 UTC m=+3.187511744,LastTimestamp:2026-03-18 12:09:29.47168622 +0000 UTC m=+3.187511744,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.091293 4843 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3bf2ab89cf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.495652815 +0000 UTC m=+3.211478349,LastTimestamp:2026-03-18 12:09:29.495652815 +0000 UTC m=+3.211478349,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.095684 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189dee3bf395c1bd openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.511002557 +0000 UTC m=+3.226828081,LastTimestamp:2026-03-18 12:09:29.511002557 +0000 UTC m=+3.226828081,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.099106 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3bf3def6e4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.515800292 +0000 UTC m=+3.231625816,LastTimestamp:2026-03-18 12:09:29.515800292 +0000 UTC m=+3.231625816,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.103493 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3bff1af876 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container 
kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.70428223 +0000 UTC m=+3.420107774,LastTimestamp:2026-03-18 12:09:29.70428223 +0000 UTC m=+3.420107774,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.106031 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3bffa04839 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.713018937 +0000 UTC m=+3.428844461,LastTimestamp:2026-03-18 12:09:29.713018937 +0000 UTC m=+3.428844461,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.107386 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3bffecb97e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.71802867 +0000 UTC m=+3.433854194,LastTimestamp:2026-03-18 12:09:29.71802867 +0000 UTC m=+3.433854194,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.109803 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3c00053b18 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.719634712 +0000 UTC m=+3.435460236,LastTimestamp:2026-03-18 12:09:29.719634712 +0000 UTC m=+3.435460236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.111129 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3c0119774c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.73773806 +0000 UTC m=+3.453563584,LastTimestamp:2026-03-18 12:09:29.73773806 +0000 UTC m=+3.453563584,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.113841 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3c0127adfe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.738669566 +0000 UTC m=+3.454495090,LastTimestamp:2026-03-18 12:09:29.738669566 +0000 UTC m=+3.454495090,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.115279 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3c0bb04e43 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.915395651 +0000 UTC m=+3.631221175,LastTimestamp:2026-03-18 12:09:29.915395651 +0000 UTC m=+3.631221175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.118726 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3c0cbe3a3b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.933085243 
+0000 UTC m=+3.648910767,LastTimestamp:2026-03-18 12:09:29.933085243 +0000 UTC m=+3.648910767,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.121920 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3c0cd62a16 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.934653974 +0000 UTC m=+3.650479488,LastTimestamp:2026-03-18 12:09:29.934653974 +0000 UTC m=+3.650479488,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.125873 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3c0d54bebc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started 
container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.942949564 +0000 UTC m=+3.658775088,LastTimestamp:2026-03-18 12:09:29.942949564 +0000 UTC m=+3.658775088,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.129438 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3c0d63cda6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:29.943936422 +0000 UTC m=+3.659761946,LastTimestamp:2026-03-18 12:09:29.943936422 +0000 UTC m=+3.659761946,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.133515 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3c120898b8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:30.021845176 +0000 UTC m=+3.737670700,LastTimestamp:2026-03-18 12:09:30.021845176 +0000 UTC m=+3.737670700,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.137361 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3c17514d4b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:30.110496075 +0000 UTC m=+3.826321599,LastTimestamp:2026-03-18 12:09:30.110496075 +0000 UTC m=+3.826321599,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.141977 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3c182ec397 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:30.125009815 +0000 UTC m=+3.840835329,LastTimestamp:2026-03-18 12:09:30.125009815 +0000 UTC m=+3.840835329,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.146269 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3c184ae032 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:30.126852146 +0000 UTC m=+3.842677680,LastTimestamp:2026-03-18 12:09:30.126852146 +0000 UTC m=+3.842677680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 
12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.151106 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3c20fe2b07 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:30.272819975 +0000 UTC m=+3.988645499,LastTimestamp:2026-03-18 12:09:30.272819975 +0000 UTC m=+3.988645499,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: I0318 12:10:04.155965 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.156923 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3c21fd5ba4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:30.2895441 +0000 UTC 
m=+4.005369624,LastTimestamp:2026-03-18 12:09:30.2895441 +0000 UTC m=+4.005369624,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.162459 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3c24407025 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:30.327494693 +0000 UTC m=+4.043320217,LastTimestamp:2026-03-18 12:09:30.327494693 +0000 UTC m=+4.043320217,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.167607 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3c24e688e3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:30.338380003 +0000 UTC m=+4.054205527,LastTimestamp:2026-03-18 12:09:30.338380003 +0000 UTC m=+4.054205527,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.172132 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3c4e538d6d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:31.033390445 +0000 UTC m=+4.749215969,LastTimestamp:2026-03-18 12:09:31.033390445 +0000 UTC m=+4.749215969,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.177315 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3c5973f2c1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:31.220062913 +0000 UTC m=+4.935888437,LastTimestamp:2026-03-18 12:09:31.220062913 +0000 UTC m=+4.935888437,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.182412 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3c5a2c4a57 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:31.232143959 +0000 UTC m=+4.947969483,LastTimestamp:2026-03-18 12:09:31.232143959 +0000 UTC m=+4.947969483,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.186962 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3c5a395cea openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:31.233000682 +0000 UTC m=+4.948826206,LastTimestamp:2026-03-18 12:09:31.233000682 +0000 UTC m=+4.948826206,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.193473 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3c65f04b6f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:31.429538671 +0000 UTC m=+5.145364195,LastTimestamp:2026-03-18 12:09:31.429538671 +0000 UTC m=+5.145364195,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.198196 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3c66f411b0 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:31.446563248 +0000 UTC m=+5.162388772,LastTimestamp:2026-03-18 12:09:31.446563248 +0000 UTC m=+5.162388772,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.204182 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3c670b4d6a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:31.448085866 +0000 UTC m=+5.163911390,LastTimestamp:2026-03-18 12:09:31.448085866 +0000 UTC m=+5.163911390,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.208732 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189dee3c78b47704 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:31.744384772 +0000 UTC m=+5.460210296,LastTimestamp:2026-03-18 12:09:31.744384772 +0000 UTC m=+5.460210296,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.213429 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3c79821d2f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:31.757862191 +0000 UTC m=+5.473687715,LastTimestamp:2026-03-18 12:09:31.757862191 +0000 UTC m=+5.473687715,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.219721 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3c7995fb8e openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:31.759164302 +0000 UTC m=+5.474989836,LastTimestamp:2026-03-18 12:09:31.759164302 +0000 UTC m=+5.474989836,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.226214 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3c87cbaa8f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:31.997563535 +0000 UTC m=+5.713389059,LastTimestamp:2026-03-18 12:09:31.997563535 +0000 UTC m=+5.713389059,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.230611 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189dee3c88b3c506 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:32.012774662 +0000 UTC m=+5.728600186,LastTimestamp:2026-03-18 12:09:32.012774662 +0000 UTC m=+5.728600186,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.235110 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3c88c7368f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:32.014048911 +0000 UTC m=+5.729874435,LastTimestamp:2026-03-18 12:09:32.014048911 +0000 UTC m=+5.729874435,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.239930 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3c982da04e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:32.272418894 +0000 UTC m=+5.988244428,LastTimestamp:2026-03-18 12:09:32.272418894 +0000 UTC m=+5.988244428,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.243950 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3c98f744e4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:32.285633764 +0000 UTC m=+6.001459288,LastTimestamp:2026-03-18 12:09:32.285633764 +0000 UTC m=+6.001459288,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.248072 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 12:10:04 crc 
kubenswrapper[4843]: &Event{ObjectMeta:{kube-controller-manager-crc.189dee3d61b58c72 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 18 12:10:04 crc kubenswrapper[4843]: body: Mar 18 12:10:04 crc kubenswrapper[4843]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:35.653547122 +0000 UTC m=+9.369372646,LastTimestamp:2026-03-18 12:09:35.653547122 +0000 UTC m=+9.369372646,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 12:10:04 crc kubenswrapper[4843]: > Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.251990 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3d61b67256 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:35.653605974 +0000 UTC 
m=+9.369431498,LastTimestamp:2026-03-18 12:09:35.653605974 +0000 UTC m=+9.369431498,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.259085 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 12:10:04 crc kubenswrapper[4843]: &Event{ObjectMeta:{kube-apiserver-crc.189dee3ed97bc829 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:55910->192.168.126.11:17697: read: connection reset by peer Mar 18 12:10:04 crc kubenswrapper[4843]: body: Mar 18 12:10:04 crc kubenswrapper[4843]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.957994537 +0000 UTC m=+15.673820091,LastTimestamp:2026-03-18 12:09:41.957994537 +0000 UTC m=+15.673820091,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 12:10:04 crc kubenswrapper[4843]: > Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.264122 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3ed97cb020 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:55910->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.95805392 +0000 UTC m=+15.673879474,LastTimestamp:2026-03-18 12:09:41.95805392 +0000 UTC m=+15.673879474,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.269826 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189dee3c184ae032\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3c184ae032 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:30.126852146 +0000 UTC m=+3.842677680,LastTimestamp:2026-03-18 12:09:42.079845516 +0000 UTC m=+15.795671070,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: 
E0318 12:10:04.273992 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189dee3c24407025\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3c24407025 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:30.327494693 +0000 UTC m=+4.043320217,LastTimestamp:2026-03-18 12:09:42.420133349 +0000 UTC m=+16.135958873,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.278448 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189dee3c24e688e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3c24e688e3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:30.338380003 +0000 UTC m=+4.054205527,LastTimestamp:2026-03-18 12:09:42.431188455 +0000 UTC 
m=+16.147013979,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.283705 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 12:10:04 crc kubenswrapper[4843]: &Event{ObjectMeta:{kube-apiserver-crc.189dee3f4d0db156 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 12:10:04 crc kubenswrapper[4843]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 12:10:04 crc kubenswrapper[4843]: Mar 18 12:10:04 crc kubenswrapper[4843]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.89693679 +0000 UTC m=+17.612762314,LastTimestamp:2026-03-18 12:09:43.89693679 +0000 UTC m=+17.612762314,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 12:10:04 crc kubenswrapper[4843]: > Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.288175 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3f4d0e4b0d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.896976141 +0000 UTC m=+17.612801665,LastTimestamp:2026-03-18 12:09:43.896976141 +0000 UTC m=+17.612801665,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.293687 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 12:10:04 crc kubenswrapper[4843]: &Event{ObjectMeta:{kube-controller-manager-crc.189dee3fb5ca7c3a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 12:10:04 crc kubenswrapper[4843]: body: Mar 18 12:10:04 crc kubenswrapper[4843]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:45.654139962 +0000 UTC m=+19.369965576,LastTimestamp:2026-03-18 12:09:45.654139962 +0000 UTC m=+19.369965576,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} 
Mar 18 12:10:04 crc kubenswrapper[4843]: > Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.297953 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3fb5cc2d5d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:45.654250845 +0000 UTC m=+19.370076399,LastTimestamp:2026-03-18 12:09:45.654250845 +0000 UTC m=+19.370076399,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.306495 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189dee3fb5ca7c3a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 12:10:04 crc kubenswrapper[4843]: &Event{ObjectMeta:{kube-controller-manager-crc.189dee3fb5ca7c3a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 12:10:04 crc kubenswrapper[4843]: body: Mar 18 12:10:04 crc kubenswrapper[4843]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:45.654139962 +0000 UTC m=+19.369965576,LastTimestamp:2026-03-18 12:09:55.653233088 +0000 UTC m=+29.369058642,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 12:10:04 crc kubenswrapper[4843]: > Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.310255 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189dee3fb5cc2d5d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3fb5cc2d5d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:45.654250845 +0000 UTC m=+19.370076399,LastTimestamp:2026-03-18 12:09:55.65331403 
+0000 UTC m=+29.369139584,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.312744 4843 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee420a40de21 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:55.661119009 +0000 UTC m=+29.376944593,LastTimestamp:2026-03-18 12:09:55.661119009 +0000 UTC m=+29.376944593,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.317692 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189dee3ba27a923f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3ba27a923f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.150266431 +0000 UTC m=+1.866091955,LastTimestamp:2026-03-18 12:09:55.780407575 +0000 UTC m=+29.496233099,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.322072 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189dee3bb30fe212\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3bb30fe212 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.428487186 +0000 UTC m=+2.144312710,LastTimestamp:2026-03-18 12:09:55.925873322 +0000 UTC m=+29.641698846,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: E0318 12:10:04.326993 4843 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189dee3bb3b57e5d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3bb3b57e5d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:28.439340637 +0000 UTC m=+2.155166161,LastTimestamp:2026-03-18 12:09:55.933343472 +0000 UTC m=+29.649168996,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:04 crc kubenswrapper[4843]: I0318 12:10:04.927618 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:05 crc kubenswrapper[4843]: I0318 12:10:05.654378 4843 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 12:10:05 crc kubenswrapper[4843]: I0318 12:10:05.654579 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 12:10:05 crc kubenswrapper[4843]: E0318 12:10:05.662734 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189dee3fb5ca7c3a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 12:10:05 crc kubenswrapper[4843]: &Event{ObjectMeta:{kube-controller-manager-crc.189dee3fb5ca7c3a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 12:10:05 crc kubenswrapper[4843]: body: Mar 18 12:10:05 crc kubenswrapper[4843]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:45.654139962 +0000 UTC m=+19.369965576,LastTimestamp:2026-03-18 12:10:05.65449507 +0000 UTC m=+39.370320674,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 12:10:05 crc kubenswrapper[4843]: > Mar 18 12:10:05 crc kubenswrapper[4843]: E0318 12:10:05.669321 4843 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189dee3fb5cc2d5d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3fb5cc2d5d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:45.654250845 +0000 UTC m=+19.370076399,LastTimestamp:2026-03-18 12:10:05.654624043 +0000 UTC m=+39.370449607,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:05 crc kubenswrapper[4843]: W0318 12:10:05.737341 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 18 12:10:05 crc kubenswrapper[4843]: E0318 12:10:05.737415 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 12:10:05 crc kubenswrapper[4843]: I0318 12:10:05.925368 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:06 crc kubenswrapper[4843]: W0318 12:10:06.542244 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed 
to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 18 12:10:06 crc kubenswrapper[4843]: E0318 12:10:06.542299 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 12:10:06 crc kubenswrapper[4843]: I0318 12:10:06.924341 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:07 crc kubenswrapper[4843]: E0318 12:10:07.037939 4843 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:10:07 crc kubenswrapper[4843]: I0318 12:10:07.925507 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:08 crc kubenswrapper[4843]: I0318 12:10:08.062402 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:08 crc kubenswrapper[4843]: I0318 12:10:08.062591 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:08 crc kubenswrapper[4843]: I0318 12:10:08.063768 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:08 crc kubenswrapper[4843]: I0318 12:10:08.063813 4843 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:08 crc kubenswrapper[4843]: I0318 12:10:08.063822 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:08 crc kubenswrapper[4843]: I0318 12:10:08.069145 4843 scope.go:117] "RemoveContainer" containerID="fc4a50c97ac7bb07b9ef9913807c2e886819464af1674eb59ec2d21c0127160d" Mar 18 12:10:08 crc kubenswrapper[4843]: E0318 12:10:08.069434 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:08 crc kubenswrapper[4843]: W0318 12:10:08.473665 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:08 crc kubenswrapper[4843]: E0318 12:10:08.473706 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 12:10:08 crc kubenswrapper[4843]: I0318 12:10:08.651750 4843 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 12:10:08 crc kubenswrapper[4843]: I0318 12:10:08.672831 4843 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 12:10:08 crc 
kubenswrapper[4843]: I0318 12:10:08.925395 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:09 crc kubenswrapper[4843]: I0318 12:10:09.928044 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:10 crc kubenswrapper[4843]: E0318 12:10:10.549140 4843 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 12:10:10 crc kubenswrapper[4843]: I0318 12:10:10.779739 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:10 crc kubenswrapper[4843]: I0318 12:10:10.782178 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:10 crc kubenswrapper[4843]: I0318 12:10:10.782262 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:10 crc kubenswrapper[4843]: I0318 12:10:10.782289 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:10 crc kubenswrapper[4843]: I0318 12:10:10.782332 4843 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:10 crc kubenswrapper[4843]: E0318 12:10:10.787964 4843 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" 
node="crc" Mar 18 12:10:10 crc kubenswrapper[4843]: I0318 12:10:10.926032 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:11 crc kubenswrapper[4843]: I0318 12:10:11.310911 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:11 crc kubenswrapper[4843]: I0318 12:10:11.311231 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:11 crc kubenswrapper[4843]: I0318 12:10:11.313401 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:11 crc kubenswrapper[4843]: I0318 12:10:11.313699 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:11 crc kubenswrapper[4843]: I0318 12:10:11.314084 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:11 crc kubenswrapper[4843]: I0318 12:10:11.315141 4843 scope.go:117] "RemoveContainer" containerID="fc4a50c97ac7bb07b9ef9913807c2e886819464af1674eb59ec2d21c0127160d" Mar 18 12:10:11 crc kubenswrapper[4843]: E0318 12:10:11.315645 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:11 crc kubenswrapper[4843]: I0318 12:10:11.924455 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io 
"crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:12 crc kubenswrapper[4843]: I0318 12:10:12.924859 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:13 crc kubenswrapper[4843]: I0318 12:10:13.927703 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:14 crc kubenswrapper[4843]: I0318 12:10:14.930794 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:15 crc kubenswrapper[4843]: I0318 12:10:15.653964 4843 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 12:10:15 crc kubenswrapper[4843]: I0318 12:10:15.654040 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 12:10:15 crc kubenswrapper[4843]: E0318 12:10:15.658298 4843 event.go:359] "Server 
rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189dee3fb5ca7c3a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 12:10:15 crc kubenswrapper[4843]: &Event{ObjectMeta:{kube-controller-manager-crc.189dee3fb5ca7c3a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 12:10:15 crc kubenswrapper[4843]: body: Mar 18 12:10:15 crc kubenswrapper[4843]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:45.654139962 +0000 UTC m=+19.369965576,LastTimestamp:2026-03-18 12:10:15.654022075 +0000 UTC m=+49.369847609,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 12:10:15 crc kubenswrapper[4843]: > Mar 18 12:10:15 crc kubenswrapper[4843]: I0318 12:10:15.927962 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:16 crc kubenswrapper[4843]: I0318 12:10:16.928115 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:17 crc kubenswrapper[4843]: E0318 
12:10:17.038102 4843 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:10:17 crc kubenswrapper[4843]: I0318 12:10:17.439356 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:10:17 crc kubenswrapper[4843]: I0318 12:10:17.439578 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:17 crc kubenswrapper[4843]: I0318 12:10:17.441224 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:17 crc kubenswrapper[4843]: I0318 12:10:17.441285 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:17 crc kubenswrapper[4843]: I0318 12:10:17.441301 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:17 crc kubenswrapper[4843]: E0318 12:10:17.556969 4843 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 12:10:17 crc kubenswrapper[4843]: W0318 12:10:17.731927 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 18 12:10:17 crc kubenswrapper[4843]: E0318 12:10:17.731988 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 12:10:17 crc 
kubenswrapper[4843]: I0318 12:10:17.788741 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:17 crc kubenswrapper[4843]: I0318 12:10:17.790696 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:17 crc kubenswrapper[4843]: I0318 12:10:17.790785 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:17 crc kubenswrapper[4843]: I0318 12:10:17.790810 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:17 crc kubenswrapper[4843]: I0318 12:10:17.790862 4843 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:17 crc kubenswrapper[4843]: E0318 12:10:17.798454 4843 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 12:10:17 crc kubenswrapper[4843]: I0318 12:10:17.927810 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:18 crc kubenswrapper[4843]: I0318 12:10:18.926937 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:19 crc kubenswrapper[4843]: I0318 12:10:19.928244 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:20 crc 
kubenswrapper[4843]: I0318 12:10:20.926279 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:21 crc kubenswrapper[4843]: I0318 12:10:21.927558 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:22 crc kubenswrapper[4843]: I0318 12:10:22.925692 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:23 crc kubenswrapper[4843]: I0318 12:10:23.926088 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:24 crc kubenswrapper[4843]: E0318 12:10:24.561943 4843 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 12:10:24 crc kubenswrapper[4843]: I0318 12:10:24.798728 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:24 crc kubenswrapper[4843]: I0318 12:10:24.800028 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:24 crc kubenswrapper[4843]: I0318 12:10:24.800086 4843 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:24 crc kubenswrapper[4843]: I0318 12:10:24.800101 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:24 crc kubenswrapper[4843]: I0318 12:10:24.800125 4843 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:24 crc kubenswrapper[4843]: E0318 12:10:24.805484 4843 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 12:10:24 crc kubenswrapper[4843]: I0318 12:10:24.926933 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:25 crc kubenswrapper[4843]: I0318 12:10:25.653625 4843 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 12:10:25 crc kubenswrapper[4843]: I0318 12:10:25.653993 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 12:10:25 crc kubenswrapper[4843]: I0318 12:10:25.654169 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
Mar 18 12:10:25 crc kubenswrapper[4843]: I0318 12:10:25.654472 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:25 crc kubenswrapper[4843]: I0318 12:10:25.655740 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:25 crc kubenswrapper[4843]: I0318 12:10:25.655791 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:25 crc kubenswrapper[4843]: I0318 12:10:25.655809 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:25 crc kubenswrapper[4843]: I0318 12:10:25.656488 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"4594141b60aa4fc27e9141ee0e03ce44500fc87853300a481b7538bfe47775e9"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 18 12:10:25 crc kubenswrapper[4843]: I0318 12:10:25.656632 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://4594141b60aa4fc27e9141ee0e03ce44500fc87853300a481b7538bfe47775e9" gracePeriod=30 Mar 18 12:10:25 crc kubenswrapper[4843]: I0318 12:10:25.924976 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:25 crc kubenswrapper[4843]: I0318 12:10:25.983875 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:25 crc 
kubenswrapper[4843]: I0318 12:10:25.985917 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:25 crc kubenswrapper[4843]: I0318 12:10:25.985949 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:25 crc kubenswrapper[4843]: I0318 12:10:25.985966 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:25 crc kubenswrapper[4843]: I0318 12:10:25.986564 4843 scope.go:117] "RemoveContainer" containerID="fc4a50c97ac7bb07b9ef9913807c2e886819464af1674eb59ec2d21c0127160d" Mar 18 12:10:26 crc kubenswrapper[4843]: I0318 12:10:26.399643 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 18 12:10:26 crc kubenswrapper[4843]: I0318 12:10:26.400872 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 12:10:26 crc kubenswrapper[4843]: I0318 12:10:26.401255 4843 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="4594141b60aa4fc27e9141ee0e03ce44500fc87853300a481b7538bfe47775e9" exitCode=255 Mar 18 12:10:26 crc kubenswrapper[4843]: I0318 12:10:26.401295 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"4594141b60aa4fc27e9141ee0e03ce44500fc87853300a481b7538bfe47775e9"} Mar 18 12:10:26 crc kubenswrapper[4843]: I0318 12:10:26.401345 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1c302996fedfc87cb3490362b296791c75b12a3745aaca6f412242139cc1b650"} Mar 18 12:10:26 crc kubenswrapper[4843]: I0318 12:10:26.401397 4843 scope.go:117] "RemoveContainer" containerID="3ed3869ae559a4cee81da335653397b082fe3822bbdbe675a2d5890fed3a97fc" Mar 18 12:10:26 crc kubenswrapper[4843]: I0318 12:10:26.401443 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:26 crc kubenswrapper[4843]: I0318 12:10:26.404471 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:26 crc kubenswrapper[4843]: I0318 12:10:26.404550 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:26 crc kubenswrapper[4843]: I0318 12:10:26.404563 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:26 crc kubenswrapper[4843]: I0318 12:10:26.404743 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 12:10:26 crc kubenswrapper[4843]: I0318 12:10:26.407354 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8"} Mar 18 12:10:26 crc kubenswrapper[4843]: I0318 12:10:26.407786 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:26 crc kubenswrapper[4843]: I0318 12:10:26.447144 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:26 crc kubenswrapper[4843]: I0318 12:10:26.447197 4843 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:26 crc kubenswrapper[4843]: I0318 12:10:26.447210 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:26 crc kubenswrapper[4843]: I0318 12:10:26.927348 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:27 crc kubenswrapper[4843]: E0318 12:10:27.038749 4843 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:10:27 crc kubenswrapper[4843]: I0318 12:10:27.410744 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 18 12:10:27 crc kubenswrapper[4843]: I0318 12:10:27.411750 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:27 crc kubenswrapper[4843]: I0318 12:10:27.421685 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:27 crc kubenswrapper[4843]: I0318 12:10:27.421818 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:27 crc kubenswrapper[4843]: I0318 12:10:27.421847 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:27 crc kubenswrapper[4843]: I0318 12:10:27.423702 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 12:10:27 crc kubenswrapper[4843]: I0318 12:10:27.424398 4843 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 12:10:27 crc kubenswrapper[4843]: I0318 12:10:27.427365 4843 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8" exitCode=255 Mar 18 12:10:27 crc kubenswrapper[4843]: I0318 12:10:27.427436 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8"} Mar 18 12:10:27 crc kubenswrapper[4843]: I0318 12:10:27.427524 4843 scope.go:117] "RemoveContainer" containerID="fc4a50c97ac7bb07b9ef9913807c2e886819464af1674eb59ec2d21c0127160d" Mar 18 12:10:27 crc kubenswrapper[4843]: I0318 12:10:27.427798 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:27 crc kubenswrapper[4843]: I0318 12:10:27.455698 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:27 crc kubenswrapper[4843]: I0318 12:10:27.455777 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:27 crc kubenswrapper[4843]: I0318 12:10:27.455789 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:27 crc kubenswrapper[4843]: I0318 12:10:27.456610 4843 scope.go:117] "RemoveContainer" containerID="0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8" Mar 18 12:10:27 crc kubenswrapper[4843]: E0318 12:10:27.456855 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: 
\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:27 crc kubenswrapper[4843]: I0318 12:10:27.928067 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:28 crc kubenswrapper[4843]: I0318 12:10:28.062278 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:28 crc kubenswrapper[4843]: I0318 12:10:28.506812 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 12:10:28 crc kubenswrapper[4843]: I0318 12:10:28.508751 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:28 crc kubenswrapper[4843]: I0318 12:10:28.510326 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:28 crc kubenswrapper[4843]: I0318 12:10:28.510389 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:28 crc kubenswrapper[4843]: I0318 12:10:28.510402 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:28 crc kubenswrapper[4843]: I0318 12:10:28.511553 4843 scope.go:117] "RemoveContainer" containerID="0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8" Mar 18 12:10:28 crc kubenswrapper[4843]: E0318 12:10:28.511854 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:28 crc kubenswrapper[4843]: I0318 12:10:28.987646 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:30 crc kubenswrapper[4843]: I0318 12:10:30.053906 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:31 crc kubenswrapper[4843]: I0318 12:10:31.030544 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:31 crc kubenswrapper[4843]: I0318 12:10:31.311315 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:31 crc kubenswrapper[4843]: I0318 12:10:31.311504 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:31 crc kubenswrapper[4843]: I0318 12:10:31.312528 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:31 crc kubenswrapper[4843]: I0318 12:10:31.312573 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:31 crc kubenswrapper[4843]: I0318 12:10:31.312589 4843 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:31 crc kubenswrapper[4843]: I0318 12:10:31.313238 4843 scope.go:117] "RemoveContainer" containerID="0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8" Mar 18 12:10:31 crc kubenswrapper[4843]: E0318 12:10:31.313482 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:31 crc kubenswrapper[4843]: E0318 12:10:31.572111 4843 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 12:10:31 crc kubenswrapper[4843]: I0318 12:10:31.806546 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:31 crc kubenswrapper[4843]: I0318 12:10:31.807701 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:31 crc kubenswrapper[4843]: I0318 12:10:31.807737 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:31 crc kubenswrapper[4843]: I0318 12:10:31.807748 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:31 crc kubenswrapper[4843]: I0318 12:10:31.807770 4843 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:31 crc kubenswrapper[4843]: E0318 12:10:31.812326 4843 kubelet_node_status.go:99] "Unable 
to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 12:10:31 crc kubenswrapper[4843]: I0318 12:10:31.923966 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:32 crc kubenswrapper[4843]: I0318 12:10:32.019714 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:32 crc kubenswrapper[4843]: I0318 12:10:32.019920 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:32 crc kubenswrapper[4843]: I0318 12:10:32.021106 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:32 crc kubenswrapper[4843]: I0318 12:10:32.021146 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:32 crc kubenswrapper[4843]: I0318 12:10:32.021159 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:32 crc kubenswrapper[4843]: I0318 12:10:32.653006 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:32 crc kubenswrapper[4843]: I0318 12:10:32.653166 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:32 crc kubenswrapper[4843]: I0318 12:10:32.654141 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:32 crc kubenswrapper[4843]: I0318 12:10:32.654163 4843 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:32 crc kubenswrapper[4843]: I0318 12:10:32.654172 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:32 crc kubenswrapper[4843]: I0318 12:10:32.659125 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:32 crc kubenswrapper[4843]: I0318 12:10:32.923628 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:33 crc kubenswrapper[4843]: I0318 12:10:33.521201 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:33 crc kubenswrapper[4843]: I0318 12:10:33.522391 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:33 crc kubenswrapper[4843]: I0318 12:10:33.522424 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:33 crc kubenswrapper[4843]: I0318 12:10:33.522438 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:33 crc kubenswrapper[4843]: I0318 12:10:33.924746 4843 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:34 crc kubenswrapper[4843]: W0318 12:10:34.102225 4843 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the 
cluster scope Mar 18 12:10:34 crc kubenswrapper[4843]: E0318 12:10:34.102292 4843 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 12:10:34 crc kubenswrapper[4843]: I0318 12:10:34.364590 4843 csr.go:261] certificate signing request csr-2f22w is approved, waiting to be issued Mar 18 12:10:34 crc kubenswrapper[4843]: I0318 12:10:34.397681 4843 csr.go:257] certificate signing request csr-2f22w is issued Mar 18 12:10:34 crc kubenswrapper[4843]: I0318 12:10:34.509126 4843 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 18 12:10:34 crc kubenswrapper[4843]: I0318 12:10:34.749111 4843 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 18 12:10:35 crc kubenswrapper[4843]: I0318 12:10:35.399139 4843 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-14 07:57:54.09753413 +0000 UTC Mar 18 12:10:35 crc kubenswrapper[4843]: I0318 12:10:35.399197 4843 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 5779h47m18.69834021s for next certificate rotation Mar 18 12:10:37 crc kubenswrapper[4843]: E0318 12:10:37.039207 4843 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.194484 4843 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.812975 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 
12:10:38.814208 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.814260 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.814272 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.814360 4843 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.826633 4843 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.826844 4843 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 18 12:10:38 crc kubenswrapper[4843]: E0318 12:10:38.826879 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.831265 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.831354 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.831365 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.831384 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.831395 4843 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:38Z","lastTransitionTime":"2026-03-18T12:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:10:38 crc kubenswrapper[4843]: E0318 12:10:38.845343 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.852343 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.852412 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.852424 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.876912 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.876968 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:38Z","lastTransitionTime":"2026-03-18T12:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:38 crc kubenswrapper[4843]: E0318 12:10:38.906149 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.916033 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.916079 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.916089 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.916103 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.916114 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:38Z","lastTransitionTime":"2026-03-18T12:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:38 crc kubenswrapper[4843]: E0318 12:10:38.941496 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.950690 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.950747 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.950757 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.950771 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:38 crc kubenswrapper[4843]: I0318 12:10:38.950780 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:38Z","lastTransitionTime":"2026-03-18T12:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:38 crc kubenswrapper[4843]: E0318 12:10:38.965291 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:38 crc kubenswrapper[4843]: E0318 12:10:38.965517 4843 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:10:38 crc kubenswrapper[4843]: E0318 12:10:38.965546 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:39 crc kubenswrapper[4843]: E0318 12:10:39.066514 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:39 crc kubenswrapper[4843]: I0318 12:10:39.135231 4843 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 12:10:39 crc kubenswrapper[4843]: E0318 12:10:39.167334 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:39 crc kubenswrapper[4843]: E0318 12:10:39.268342 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:39 crc kubenswrapper[4843]: E0318 12:10:39.369425 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:39 crc kubenswrapper[4843]: E0318 12:10:39.470211 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:39 crc kubenswrapper[4843]: E0318 12:10:39.570987 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:39 crc kubenswrapper[4843]: E0318 12:10:39.671885 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:39 crc kubenswrapper[4843]: E0318 12:10:39.772272 
4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:39 crc kubenswrapper[4843]: E0318 12:10:39.873469 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:39 crc kubenswrapper[4843]: E0318 12:10:39.974259 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:40 crc kubenswrapper[4843]: E0318 12:10:40.074814 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:40 crc kubenswrapper[4843]: E0318 12:10:40.176069 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:40 crc kubenswrapper[4843]: E0318 12:10:40.276484 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:40 crc kubenswrapper[4843]: E0318 12:10:40.377478 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:40 crc kubenswrapper[4843]: E0318 12:10:40.478602 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:40 crc kubenswrapper[4843]: E0318 12:10:40.578941 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:40 crc kubenswrapper[4843]: E0318 12:10:40.679795 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:40 crc kubenswrapper[4843]: E0318 12:10:40.780327 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:40 crc kubenswrapper[4843]: E0318 12:10:40.880452 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:40 crc 
kubenswrapper[4843]: E0318 12:10:40.981244 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:40 crc kubenswrapper[4843]: I0318 12:10:40.983571 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:40 crc kubenswrapper[4843]: I0318 12:10:40.984739 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:40 crc kubenswrapper[4843]: I0318 12:10:40.984785 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:40 crc kubenswrapper[4843]: I0318 12:10:40.984798 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:41 crc kubenswrapper[4843]: E0318 12:10:41.133847 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:41 crc kubenswrapper[4843]: E0318 12:10:41.234834 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:41 crc kubenswrapper[4843]: E0318 12:10:41.334911 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:41 crc kubenswrapper[4843]: E0318 12:10:41.435576 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:41 crc kubenswrapper[4843]: E0318 12:10:41.536064 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:41 crc kubenswrapper[4843]: E0318 12:10:41.637116 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:41 crc kubenswrapper[4843]: E0318 12:10:41.738379 4843 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 18 12:10:41 crc kubenswrapper[4843]: E0318 12:10:41.839175 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:41 crc kubenswrapper[4843]: E0318 12:10:41.940459 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:42 crc kubenswrapper[4843]: I0318 12:10:42.023740 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:42 crc kubenswrapper[4843]: I0318 12:10:42.023946 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:42 crc kubenswrapper[4843]: I0318 12:10:42.025390 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:42 crc kubenswrapper[4843]: I0318 12:10:42.025566 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:42 crc kubenswrapper[4843]: I0318 12:10:42.025717 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:42 crc kubenswrapper[4843]: E0318 12:10:42.041496 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:42 crc kubenswrapper[4843]: E0318 12:10:42.142271 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:42 crc kubenswrapper[4843]: E0318 12:10:42.243234 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:42 crc kubenswrapper[4843]: E0318 12:10:42.343623 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:42 crc kubenswrapper[4843]: E0318 12:10:42.444675 
4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:42 crc kubenswrapper[4843]: E0318 12:10:42.545495 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:42 crc kubenswrapper[4843]: E0318 12:10:42.646502 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:42 crc kubenswrapper[4843]: E0318 12:10:42.771697 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:42 crc kubenswrapper[4843]: E0318 12:10:42.872096 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:42 crc kubenswrapper[4843]: E0318 12:10:42.973053 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:42 crc kubenswrapper[4843]: I0318 12:10:42.983575 4843 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:42 crc kubenswrapper[4843]: I0318 12:10:42.985418 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:42 crc kubenswrapper[4843]: I0318 12:10:42.985472 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:42 crc kubenswrapper[4843]: I0318 12:10:42.985483 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:42 crc kubenswrapper[4843]: I0318 12:10:42.986984 4843 scope.go:117] "RemoveContainer" containerID="0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8" Mar 18 12:10:42 crc kubenswrapper[4843]: E0318 12:10:42.987387 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:43 crc kubenswrapper[4843]: E0318 12:10:43.075773 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:43 crc kubenswrapper[4843]: E0318 12:10:43.176576 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:43 crc kubenswrapper[4843]: E0318 12:10:43.276873 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:43 crc kubenswrapper[4843]: E0318 12:10:43.377238 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:43 crc kubenswrapper[4843]: E0318 12:10:43.478184 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:43 crc kubenswrapper[4843]: E0318 12:10:43.579025 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:43 crc kubenswrapper[4843]: E0318 12:10:43.679712 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:43 crc kubenswrapper[4843]: E0318 12:10:43.780303 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:43 crc kubenswrapper[4843]: E0318 12:10:43.880486 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:43 crc kubenswrapper[4843]: E0318 12:10:43.980945 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 18 12:10:44 crc kubenswrapper[4843]: E0318 12:10:44.081995 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:44 crc kubenswrapper[4843]: E0318 12:10:44.182843 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:44 crc kubenswrapper[4843]: E0318 12:10:44.284022 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:44 crc kubenswrapper[4843]: E0318 12:10:44.384135 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:44 crc kubenswrapper[4843]: E0318 12:10:44.485267 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:44 crc kubenswrapper[4843]: E0318 12:10:44.586315 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:44 crc kubenswrapper[4843]: E0318 12:10:44.687436 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:44 crc kubenswrapper[4843]: E0318 12:10:44.788468 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:44 crc kubenswrapper[4843]: E0318 12:10:44.889552 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:44 crc kubenswrapper[4843]: E0318 12:10:44.990731 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:45 crc kubenswrapper[4843]: E0318 12:10:45.091473 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:45 crc kubenswrapper[4843]: E0318 12:10:45.191607 4843 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 18 12:10:45 crc kubenswrapper[4843]: E0318 12:10:45.292766 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:45 crc kubenswrapper[4843]: E0318 12:10:45.393243 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:45 crc kubenswrapper[4843]: E0318 12:10:45.494236 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:45 crc kubenswrapper[4843]: E0318 12:10:45.595274 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:45 crc kubenswrapper[4843]: E0318 12:10:45.695403 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:45 crc kubenswrapper[4843]: E0318 12:10:45.795535 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:45 crc kubenswrapper[4843]: E0318 12:10:45.896303 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:45 crc kubenswrapper[4843]: E0318 12:10:45.997095 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:46 crc kubenswrapper[4843]: E0318 12:10:46.097402 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:46 crc kubenswrapper[4843]: E0318 12:10:46.198594 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:46 crc kubenswrapper[4843]: E0318 12:10:46.298844 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:46 crc kubenswrapper[4843]: E0318 12:10:46.399381 4843 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:46 crc kubenswrapper[4843]: E0318 12:10:46.499614 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:46 crc kubenswrapper[4843]: E0318 12:10:46.600276 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:46 crc kubenswrapper[4843]: E0318 12:10:46.701183 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:46 crc kubenswrapper[4843]: E0318 12:10:46.801889 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:46 crc kubenswrapper[4843]: E0318 12:10:46.902068 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:47 crc kubenswrapper[4843]: E0318 12:10:47.002578 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:47 crc kubenswrapper[4843]: E0318 12:10:47.040134 4843 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:10:47 crc kubenswrapper[4843]: E0318 12:10:47.103541 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:47 crc kubenswrapper[4843]: E0318 12:10:47.203905 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:47 crc kubenswrapper[4843]: E0318 12:10:47.305129 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:47 crc kubenswrapper[4843]: E0318 12:10:47.406031 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 18 12:10:47 crc kubenswrapper[4843]: E0318 12:10:47.506180 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:47 crc kubenswrapper[4843]: E0318 12:10:47.606821 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:47 crc kubenswrapper[4843]: E0318 12:10:47.707866 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:47 crc kubenswrapper[4843]: E0318 12:10:47.808665 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:47 crc kubenswrapper[4843]: E0318 12:10:47.909774 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:48 crc kubenswrapper[4843]: E0318 12:10:48.010479 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:48 crc kubenswrapper[4843]: E0318 12:10:48.111454 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:48 crc kubenswrapper[4843]: E0318 12:10:48.212290 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:48 crc kubenswrapper[4843]: E0318 12:10:48.312740 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:48 crc kubenswrapper[4843]: E0318 12:10:48.413103 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:48 crc kubenswrapper[4843]: E0318 12:10:48.514153 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:48 crc kubenswrapper[4843]: E0318 12:10:48.614530 4843 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Mar 18 12:10:48 crc kubenswrapper[4843]: E0318 12:10:48.715480 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:48 crc kubenswrapper[4843]: E0318 12:10:48.816371 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:48 crc kubenswrapper[4843]: E0318 12:10:48.917444 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:49 crc kubenswrapper[4843]: E0318 12:10:49.017762 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:49 crc kubenswrapper[4843]: E0318 12:10:49.118198 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:49 crc kubenswrapper[4843]: E0318 12:10:49.219261 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:49 crc kubenswrapper[4843]: E0318 12:10:49.264872 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.270763 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.270812 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.270823 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.270840 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.270851 
4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:49Z","lastTransitionTime":"2026-03-18T12:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:10:49 crc kubenswrapper[4843]: E0318 12:10:49.284525 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.288377 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.288414 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.288424 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.288440 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.288450 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:49Z","lastTransitionTime":"2026-03-18T12:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:49 crc kubenswrapper[4843]: E0318 12:10:49.304583 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.308715 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.308755 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.308765 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.308784 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.308794 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:49Z","lastTransitionTime":"2026-03-18T12:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:49 crc kubenswrapper[4843]: E0318 12:10:49.320672 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.324562 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.324594 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.324614 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.324631 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:49 crc kubenswrapper[4843]: I0318 12:10:49.324641 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:49Z","lastTransitionTime":"2026-03-18T12:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:49 crc kubenswrapper[4843]: E0318 12:10:49.336346 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:49 crc kubenswrapper[4843]: E0318 12:10:49.336480 4843 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:10:49 crc kubenswrapper[4843]: E0318 12:10:49.336506 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:49 crc kubenswrapper[4843]: E0318 12:10:49.436949 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:49 crc kubenswrapper[4843]: E0318 12:10:49.538121 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:49 crc kubenswrapper[4843]: E0318 12:10:49.639169 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:49 crc kubenswrapper[4843]: E0318 12:10:49.739701 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:49 crc kubenswrapper[4843]: E0318 12:10:49.840137 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:49 crc kubenswrapper[4843]: E0318 12:10:49.940810 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:50 crc kubenswrapper[4843]: E0318 12:10:50.041170 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:50 crc kubenswrapper[4843]: E0318 12:10:50.141277 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:50 crc kubenswrapper[4843]: E0318 12:10:50.242028 4843 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:50 crc kubenswrapper[4843]: E0318 12:10:50.343187 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:50 crc kubenswrapper[4843]: E0318 12:10:50.443335 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:50 crc kubenswrapper[4843]: E0318 12:10:50.543515 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:50 crc kubenswrapper[4843]: E0318 12:10:50.644638 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:50 crc kubenswrapper[4843]: E0318 12:10:50.745675 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:50 crc kubenswrapper[4843]: E0318 12:10:50.845803 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:50 crc kubenswrapper[4843]: E0318 12:10:50.946347 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:51 crc kubenswrapper[4843]: E0318 12:10:51.046468 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:51 crc kubenswrapper[4843]: E0318 12:10:51.146788 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:51 crc kubenswrapper[4843]: E0318 12:10:51.247891 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:51 crc kubenswrapper[4843]: E0318 12:10:51.348687 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:51 crc 
kubenswrapper[4843]: E0318 12:10:51.449248 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:51 crc kubenswrapper[4843]: E0318 12:10:51.550446 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:51 crc kubenswrapper[4843]: E0318 12:10:51.651014 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:51 crc kubenswrapper[4843]: E0318 12:10:51.752092 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:51 crc kubenswrapper[4843]: E0318 12:10:51.852488 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:51 crc kubenswrapper[4843]: E0318 12:10:51.953094 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:52 crc kubenswrapper[4843]: E0318 12:10:52.053621 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:52 crc kubenswrapper[4843]: E0318 12:10:52.153933 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:52 crc kubenswrapper[4843]: E0318 12:10:52.254506 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:52 crc kubenswrapper[4843]: E0318 12:10:52.354876 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:52 crc kubenswrapper[4843]: E0318 12:10:52.455738 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:52 crc kubenswrapper[4843]: E0318 12:10:52.556681 4843 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 18 12:10:52 crc kubenswrapper[4843]: E0318 12:10:52.657320 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:52 crc kubenswrapper[4843]: E0318 12:10:52.758176 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:52 crc kubenswrapper[4843]: E0318 12:10:52.858851 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:52 crc kubenswrapper[4843]: E0318 12:10:52.959694 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:53 crc kubenswrapper[4843]: E0318 12:10:53.060078 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:53 crc kubenswrapper[4843]: E0318 12:10:53.160911 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:53 crc kubenswrapper[4843]: E0318 12:10:53.261912 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:53 crc kubenswrapper[4843]: E0318 12:10:53.362305 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:53 crc kubenswrapper[4843]: E0318 12:10:53.462894 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:53 crc kubenswrapper[4843]: E0318 12:10:53.563990 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:53 crc kubenswrapper[4843]: E0318 12:10:53.664746 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:53 crc kubenswrapper[4843]: E0318 12:10:53.765890 4843 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:53 crc kubenswrapper[4843]: E0318 12:10:53.866451 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:53 crc kubenswrapper[4843]: E0318 12:10:53.966584 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:54 crc kubenswrapper[4843]: E0318 12:10:54.067326 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:54 crc kubenswrapper[4843]: E0318 12:10:54.168217 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:54 crc kubenswrapper[4843]: E0318 12:10:54.269260 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:54 crc kubenswrapper[4843]: E0318 12:10:54.369639 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:54 crc kubenswrapper[4843]: E0318 12:10:54.469959 4843 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.562894 4843 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.572503 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.572547 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.572563 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 
12:10:54.572589 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.572608 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:54Z","lastTransitionTime":"2026-03-18T12:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.674996 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.675293 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.675381 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.675473 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.675599 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:54Z","lastTransitionTime":"2026-03-18T12:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.777346 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.777584 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.777801 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.777889 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.777954 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:54Z","lastTransitionTime":"2026-03-18T12:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.879846 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.879877 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.879885 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.879898 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.879907 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:54Z","lastTransitionTime":"2026-03-18T12:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.993864 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.994135 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.994220 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.994319 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:54 crc kubenswrapper[4843]: I0318 12:10:54.994410 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:54Z","lastTransitionTime":"2026-03-18T12:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.067497 4843 apiserver.go:52] "Watching apiserver" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.073995 4843 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.074328 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.074812 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.074908 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.074940 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.074827 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.075017 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.075163 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.074979 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.075314 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.075531 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.077957 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.078511 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.078678 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.078796 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.078968 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.079119 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.079404 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.079515 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.079621 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.097295 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:55 crc 
kubenswrapper[4843]: I0318 12:10:55.097333 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.097354 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.097371 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.097383 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:55Z","lastTransitionTime":"2026-03-18T12:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.116097 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.122254 4843 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.131004 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138069 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138112 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138137 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138155 4843 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138181 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138198 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138215 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138231 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138245 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138265 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138307 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138321 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138335 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138351 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138369 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138384 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138410 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138427 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138441 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138456 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 
12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138476 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138493 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138508 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138524 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138546 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138561 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138575 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138696 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138588 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138932 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138947 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138964 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.138980 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139013 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139029 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139044 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139047 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139113 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139144 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139169 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139198 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139218 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139245 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139268 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139289 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139312 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139329 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: 
"7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139335 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139380 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139348 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139399 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139541 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139571 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139605 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139626 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139633 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139703 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139733 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139736 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139762 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139795 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139819 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139843 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139867 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139889 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139913 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139939 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139964 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140100 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140137 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 12:10:55 crc 
kubenswrapper[4843]: I0318 12:10:55.140160 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140185 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140211 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140234 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140256 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140279 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140303 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140327 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140355 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140380 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140402 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140437 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140469 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140493 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140518 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140543 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140569 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140592 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140617 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140665 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140692 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140716 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140741 4843 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140770 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140794 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140818 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140844 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140871 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140897 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140920 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140942 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140984 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141009 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141036 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141063 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141087 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141110 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141143 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141179 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141201 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141229 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141253 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141274 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141296 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141372 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141398 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141422 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141445 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141468 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141496 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod 
\"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141518 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141542 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141567 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141592 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141616 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141640 4843 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141692 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141716 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141740 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141764 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141805 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141829 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141854 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141916 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141941 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141969 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141995 4843 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.142026 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.142820 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.142862 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.142933 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.142962 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143022 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143048 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143105 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143170 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143201 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143224 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143249 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143328 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143367 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143388 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143411 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143433 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143472 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143497 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143693 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143834 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" 
(UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143917 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144064 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144093 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144118 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144264 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144319 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144349 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144398 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144424 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144470 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144500 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 
12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144523 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144571 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144645 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144704 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144804 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145189 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145235 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145291 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145318 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145373 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145400 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145434 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145461 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145488 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145523 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145548 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145573 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145614 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145638 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145689 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145714 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.146147 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.146186 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.146289 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.146329 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.146357 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.146386 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.146425 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: 
\"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.146455 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.146518 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.146560 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.146679 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.146714 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.146970 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.147051 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.147134 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.147693 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.147878 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 
12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.147983 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.148019 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.149614 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.149678 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.149729 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.149777 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.149813 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.149841 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.149869 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.149958 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.150082 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.150104 4843 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.150121 4843 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.150133 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.150145 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.150156 4843 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath 
\"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.150171 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.154236 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.155066 4843 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.139910 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140215 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140406 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140550 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140569 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140720 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140751 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140882 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.140976 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141019 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141149 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141230 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.156204 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141356 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141406 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141431 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141525 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141553 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141737 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141809 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141850 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.141968 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.142082 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.142211 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.142291 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.142321 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.142373 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.142493 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.142547 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.142573 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.142886 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.142983 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143218 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143509 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143708 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.143868 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144292 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144565 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144720 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144770 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144780 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144786 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.144972 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145117 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.156839 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145450 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.156840 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145737 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145975 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.146010 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.146035 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.146325 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.146426 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.146730 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.147052 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.147111 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.147339 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.147462 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.147470 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.147515 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.147535 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.147575 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.148107 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.148226 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.148407 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.148750 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.148755 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.148971 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.149438 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.149592 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.149740 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.150066 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.150290 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.150444 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.150562 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.150969 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.151188 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.151377 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.151630 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.151776 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.151820 4843 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.151897 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.151864 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). 
InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.152228 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.152245 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.152360 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.152541 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.152745 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.152573 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.152832 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.153318 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.153340 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.153805 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.153846 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.153949 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.154067 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.154096 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.154129 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.153112 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.155031 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.155076 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.155340 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.155971 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.156516 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.156582 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.156898 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.157425 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.156975 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.157505 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.157165 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.157725 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.157743 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.157501 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.157173 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.157186 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.158025 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.158065 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.158203 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.158302 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.158329 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.145730 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.158387 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.158411 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.158554 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.158600 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.158679 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.159027 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.159398 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.159429 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.159533 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.159443 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.159672 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.159866 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.159899 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.160077 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.160108 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.160112 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.160496 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.160584 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.160573 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.160810 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.160864 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.160872 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.160933 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.161088 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.161278 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.161438 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:10:55.661401065 +0000 UTC m=+89.377226589 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.161594 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.162190 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.162234 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.162328 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.162372 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:10:55.662351371 +0000 UTC m=+89.378176915 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.162635 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.162702 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.162805 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.162883 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.163168 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.163407 4843 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.163859 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.163961 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.164076 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:10:55.664060738 +0000 UTC m=+89.379886272 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.164214 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.164298 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.164723 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.164875 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.165193 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.165302 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.165503 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.165922 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.166391 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.166418 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.166756 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.166783 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.166644 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.166763 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.166005 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.166830 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.166901 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.167349 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.173432 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.175427 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.175506 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.175484 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.175543 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.176384 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.176963 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.177291 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.178606 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.178634 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.178658 4843 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.178744 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:10:55.678718157 +0000 UTC m=+89.394543681 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.180337 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.180516 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.180960 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.180990 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.181006 4843 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.181098 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:10:55.68106374 +0000 UTC m=+89.396889264 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.182751 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.184044 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.186235 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.186273 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.186313 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.186469 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.187312 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.190487 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.199005 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.200813 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.200880 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.200892 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.200967 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.201004 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:55Z","lastTransitionTime":"2026-03-18T12:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.201742 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.203429 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.206422 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.213875 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.214406 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.226881 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.245465 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251479 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251528 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251615 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251626 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251636 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251645 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251673 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251681 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251672 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251689 4843 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251759 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251776 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251790 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251804 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251731 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:10:55 crc 
kubenswrapper[4843]: I0318 12:10:55.251816 4843 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251866 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251879 4843 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251890 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251903 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251912 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251919 4843 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251928 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251936 4843 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251944 4843 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251953 4843 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251961 4843 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251970 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251978 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251986 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on 
node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.251994 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252003 4843 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252013 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252025 4843 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252037 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252062 4843 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252074 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252092 4843 
reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252102 4843 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252114 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252124 4843 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252133 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252143 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252152 4843 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252162 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252171 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252179 4843 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252189 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252199 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252208 4843 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252217 4843 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252226 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252234 4843 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252243 4843 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252251 4843 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252259 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252268 4843 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252278 4843 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252288 4843 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc 
kubenswrapper[4843]: I0318 12:10:55.252296 4843 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252305 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252313 4843 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252347 4843 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252355 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252365 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252374 4843 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 
12:10:55.252382 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252390 4843 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252398 4843 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252406 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252415 4843 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252422 4843 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252430 4843 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252438 4843 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on 
node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252446 4843 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252455 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252463 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252475 4843 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252485 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252498 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252508 4843 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 
12:10:55.252517 4843 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252525 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252533 4843 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252541 4843 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252549 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252558 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252566 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252574 4843 reconciler_common.go:293] "Volume 
detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252582 4843 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252590 4843 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252597 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252605 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252617 4843 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252625 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252633 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: 
\"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252642 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252675 4843 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252691 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252703 4843 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252719 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252729 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252740 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 18 
12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252748 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252770 4843 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252778 4843 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252785 4843 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252793 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252801 4843 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252809 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252817 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252827 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252840 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252849 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252858 4843 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252867 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252876 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252886 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath 
\"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252900 4843 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252909 4843 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252920 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252930 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252940 4843 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252950 4843 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252961 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 
crc kubenswrapper[4843]: I0318 12:10:55.252972 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.252991 4843 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253002 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253012 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253032 4843 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253056 4843 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253080 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253103 4843 reconciler_common.go:293] "Volume detached 
for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253126 4843 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253150 4843 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253171 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253193 4843 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253215 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253245 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253278 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253301 4843 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253326 4843 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253349 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253374 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253397 4843 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253419 4843 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253441 4843 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253463 4843 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253486 4843 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253511 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253538 4843 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253562 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253587 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253605 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253626 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253685 4843 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253712 4843 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253730 4843 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253747 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253764 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253781 4843 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253797 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253815 4843 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253846 4843 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253863 4843 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253880 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253899 4843 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253922 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 
18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.253984 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254009 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254040 4843 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254062 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254082 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254101 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254121 4843 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254137 4843 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254155 4843 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254175 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254199 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254222 4843 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254246 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254269 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254289 4843 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254306 4843 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254323 4843 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254339 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254356 4843 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254374 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254390 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254407 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.254425 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.259245 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.303648 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.303700 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.303713 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.303727 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.303736 4843 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:55Z","lastTransitionTime":"2026-03-18T12:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.403577 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.413501 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.413520 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.413538 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.413549 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.413568 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.413579 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:55Z","lastTransitionTime":"2026-03-18T12:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.418453 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:10:55 crc kubenswrapper[4843]: W0318 12:10:55.424792 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-402ecb9362fd510eb64c5c8b2ec5742c4210a4f019a7a80ce9efb4fee7483eab WatchSource:0}: Error finding container 402ecb9362fd510eb64c5c8b2ec5742c4210a4f019a7a80ce9efb4fee7483eab: Status 404 returned error can't find the container with id 402ecb9362fd510eb64c5c8b2ec5742c4210a4f019a7a80ce9efb4fee7483eab Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.425052 4843 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:10:55 crc kubenswrapper[4843]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 12:10:55 crc kubenswrapper[4843]: set -o allexport Mar 18 12:10:55 crc kubenswrapper[4843]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 12:10:55 crc kubenswrapper[4843]: source /etc/kubernetes/apiserver-url.env Mar 18 12:10:55 crc kubenswrapper[4843]: else Mar 18 12:10:55 crc kubenswrapper[4843]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 12:10:55 crc kubenswrapper[4843]: exit 1 Mar 18 12:10:55 crc kubenswrapper[4843]: fi Mar 18 12:10:55 crc kubenswrapper[4843]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 12:10:55 crc kubenswrapper[4843]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:10:55 crc kubenswrapper[4843]: > logger="UnhandledError" Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.426537 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.427824 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.429067 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 12:10:55 crc kubenswrapper[4843]: W0318 12:10:55.430207 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-57cd0f9e4330e3bfbdd71ab95a9f57196befcc1b2f3a10b4c3d3cb99027bf0c8 WatchSource:0}: Error finding container 57cd0f9e4330e3bfbdd71ab95a9f57196befcc1b2f3a10b4c3d3cb99027bf0c8: Status 404 returned error can't find the container with id 57cd0f9e4330e3bfbdd71ab95a9f57196befcc1b2f3a10b4c3d3cb99027bf0c8 Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.432914 4843 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:10:55 crc kubenswrapper[4843]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:10:55 crc kubenswrapper[4843]: if [[ -f "/env/_master" ]]; then Mar 18 12:10:55 crc kubenswrapper[4843]: set -o allexport Mar 18 12:10:55 crc kubenswrapper[4843]: source "/env/_master" Mar 18 12:10:55 crc kubenswrapper[4843]: set +o allexport Mar 18 12:10:55 crc kubenswrapper[4843]: fi Mar 18 12:10:55 crc kubenswrapper[4843]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 18 12:10:55 crc kubenswrapper[4843]: 	# https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791
Mar 18 12:10:55 crc kubenswrapper[4843]: 	ho_enable="--enable-hybrid-overlay"
Mar 18 12:10:55 crc kubenswrapper[4843]: 	echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook"
Mar 18 12:10:55 crc kubenswrapper[4843]: 	# extra-allowed-user: service account `ovn-kubernetes-control-plane`
Mar 18 12:10:55 crc kubenswrapper[4843]: 	# sets pod annotations in multi-homing layer3 network controller (cluster-manager)
Mar 18 12:10:55 crc kubenswrapper[4843]: 	exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \
Mar 18 12:10:55 crc kubenswrapper[4843]: 	--webhook-cert-dir="/etc/webhook-cert" \
Mar 18 12:10:55 crc kubenswrapper[4843]: 	--webhook-host=127.0.0.1 \
Mar 18 12:10:55 crc kubenswrapper[4843]: 	--webhook-port=9743 \
Mar 18 12:10:55 crc kubenswrapper[4843]: 	${ho_enable} \
Mar 18 12:10:55 crc kubenswrapper[4843]: 	--enable-interconnect \
Mar 18 12:10:55 crc kubenswrapper[4843]: 	--disable-approver \
Mar 18 12:10:55 crc kubenswrapper[4843]: 	--extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \
Mar 18 12:10:55 crc kubenswrapper[4843]: 	--wait-for-kubernetes-api=200s \
Mar 18 12:10:55 crc kubenswrapper[4843]: 	--pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \
Mar 18 12:10:55 crc kubenswrapper[4843]: 	--loglevel="${LOGLEVEL}"
Mar 18 12:10:55 crc kubenswrapper[4843]: 	],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 18 12:10:55 crc kubenswrapper[4843]:  > logger="UnhandledError"
Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.437414 4843 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 18 12:10:55 crc kubenswrapper[4843]: 	container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 18 12:10:55 crc kubenswrapper[4843]: 	if [[ -f "/env/_master" ]]; then
Mar 18 12:10:55 crc kubenswrapper[4843]: 	set -o allexport
Mar 18 12:10:55 crc kubenswrapper[4843]: 	source "/env/_master"
Mar 18 12:10:55 crc kubenswrapper[4843]: 	set +o allexport
Mar 18 12:10:55 crc kubenswrapper[4843]: 	fi
Mar 18 12:10:55 crc kubenswrapper[4843]: 	
Mar 18 12:10:55 crc kubenswrapper[4843]: 	echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver"
Mar 18 12:10:55 crc kubenswrapper[4843]: 	exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \
Mar 18 12:10:55 crc kubenswrapper[4843]: 	--disable-webhook \
Mar 18 12:10:55 crc kubenswrapper[4843]: 	--csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \
Mar 18 12:10:55 crc kubenswrapper[4843]: 	--loglevel="${LOGLEVEL}"
Mar 18 12:10:55 crc kubenswrapper[4843]: 	],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 18 12:10:55 crc kubenswrapper[4843]:  > logger="UnhandledError"
Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.439224 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d"
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.516201 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.516243 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.516253 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.516267 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.516277 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:55Z","lastTransitionTime":"2026-03-18T12:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.576724 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"57cd0f9e4330e3bfbdd71ab95a9f57196befcc1b2f3a10b4c3d3cb99027bf0c8"}
Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.578029 4843 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 18 12:10:55 crc kubenswrapper[4843]: 	container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 18 12:10:55 crc kubenswrapper[4843]: 	if [[ -f "/env/_master" ]]; then
Mar 18 12:10:55 crc kubenswrapper[4843]: 	set -o allexport
Mar 18 12:10:55 crc kubenswrapper[4843]: 	source "/env/_master"
Mar 18 12:10:55 crc kubenswrapper[4843]: 	set +o allexport
Mar 18 12:10:55 crc kubenswrapper[4843]: 	fi
Mar 18 12:10:55 crc kubenswrapper[4843]: 	# OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 18 12:10:55 crc kubenswrapper[4843]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 12:10:55 crc kubenswrapper[4843]: ho_enable="--enable-hybrid-overlay" Mar 18 12:10:55 crc kubenswrapper[4843]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 12:10:55 crc kubenswrapper[4843]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 12:10:55 crc kubenswrapper[4843]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 12:10:55 crc kubenswrapper[4843]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 12:10:55 crc kubenswrapper[4843]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 12:10:55 crc kubenswrapper[4843]: --webhook-host=127.0.0.1 \ Mar 18 12:10:55 crc kubenswrapper[4843]: --webhook-port=9743 \ Mar 18 12:10:55 crc kubenswrapper[4843]: ${ho_enable} \ Mar 18 12:10:55 crc kubenswrapper[4843]: --enable-interconnect \ Mar 18 12:10:55 crc kubenswrapper[4843]: --disable-approver \ Mar 18 12:10:55 crc kubenswrapper[4843]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 12:10:55 crc kubenswrapper[4843]: --wait-for-kubernetes-api=200s \ Mar 18 12:10:55 crc kubenswrapper[4843]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 12:10:55 crc kubenswrapper[4843]: --loglevel="${LOGLEVEL}" Mar 18 12:10:55 crc kubenswrapper[4843]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:10:55 crc kubenswrapper[4843]: > logger="UnhandledError" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.579594 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"402ecb9362fd510eb64c5c8b2ec5742c4210a4f019a7a80ce9efb4fee7483eab"} Mar 18 12:10:55 crc kubenswrapper[4843]: 
E0318 12:10:55.580902 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.581376 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0f8af7286f4b1b70d2d94730d656f0c2e4156dd24bfef2d61af9acbb13506acc"} Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.581674 4843 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:10:55 crc kubenswrapper[4843]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:10:55 crc kubenswrapper[4843]: if [[ -f "/env/_master" ]]; then Mar 18 12:10:55 crc kubenswrapper[4843]: set -o allexport Mar 18 12:10:55 crc kubenswrapper[4843]: source "/env/_master" Mar 18 12:10:55 crc kubenswrapper[4843]: set +o allexport Mar 18 12:10:55 crc kubenswrapper[4843]: fi Mar 18 12:10:55 crc kubenswrapper[4843]: Mar 18 12:10:55 crc kubenswrapper[4843]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 12:10:55 crc kubenswrapper[4843]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 12:10:55 crc kubenswrapper[4843]: --disable-webhook \ Mar 18 12:10:55 crc kubenswrapper[4843]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 12:10:55 crc kubenswrapper[4843]: --loglevel="${LOGLEVEL}" Mar 18 12:10:55 crc kubenswrapper[4843]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:10:55 crc kubenswrapper[4843]: > logger="UnhandledError" Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.582619 4843 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:10:55 crc kubenswrapper[4843]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 12:10:55 crc kubenswrapper[4843]: set -o allexport Mar 18 12:10:55 crc kubenswrapper[4843]: if [[ -f 
/etc/kubernetes/apiserver-url.env ]]; then Mar 18 12:10:55 crc kubenswrapper[4843]: source /etc/kubernetes/apiserver-url.env Mar 18 12:10:55 crc kubenswrapper[4843]: else Mar 18 12:10:55 crc kubenswrapper[4843]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 12:10:55 crc kubenswrapper[4843]: exit 1 Mar 18 12:10:55 crc kubenswrapper[4843]: fi Mar 18 12:10:55 crc kubenswrapper[4843]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 12:10:55 crc kubenswrapper[4843]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb7
2bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:10:55 crc kubenswrapper[4843]: > logger="UnhandledError" Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.583796 4843 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.583893 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.583983 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.589935 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.603210 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.614916 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.618401 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.618512 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.618736 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.618894 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.618960 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:55Z","lastTransitionTime":"2026-03-18T12:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.628699 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.645814 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.664370 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.683175 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.703803 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.724339 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.724386 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.724397 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.724420 4843 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.724433 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:55Z","lastTransitionTime":"2026-03-18T12:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.730112 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.746856 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.759727 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.759906 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:10:56.759860272 +0000 UTC m=+90.475685806 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.760083 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.760204 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.760311 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.760463 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.760245 4843 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.760610 4843 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.760390 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.760711 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.760726 4843 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.760498 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.760769 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.760778 4843 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.760897 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:10:56.760616922 +0000 UTC m=+90.476442436 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.760971 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-18 12:10:56.760956652 +0000 UTC m=+90.476782176 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.761040 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:10:56.761032164 +0000 UTC m=+90.476857688 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:10:55 crc kubenswrapper[4843]: E0318 12:10:55.761108 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:10:56.761099465 +0000 UTC m=+90.476924989 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.765984 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.780018 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.828143 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.828189 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.828205 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.828225 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.828240 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:55Z","lastTransitionTime":"2026-03-18T12:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.930626 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.930692 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.930704 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.930728 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:55 crc kubenswrapper[4843]: I0318 12:10:55.930741 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:55Z","lastTransitionTime":"2026-03-18T12:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.034287 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.034619 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.034718 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.034799 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.034878 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:56Z","lastTransitionTime":"2026-03-18T12:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.137299 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.137338 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.137347 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.137359 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.137368 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:56Z","lastTransitionTime":"2026-03-18T12:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.240536 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.240584 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.240597 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.240615 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.240632 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:56Z","lastTransitionTime":"2026-03-18T12:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.342754 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.342784 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.342794 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.342806 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.342815 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:56Z","lastTransitionTime":"2026-03-18T12:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.445785 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.445867 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.445886 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.445913 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.445927 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:56Z","lastTransitionTime":"2026-03-18T12:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.548370 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.548451 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.548462 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.548489 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.548509 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:56Z","lastTransitionTime":"2026-03-18T12:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.651209 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.651241 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.651249 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.651261 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.651269 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:56Z","lastTransitionTime":"2026-03-18T12:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.754631 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.754707 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.754740 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.754758 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.754771 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:56Z","lastTransitionTime":"2026-03-18T12:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.769879 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.770021 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.770050 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.770069 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.770088 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:10:56 crc kubenswrapper[4843]: E0318 12:10:56.770207 4843 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:10:56 crc kubenswrapper[4843]: E0318 12:10:56.770278 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:10:58.770258351 +0000 UTC m=+92.486083875 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:10:56 crc kubenswrapper[4843]: E0318 12:10:56.770818 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:10:58.770775295 +0000 UTC m=+92.486600819 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:10:56 crc kubenswrapper[4843]: E0318 12:10:56.771028 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:10:56 crc kubenswrapper[4843]: E0318 12:10:56.771046 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:10:56 crc kubenswrapper[4843]: E0318 12:10:56.771060 4843 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:10:56 crc kubenswrapper[4843]: E0318 12:10:56.771177 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:10:58.771108504 +0000 UTC m=+92.486934028 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:10:56 crc kubenswrapper[4843]: E0318 12:10:56.771250 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:10:56 crc kubenswrapper[4843]: E0318 12:10:56.771261 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:10:56 crc kubenswrapper[4843]: E0318 12:10:56.771270 4843 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:10:56 crc kubenswrapper[4843]: E0318 12:10:56.771295 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:10:58.771288009 +0000 UTC m=+92.487113533 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:10:56 crc kubenswrapper[4843]: E0318 12:10:56.771342 4843 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:10:56 crc kubenswrapper[4843]: E0318 12:10:56.771363 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:10:58.771357921 +0000 UTC m=+92.487183445 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.858800 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.858880 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.858891 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.858909 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.858921 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:56Z","lastTransitionTime":"2026-03-18T12:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.961996 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.962078 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.962091 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.962112 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.962126 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:56Z","lastTransitionTime":"2026-03-18T12:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.983457 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.983482 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.983466 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:10:56 crc kubenswrapper[4843]: E0318 12:10:56.983693 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:10:56 crc kubenswrapper[4843]: E0318 12:10:56.984314 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:10:56 crc kubenswrapper[4843]: E0318 12:10:56.984691 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.992273 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.992932 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.994431 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.995193 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.996588 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.997215 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.997953 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.999429 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 18 12:10:56 crc kubenswrapper[4843]: I0318 12:10:56.999390 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.000583 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.001960 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.002603 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.004045 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.004722 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.005399 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.006514 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.007177 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.009028 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.009561 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.009701 4843 scope.go:117] "RemoveContainer" containerID="0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8" Mar 18 12:10:57 crc kubenswrapper[4843]: E0318 12:10:57.009893 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with 
CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.010352 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.011763 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.012319 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.013522 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.013992 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.015217 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.015771 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.016582 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.017940 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.018517 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.019588 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.020310 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.022478 4843 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.022627 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.024588 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.025100 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.026284 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.027012 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.028608 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.029380 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.030476 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.031196 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.032411 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.033037 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.034307 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.035000 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.036197 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.036735 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.037701 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.038229 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.039910 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.040534 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.040711 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.041851 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.042352 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.043437 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.044144 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.044718 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.045674 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.057154 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.064898 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.064943 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.064953 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.064970 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.064979 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:57Z","lastTransitionTime":"2026-03-18T12:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.073514 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.090189 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.168177 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.168268 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.168307 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.168324 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.168339 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:57Z","lastTransitionTime":"2026-03-18T12:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.272112 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.272170 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.272186 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.272202 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.272214 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:57Z","lastTransitionTime":"2026-03-18T12:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.375025 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.375078 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.375091 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.375114 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.375167 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:57Z","lastTransitionTime":"2026-03-18T12:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.478683 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.478762 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.478805 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.478849 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.478873 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:57Z","lastTransitionTime":"2026-03-18T12:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.582026 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.582082 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.582093 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.582112 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.582125 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:57Z","lastTransitionTime":"2026-03-18T12:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.639414 4843 scope.go:117] "RemoveContainer" containerID="0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8"
Mar 18 12:10:57 crc kubenswrapper[4843]: E0318 12:10:57.639605 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.684357 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.684393 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.684402 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.684414 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.684423 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:57Z","lastTransitionTime":"2026-03-18T12:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.787812 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.787889 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.787902 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.787921 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.787950 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:57Z","lastTransitionTime":"2026-03-18T12:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.891768 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.891840 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.891853 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.891875 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.891916 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:57Z","lastTransitionTime":"2026-03-18T12:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.995043 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.995107 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.995118 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.995139 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:10:57 crc kubenswrapper[4843]: I0318 12:10:57.995150 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:57Z","lastTransitionTime":"2026-03-18T12:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.098508 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.098562 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.098573 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.098593 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.098606 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:58Z","lastTransitionTime":"2026-03-18T12:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.202218 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.202282 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.202297 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.202323 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.202341 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:58Z","lastTransitionTime":"2026-03-18T12:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.304398 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.304459 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.304468 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.304484 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.304494 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:58Z","lastTransitionTime":"2026-03-18T12:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.408345 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.408411 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.408423 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.408444 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.408459 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:58Z","lastTransitionTime":"2026-03-18T12:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.512007 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.512069 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.512084 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.512103 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.512116 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:58Z","lastTransitionTime":"2026-03-18T12:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.615749 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.615870 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.615885 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.615908 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.615922 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:58Z","lastTransitionTime":"2026-03-18T12:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.718800 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.718862 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.718873 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.718891 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.718906 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:58Z","lastTransitionTime":"2026-03-18T12:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.791091 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.791195 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.791230 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.791252 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.791278 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:10:58 crc kubenswrapper[4843]: E0318 12:10:58.791398 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:11:02.791359038 +0000 UTC m=+96.507184562 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:10:58 crc kubenswrapper[4843]: E0318 12:10:58.791486 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:10:58 crc kubenswrapper[4843]: E0318 12:10:58.791509 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:10:58 crc kubenswrapper[4843]: E0318 12:10:58.791518 4843 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:10:58 crc kubenswrapper[4843]: E0318 12:10:58.791573 4843 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:10:58 crc kubenswrapper[4843]: E0318 12:10:58.791523 4843 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:10:58 crc kubenswrapper[4843]: E0318 12:10:58.791599 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:02.791586164 +0000 UTC m=+96.507411688 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:10:58 crc kubenswrapper[4843]: E0318 12:10:58.791710 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:02.791689127 +0000 UTC m=+96.507514651 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:10:58 crc kubenswrapper[4843]: E0318 12:10:58.791734 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:02.791725268 +0000 UTC m=+96.507550792 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:10:58 crc kubenswrapper[4843]: E0318 12:10:58.791828 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:10:58 crc kubenswrapper[4843]: E0318 12:10:58.791851 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:10:58 crc kubenswrapper[4843]: E0318 12:10:58.791865 4843 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:10:58 crc kubenswrapper[4843]: 
E0318 12:10:58.791910 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:02.791900823 +0000 UTC m=+96.507726337 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.821488 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.821553 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.821566 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.821589 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.821604 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:58Z","lastTransitionTime":"2026-03-18T12:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.924694 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.924764 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.924782 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.924808 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.924823 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:58Z","lastTransitionTime":"2026-03-18T12:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.982815 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.982893 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:10:58 crc kubenswrapper[4843]: I0318 12:10:58.982857 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:10:58 crc kubenswrapper[4843]: E0318 12:10:58.983116 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:10:58 crc kubenswrapper[4843]: E0318 12:10:58.983274 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:10:58 crc kubenswrapper[4843]: E0318 12:10:58.983469 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.033757 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.033809 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.033821 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.033856 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.033872 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:59Z","lastTransitionTime":"2026-03-18T12:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.137389 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.137455 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.137466 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.137487 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.137502 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:59Z","lastTransitionTime":"2026-03-18T12:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.241630 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.241722 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.241739 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.241763 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.241781 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:59Z","lastTransitionTime":"2026-03-18T12:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.344696 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.344755 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.344769 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.344796 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.344812 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:59Z","lastTransitionTime":"2026-03-18T12:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.447888 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.447938 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.447948 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.447966 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.447978 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:59Z","lastTransitionTime":"2026-03-18T12:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.548765 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.548832 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.548843 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.548865 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.548881 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:59Z","lastTransitionTime":"2026-03-18T12:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:59 crc kubenswrapper[4843]: E0318 12:10:59.567523 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.574339 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.574410 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.574425 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.574450 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.574465 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:59Z","lastTransitionTime":"2026-03-18T12:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:59 crc kubenswrapper[4843]: E0318 12:10:59.593726 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.600432 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.600506 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.600520 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.600546 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.600766 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:59Z","lastTransitionTime":"2026-03-18T12:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:59 crc kubenswrapper[4843]: E0318 12:10:59.621441 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.628143 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.628179 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.628189 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.628205 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.628216 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:59Z","lastTransitionTime":"2026-03-18T12:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:59 crc kubenswrapper[4843]: E0318 12:10:59.643848 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.649253 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.649290 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.649302 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.649323 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.649336 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:59Z","lastTransitionTime":"2026-03-18T12:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:59 crc kubenswrapper[4843]: E0318 12:10:59.666225 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:59 crc kubenswrapper[4843]: E0318 12:10:59.666370 4843 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.668900 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.668942 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.668955 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.668975 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.668990 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:59Z","lastTransitionTime":"2026-03-18T12:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.772353 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.772419 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.772435 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.772456 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.772513 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:59Z","lastTransitionTime":"2026-03-18T12:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.876056 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.876126 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.876140 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.876165 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.876182 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:59Z","lastTransitionTime":"2026-03-18T12:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.979201 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.979245 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.979255 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.979275 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:59 crc kubenswrapper[4843]: I0318 12:10:59.979289 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:59Z","lastTransitionTime":"2026-03-18T12:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.082868 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.082936 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.082950 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.082969 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.082982 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:00Z","lastTransitionTime":"2026-03-18T12:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.186835 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.186911 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.186925 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.186950 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.186964 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:00Z","lastTransitionTime":"2026-03-18T12:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.290217 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.290303 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.290313 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.290335 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.290348 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:00Z","lastTransitionTime":"2026-03-18T12:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.394275 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.394350 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.394364 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.394412 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.394424 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:00Z","lastTransitionTime":"2026-03-18T12:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.497778 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.497861 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.497881 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.497932 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.497952 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:00Z","lastTransitionTime":"2026-03-18T12:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.601826 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.601906 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.601919 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.601942 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.601972 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:00Z","lastTransitionTime":"2026-03-18T12:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.706017 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.706110 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.706125 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.706173 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.706195 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:00Z","lastTransitionTime":"2026-03-18T12:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.808890 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.808984 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.808998 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.809016 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.809028 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:00Z","lastTransitionTime":"2026-03-18T12:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.911802 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.911871 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.911886 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.911906 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.911922 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:00Z","lastTransitionTime":"2026-03-18T12:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.983807 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.983991 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:00 crc kubenswrapper[4843]: I0318 12:11:00.984049 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:00 crc kubenswrapper[4843]: E0318 12:11:00.984119 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:00 crc kubenswrapper[4843]: E0318 12:11:00.984210 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:00 crc kubenswrapper[4843]: E0318 12:11:00.984453 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.016458 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.016520 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.016532 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.016552 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.016566 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:01Z","lastTransitionTime":"2026-03-18T12:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.119958 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.120010 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.120021 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.120038 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.120049 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:01Z","lastTransitionTime":"2026-03-18T12:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.222839 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.222905 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.222920 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.222942 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.222955 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:01Z","lastTransitionTime":"2026-03-18T12:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.326766 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.326846 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.326863 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.326887 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.326906 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:01Z","lastTransitionTime":"2026-03-18T12:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.430355 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.430413 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.430429 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.430451 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.430466 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:01Z","lastTransitionTime":"2026-03-18T12:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.533230 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.533285 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.533299 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.533316 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.533329 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:01Z","lastTransitionTime":"2026-03-18T12:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.636381 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.636430 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.636440 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.636457 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.636469 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:01Z","lastTransitionTime":"2026-03-18T12:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.739441 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.739508 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.739521 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.739544 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.739561 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:01Z","lastTransitionTime":"2026-03-18T12:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.843943 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.843998 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.844013 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.844032 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.844044 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:01Z","lastTransitionTime":"2026-03-18T12:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.947260 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.947316 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.947327 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.947351 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:01 crc kubenswrapper[4843]: I0318 12:11:01.947366 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:01Z","lastTransitionTime":"2026-03-18T12:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.051044 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.051095 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.051108 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.051129 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.051143 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:02Z","lastTransitionTime":"2026-03-18T12:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.154699 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.154772 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.154791 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.154813 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.154829 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:02Z","lastTransitionTime":"2026-03-18T12:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.257526 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.257580 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.257590 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.257609 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.257621 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:02Z","lastTransitionTime":"2026-03-18T12:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.359952 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.359993 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.360002 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.360018 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.360031 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:02Z","lastTransitionTime":"2026-03-18T12:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.463022 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.463082 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.463096 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.463119 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.463134 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:02Z","lastTransitionTime":"2026-03-18T12:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.565926 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.565995 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.566009 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.566037 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.566048 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:02Z","lastTransitionTime":"2026-03-18T12:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.669367 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.669427 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.669438 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.669459 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.669472 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:02Z","lastTransitionTime":"2026-03-18T12:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.772415 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.772499 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.772513 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.772536 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.772551 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:02Z","lastTransitionTime":"2026-03-18T12:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.832020 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.832121 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.832154 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.832181 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.832202 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:02 crc kubenswrapper[4843]: E0318 12:11:02.832290 4843 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:02 crc kubenswrapper[4843]: E0318 12:11:02.832394 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:11:10.832342259 +0000 UTC m=+104.548167783 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:11:02 crc kubenswrapper[4843]: E0318 12:11:02.832439 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:02 crc kubenswrapper[4843]: E0318 12:11:02.832463 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:02 crc kubenswrapper[4843]: E0318 12:11:02.832466 4843 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:10.832453202 +0000 UTC m=+104.548278736 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:02 crc kubenswrapper[4843]: E0318 12:11:02.832478 4843 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:02 crc kubenswrapper[4843]: E0318 12:11:02.832546 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:10.832527474 +0000 UTC m=+104.548352998 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:02 crc kubenswrapper[4843]: E0318 12:11:02.832538 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:02 crc kubenswrapper[4843]: E0318 12:11:02.832597 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:02 crc kubenswrapper[4843]: E0318 12:11:02.832613 4843 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:02 crc kubenswrapper[4843]: E0318 12:11:02.832624 4843 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:02 crc kubenswrapper[4843]: E0318 12:11:02.832723 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:10.832695788 +0000 UTC m=+104.548521482 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:02 crc kubenswrapper[4843]: E0318 12:11:02.832758 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:10.83274702 +0000 UTC m=+104.548572744 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.875114 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.875168 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.875183 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.875201 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.875213 4843 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:02Z","lastTransitionTime":"2026-03-18T12:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.983034 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.983100 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.983157 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:02 crc kubenswrapper[4843]: E0318 12:11:02.983264 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:02 crc kubenswrapper[4843]: E0318 12:11:02.983463 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:02 crc kubenswrapper[4843]: E0318 12:11:02.983916 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.984019 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.984057 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.984067 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.984086 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:02 crc kubenswrapper[4843]: I0318 12:11:02.984098 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:02Z","lastTransitionTime":"2026-03-18T12:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.006249 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.087106 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.087174 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.087189 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.087208 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.087221 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:03Z","lastTransitionTime":"2026-03-18T12:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.190022 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.190090 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.190102 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.190118 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.190132 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:03Z","lastTransitionTime":"2026-03-18T12:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.293258 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.293324 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.293338 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.293358 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.293373 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:03Z","lastTransitionTime":"2026-03-18T12:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.397003 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.397072 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.397091 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.397116 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.397134 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:03Z","lastTransitionTime":"2026-03-18T12:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.500722 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.500778 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.500786 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.500803 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.500813 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:03Z","lastTransitionTime":"2026-03-18T12:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.604080 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.604141 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.604155 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.604176 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.604192 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:03Z","lastTransitionTime":"2026-03-18T12:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.706767 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.706829 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.706843 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.706864 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.706877 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:03Z","lastTransitionTime":"2026-03-18T12:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.809994 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.810059 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.810069 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.810092 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.810103 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:03Z","lastTransitionTime":"2026-03-18T12:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.912607 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.912703 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.912725 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.912751 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:03 crc kubenswrapper[4843]: I0318 12:11:03.912769 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:03Z","lastTransitionTime":"2026-03-18T12:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.015203 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.015273 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.015285 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.015300 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.015312 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:04Z","lastTransitionTime":"2026-03-18T12:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.116983 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.117033 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.117045 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.117067 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.117081 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:04Z","lastTransitionTime":"2026-03-18T12:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.219816 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.219857 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.219865 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.219877 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.219888 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:04Z","lastTransitionTime":"2026-03-18T12:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.322329 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.322373 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.322384 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.322398 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.322409 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:04Z","lastTransitionTime":"2026-03-18T12:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.425196 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.425244 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.425253 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.425268 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.425278 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:04Z","lastTransitionTime":"2026-03-18T12:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.528241 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.528316 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.528337 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.528355 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.528367 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:04Z","lastTransitionTime":"2026-03-18T12:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.631078 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.631161 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.631174 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.631190 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.631224 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:04Z","lastTransitionTime":"2026-03-18T12:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.733627 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.733697 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.733710 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.733726 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.733738 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:04Z","lastTransitionTime":"2026-03-18T12:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.835855 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.835908 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.835922 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.835940 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.835957 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:04Z","lastTransitionTime":"2026-03-18T12:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.938232 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.938289 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.938303 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.938317 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.938328 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:04Z","lastTransitionTime":"2026-03-18T12:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.983334 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:04 crc kubenswrapper[4843]: E0318 12:11:04.983504 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.983593 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:04 crc kubenswrapper[4843]: E0318 12:11:04.983693 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:04 crc kubenswrapper[4843]: I0318 12:11:04.983821 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:04 crc kubenswrapper[4843]: E0318 12:11:04.983974 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.040742 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.040806 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.040824 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.040845 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.040857 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:05Z","lastTransitionTime":"2026-03-18T12:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.143267 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.143322 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.143339 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.143360 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.143375 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:05Z","lastTransitionTime":"2026-03-18T12:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.245673 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.245729 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.245742 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.245758 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.245770 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:05Z","lastTransitionTime":"2026-03-18T12:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.348456 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.348499 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.348511 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.348525 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.348555 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:05Z","lastTransitionTime":"2026-03-18T12:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.450563 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.450603 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.450614 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.450629 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.450640 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:05Z","lastTransitionTime":"2026-03-18T12:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.552272 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.552314 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.552325 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.552340 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.552351 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:05Z","lastTransitionTime":"2026-03-18T12:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.655227 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.655255 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.655263 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.655276 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.655284 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:05Z","lastTransitionTime":"2026-03-18T12:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.758504 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.758577 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.758589 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.758620 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.758636 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:05Z","lastTransitionTime":"2026-03-18T12:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.860673 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.860721 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.860733 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.860748 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.860760 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:05Z","lastTransitionTime":"2026-03-18T12:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.963156 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.963225 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.963244 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.963268 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:05 crc kubenswrapper[4843]: I0318 12:11:05.963285 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:05Z","lastTransitionTime":"2026-03-18T12:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:05 crc kubenswrapper[4843]: E0318 12:11:05.986498 4843 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:05 crc kubenswrapper[4843]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 12:11:05 crc kubenswrapper[4843]: set -o allexport Mar 18 12:11:05 crc kubenswrapper[4843]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 12:11:05 crc kubenswrapper[4843]: source /etc/kubernetes/apiserver-url.env Mar 18 12:11:05 crc kubenswrapper[4843]: else Mar 18 12:11:05 crc kubenswrapper[4843]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 12:11:05 crc kubenswrapper[4843]: exit 1 Mar 18 12:11:05 crc kubenswrapper[4843]: fi Mar 18 12:11:05 crc kubenswrapper[4843]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 12:11:05 crc kubenswrapper[4843]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:05 crc kubenswrapper[4843]: > logger="UnhandledError" Mar 18 12:11:05 crc kubenswrapper[4843]: E0318 12:11:05.987765 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.066714 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 
12:11:06.066764 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.066977 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.066998 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.067018 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:06Z","lastTransitionTime":"2026-03-18T12:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.171554 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.171604 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.171615 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.171632 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.171648 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:06Z","lastTransitionTime":"2026-03-18T12:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.274714 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.274758 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.274769 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.274786 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.274807 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:06Z","lastTransitionTime":"2026-03-18T12:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.377490 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.377565 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.377580 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.378035 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.378078 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:06Z","lastTransitionTime":"2026-03-18T12:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.482095 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.482150 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.482163 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.482180 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.482193 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:06Z","lastTransitionTime":"2026-03-18T12:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.585167 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.585217 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.585229 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.585244 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.585255 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:06Z","lastTransitionTime":"2026-03-18T12:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.686979 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.687035 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.687051 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.687070 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.687082 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:06Z","lastTransitionTime":"2026-03-18T12:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.790337 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.790393 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.790407 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.790423 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.790433 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:06Z","lastTransitionTime":"2026-03-18T12:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.893283 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.893368 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.893383 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.893401 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.893441 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:06Z","lastTransitionTime":"2026-03-18T12:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.983512 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.983551 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.983570 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:06 crc kubenswrapper[4843]: E0318 12:11:06.983691 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:06 crc kubenswrapper[4843]: E0318 12:11:06.983755 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:06 crc kubenswrapper[4843]: E0318 12:11:06.983866 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:06 crc kubenswrapper[4843]: E0318 12:11:06.987789 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,Volum
eDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 12:11:06 crc kubenswrapper[4843]: E0318 12:11:06.989011 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.994916 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.994965 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.994979 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.995002 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:06 crc kubenswrapper[4843]: I0318 12:11:06.995018 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:06Z","lastTransitionTime":"2026-03-18T12:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.002380 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.021674 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.041236 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.063982 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.078303 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.091442 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.097202 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.097232 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.097242 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.097259 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.097270 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:07Z","lastTransitionTime":"2026-03-18T12:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.110011 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.121081 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.199759 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.199815 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.199829 4843 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.199850 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.199864 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:07Z","lastTransitionTime":"2026-03-18T12:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.301521 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.301557 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.301566 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.301579 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.301591 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:07Z","lastTransitionTime":"2026-03-18T12:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.404363 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.404420 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.404432 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.404452 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.404465 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:07Z","lastTransitionTime":"2026-03-18T12:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.506713 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.506772 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.506784 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.506799 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.506808 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:07Z","lastTransitionTime":"2026-03-18T12:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.609491 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.609540 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.609552 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.609566 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.609581 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:07Z","lastTransitionTime":"2026-03-18T12:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.712768 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.712846 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.712878 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.712911 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.712933 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:07Z","lastTransitionTime":"2026-03-18T12:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.816103 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.816181 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.816202 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.816225 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.816244 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:07Z","lastTransitionTime":"2026-03-18T12:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.919388 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.919422 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.919447 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.919459 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:07 crc kubenswrapper[4843]: I0318 12:11:07.919468 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:07Z","lastTransitionTime":"2026-03-18T12:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.022418 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.022455 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.022466 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.022479 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.022490 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:08Z","lastTransitionTime":"2026-03-18T12:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.125203 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.125251 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.125265 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.125281 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.125292 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:08Z","lastTransitionTime":"2026-03-18T12:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.228114 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.228177 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.228186 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.228202 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.228211 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:08Z","lastTransitionTime":"2026-03-18T12:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.330815 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.330868 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.330884 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.330906 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.330918 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:08Z","lastTransitionTime":"2026-03-18T12:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.432614 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.432647 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.432658 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.432697 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.432707 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:08Z","lastTransitionTime":"2026-03-18T12:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.535177 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.535222 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.535232 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.535246 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.535260 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:08Z","lastTransitionTime":"2026-03-18T12:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.638028 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.638079 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.638091 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.638108 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.638120 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:08Z","lastTransitionTime":"2026-03-18T12:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.740521 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.740584 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.740601 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.740624 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.740640 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:08Z","lastTransitionTime":"2026-03-18T12:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.843539 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.843592 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.843607 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.843621 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.843631 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:08Z","lastTransitionTime":"2026-03-18T12:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.945493 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.945535 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.945544 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.945558 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.945567 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:08Z","lastTransitionTime":"2026-03-18T12:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.983189 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:08 crc kubenswrapper[4843]: E0318 12:11:08.983355 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.983767 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:08 crc kubenswrapper[4843]: E0318 12:11:08.983854 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.983954 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:08 crc kubenswrapper[4843]: I0318 12:11:08.984455 4843 scope.go:117] "RemoveContainer" containerID="0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8" Mar 18 12:11:08 crc kubenswrapper[4843]: E0318 12:11:08.984419 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.048802 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.048840 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.048852 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.048868 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.048883 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:09Z","lastTransitionTime":"2026-03-18T12:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.151629 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.151692 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.151705 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.151720 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.151731 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:09Z","lastTransitionTime":"2026-03-18T12:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.254686 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.254722 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.254733 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.254748 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.254758 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:09Z","lastTransitionTime":"2026-03-18T12:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.356931 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.356984 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.356995 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.357013 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.357064 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:09Z","lastTransitionTime":"2026-03-18T12:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.460082 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.460130 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.460143 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.460159 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.460168 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:09Z","lastTransitionTime":"2026-03-18T12:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.562853 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.562925 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.562934 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.562950 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.562960 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:09Z","lastTransitionTime":"2026-03-18T12:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.666187 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.666249 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.666264 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.666282 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.666638 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:09Z","lastTransitionTime":"2026-03-18T12:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.697641 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.700123 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734"} Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.700478 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.722911 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.740248 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.762213 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.769445 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.769515 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:09 crc 
kubenswrapper[4843]: I0318 12:11:09.769528 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.769565 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.769575 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:09Z","lastTransitionTime":"2026-03-18T12:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.781220 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.811976 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.829411 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.844994 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.863232 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.872642 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.872714 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.872725 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.872742 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.872753 4843 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:09Z","lastTransitionTime":"2026-03-18T12:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.874188 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.874238 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.874256 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.874277 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.874290 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:09Z","lastTransitionTime":"2026-03-18T12:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:09 crc kubenswrapper[4843]: E0318 12:11:09.891977 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.897792 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.897827 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.897837 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.897851 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.897860 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:09Z","lastTransitionTime":"2026-03-18T12:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:09 crc kubenswrapper[4843]: E0318 12:11:09.913756 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.920528 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.920595 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.920607 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.920633 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.920645 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:09Z","lastTransitionTime":"2026-03-18T12:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:09 crc kubenswrapper[4843]: E0318 12:11:09.938401 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.945928 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.945982 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.945996 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.946016 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.946029 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:09Z","lastTransitionTime":"2026-03-18T12:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:09 crc kubenswrapper[4843]: E0318 12:11:09.961877 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.967621 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.967688 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.967702 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.967721 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.967742 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:09Z","lastTransitionTime":"2026-03-18T12:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:09 crc kubenswrapper[4843]: E0318 12:11:09.983096 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:09 crc kubenswrapper[4843]: E0318 12:11:09.983319 4843 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:11:09 crc kubenswrapper[4843]: E0318 12:11:09.985363 4843 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:09 crc kubenswrapper[4843]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:11:09 crc kubenswrapper[4843]: if [[ -f "/env/_master" ]]; then Mar 18 12:11:09 crc kubenswrapper[4843]: set -o allexport Mar 18 12:11:09 crc kubenswrapper[4843]: source "/env/_master" Mar 18 12:11:09 crc kubenswrapper[4843]: set +o allexport Mar 18 12:11:09 crc kubenswrapper[4843]: fi Mar 18 12:11:09 crc kubenswrapper[4843]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 18 12:11:09 crc kubenswrapper[4843]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 12:11:09 crc kubenswrapper[4843]: ho_enable="--enable-hybrid-overlay" Mar 18 12:11:09 crc kubenswrapper[4843]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 12:11:09 crc kubenswrapper[4843]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 12:11:09 crc kubenswrapper[4843]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 12:11:09 crc kubenswrapper[4843]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 12:11:09 crc kubenswrapper[4843]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 12:11:09 crc kubenswrapper[4843]: --webhook-host=127.0.0.1 \ Mar 18 12:11:09 crc kubenswrapper[4843]: --webhook-port=9743 \ Mar 18 12:11:09 crc kubenswrapper[4843]: ${ho_enable} \ Mar 18 12:11:09 crc kubenswrapper[4843]: --enable-interconnect \ Mar 18 12:11:09 crc kubenswrapper[4843]: --disable-approver \ Mar 18 12:11:09 crc kubenswrapper[4843]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 12:11:09 crc kubenswrapper[4843]: --wait-for-kubernetes-api=200s \ Mar 18 12:11:09 crc kubenswrapper[4843]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 12:11:09 crc kubenswrapper[4843]: --loglevel="${LOGLEVEL}" Mar 18 12:11:09 crc kubenswrapper[4843]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:09 crc kubenswrapper[4843]: > logger="UnhandledError" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.985718 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.985754 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:09 crc 
kubenswrapper[4843]: I0318 12:11:09.985764 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.985781 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:09 crc kubenswrapper[4843]: I0318 12:11:09.985791 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:09Z","lastTransitionTime":"2026-03-18T12:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:09 crc kubenswrapper[4843]: E0318 12:11:09.988677 4843 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:09 crc kubenswrapper[4843]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:11:09 crc kubenswrapper[4843]: if [[ -f "/env/_master" ]]; then Mar 18 12:11:09 crc kubenswrapper[4843]: set -o allexport Mar 18 12:11:09 crc kubenswrapper[4843]: source "/env/_master" Mar 18 12:11:09 crc kubenswrapper[4843]: set +o allexport Mar 18 12:11:09 crc kubenswrapper[4843]: fi Mar 18 12:11:09 crc kubenswrapper[4843]: Mar 18 12:11:09 crc kubenswrapper[4843]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 12:11:09 crc kubenswrapper[4843]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 12:11:09 crc kubenswrapper[4843]: --disable-webhook \ Mar 18 12:11:09 crc kubenswrapper[4843]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 12:11:09 crc kubenswrapper[4843]: 
--loglevel="${LOGLEVEL}" Mar 18 12:11:09 crc kubenswrapper[4843]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:09 crc kubenswrapper[4843]: > logger="UnhandledError" Mar 18 12:11:09 crc kubenswrapper[4843]: E0318 12:11:09.989926 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not 
yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.089612 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.089680 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.089692 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.089711 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.089726 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:10Z","lastTransitionTime":"2026-03-18T12:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.192868 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.193284 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.193373 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.193471 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.193606 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:10Z","lastTransitionTime":"2026-03-18T12:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.296687 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.297084 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.297274 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.297372 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.297455 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:10Z","lastTransitionTime":"2026-03-18T12:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.400038 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.400084 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.400095 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.400113 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.400126 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:10Z","lastTransitionTime":"2026-03-18T12:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.503086 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.503153 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.503167 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.503188 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.503206 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:10Z","lastTransitionTime":"2026-03-18T12:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.606133 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.606194 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.606217 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.606237 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.606248 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:10Z","lastTransitionTime":"2026-03-18T12:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.709308 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.709366 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.709377 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.709395 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.709413 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:10Z","lastTransitionTime":"2026-03-18T12:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.814718 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.814782 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.814800 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.814826 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.814849 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:10Z","lastTransitionTime":"2026-03-18T12:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.905079 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.905190 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.905238 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.905268 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.905296 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:10 crc kubenswrapper[4843]: E0318 12:11:10.905426 4843 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:10 crc kubenswrapper[4843]: E0318 12:11:10.905447 4843 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:10 crc kubenswrapper[4843]: E0318 12:11:10.905485 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:10 crc kubenswrapper[4843]: E0318 12:11:10.905716 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:10 crc kubenswrapper[4843]: E0318 12:11:10.905740 4843 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:10 crc kubenswrapper[4843]: E0318 12:11:10.905492 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:10 crc kubenswrapper[4843]: E0318 12:11:10.905798 4843 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:10 crc kubenswrapper[4843]: E0318 12:11:10.905811 4843 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:10 crc kubenswrapper[4843]: E0318 12:11:10.905508 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:26.90548658 +0000 UTC m=+120.621312104 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:10 crc kubenswrapper[4843]: E0318 12:11:10.905888 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:26.905869301 +0000 UTC m=+120.621694865 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:10 crc kubenswrapper[4843]: E0318 12:11:10.905915 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:26.905904212 +0000 UTC m=+120.621729736 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:10 crc kubenswrapper[4843]: E0318 12:11:10.905940 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:26.905928972 +0000 UTC m=+120.621754696 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:10 crc kubenswrapper[4843]: E0318 12:11:10.905961 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:11:26.905950983 +0000 UTC m=+120.621776507 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.917460 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.917517 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.917526 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.917543 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.917555 4843 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:10Z","lastTransitionTime":"2026-03-18T12:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.983175 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:10 crc kubenswrapper[4843]: E0318 12:11:10.983461 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.983201 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:10 crc kubenswrapper[4843]: E0318 12:11:10.983550 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:10 crc kubenswrapper[4843]: I0318 12:11:10.983181 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:10 crc kubenswrapper[4843]: E0318 12:11:10.983601 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.020737 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.020792 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.020808 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.020831 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.020847 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.123837 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.123907 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.123922 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.123957 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.123973 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.227344 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.227415 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.227439 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.227773 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.227792 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.332274 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.332324 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.332335 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.332354 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.332367 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.435947 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.436798 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.436918 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.437034 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.437128 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.540578 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.540631 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.540644 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.540698 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.540712 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.644507 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.644558 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.644570 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.644587 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.644598 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.748511 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.748568 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.748582 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.748601 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.748616 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.851523 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.851578 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.851598 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.851619 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.851633 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.955017 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.955076 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.955091 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.955112 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4843]: I0318 12:11:11.955123 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.058197 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.058251 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.058265 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.058285 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.058298 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.161421 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.161535 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.161550 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.161611 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.161624 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.264529 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.264593 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.264607 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.264635 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.264653 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.368288 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.368330 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.368339 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.368355 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.368365 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.471963 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.472008 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.472023 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.472065 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.472081 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.576191 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.576280 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.576303 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.576348 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.576361 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.681082 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.681188 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.681207 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.681238 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.681257 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.784760 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.784814 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.784828 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.784847 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.784861 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.887799 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.887842 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.887852 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.887870 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.887881 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.983821 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.983890 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.983847 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 12:11:12 crc kubenswrapper[4843]: E0318 12:11:12.984093 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 12:11:12 crc kubenswrapper[4843]: E0318 12:11:12.984208 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 12:11:12 crc kubenswrapper[4843]: E0318 12:11:12.984268 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.990167 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.990208 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.990217 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.990237 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:11:12 crc kubenswrapper[4843]: I0318 12:11:12.990249 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.093642 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.093759 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.093776 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.093796 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.093809 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:13Z","lastTransitionTime":"2026-03-18T12:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.197029 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.197103 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.197113 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.197148 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.197163 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:13Z","lastTransitionTime":"2026-03-18T12:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.300380 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.300466 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.300478 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.300523 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.300538 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:13Z","lastTransitionTime":"2026-03-18T12:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.404348 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.404413 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.404426 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.404450 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.404461 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:13Z","lastTransitionTime":"2026-03-18T12:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.507015 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.507059 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.507068 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.507087 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.507100 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:13Z","lastTransitionTime":"2026-03-18T12:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.610929 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.611000 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.611017 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.611037 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.611052 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:13Z","lastTransitionTime":"2026-03-18T12:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.713689 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.713745 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.713755 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.713772 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.713784 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:13Z","lastTransitionTime":"2026-03-18T12:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.816754 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.816802 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.816836 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.816857 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.816872 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:13Z","lastTransitionTime":"2026-03-18T12:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.920229 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.920760 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.920857 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.920939 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:13 crc kubenswrapper[4843]: I0318 12:11:13.921011 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:13Z","lastTransitionTime":"2026-03-18T12:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.024484 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.024793 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.024886 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.024973 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.025090 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.128354 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.128977 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.128997 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.129016 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.129053 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.232112 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.232200 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.232212 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.232233 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.232249 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.335773 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.335836 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.335850 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.335874 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.335892 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.439224 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.439286 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.439300 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.439331 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.439345 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.543081 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.543139 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.543150 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.543171 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.543185 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.646444 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.646495 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.646510 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.646533 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.646546 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.750232 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.750302 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.750318 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.750343 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.750364 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.853189 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.853239 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.853258 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.853281 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.853296 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.957544 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.957597 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.957609 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.957627 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.957638 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.983116 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.983193 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 12:11:14 crc kubenswrapper[4843]: I0318 12:11:14.983343 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 12:11:14 crc kubenswrapper[4843]: E0318 12:11:14.983521 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 12:11:14 crc kubenswrapper[4843]: E0318 12:11:14.983720 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 12:11:14 crc kubenswrapper[4843]: E0318 12:11:14.984017 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.060894 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.060948 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.060961 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.060985 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.061002 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.164458 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.165192 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.165307 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.165386 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.165457 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.269200 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.269255 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.269265 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.269285 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.269299 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.371643 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.371739 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.371759 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.371786 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.371807 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.475326 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.475374 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.475384 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.475403 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.475414 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.578913 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.578975 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.578988 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.579014 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.579028 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.682041 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.682085 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.682100 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.682118 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.682128 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.786350 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.786409 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.786420 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.786442 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.786454 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.890208 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.890279 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.890292 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.890312 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.890323 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.994519 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.994571 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.994582 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.994613 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:15 crc kubenswrapper[4843]: I0318 12:11:15.994628 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.103558 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.103814 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.103888 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.104046 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.104111 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:16Z","lastTransitionTime":"2026-03-18T12:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.207351 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.207394 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.207409 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.207425 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.207437 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:16Z","lastTransitionTime":"2026-03-18T12:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.310467 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.310520 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.310539 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.310560 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.310576 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:16Z","lastTransitionTime":"2026-03-18T12:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.413883 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.413931 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.413944 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.413962 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.413975 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:16Z","lastTransitionTime":"2026-03-18T12:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.516632 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.516702 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.516715 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.516732 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.516765 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:16Z","lastTransitionTime":"2026-03-18T12:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.619987 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.620061 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.620078 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.620102 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.620120 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:16Z","lastTransitionTime":"2026-03-18T12:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.726054 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.726106 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.726116 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.726134 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.726146 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:16Z","lastTransitionTime":"2026-03-18T12:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.829624 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.829695 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.829709 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.829728 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.829739 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:16Z","lastTransitionTime":"2026-03-18T12:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.933269 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.933338 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.933350 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.933370 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.933384 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:16Z","lastTransitionTime":"2026-03-18T12:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.983049 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.983134 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:16 crc kubenswrapper[4843]: I0318 12:11:16.983241 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:16 crc kubenswrapper[4843]: E0318 12:11:16.983336 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:16 crc kubenswrapper[4843]: E0318 12:11:16.983427 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:16 crc kubenswrapper[4843]: E0318 12:11:16.983501 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.008371 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.031031 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.036759 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.036817 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc 
kubenswrapper[4843]: I0318 12:11:17.036829 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.036850 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.036862 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.051967 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.073695 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.093025 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.111889 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.139418 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.139463 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.139475 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.139507 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.139524 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.140351 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.159375 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.243527 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.243592 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.243606 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.243625 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.243640 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.348205 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.348825 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.348870 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.348904 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.348917 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.453494 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.453897 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.453987 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.454112 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.454203 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.557690 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.557763 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.557779 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.557804 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.557823 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.661159 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.661217 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.661233 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.661256 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.661274 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.764243 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.764305 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.764316 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.764337 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.764351 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.838577 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-csgs2"] Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.839070 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-csgs2" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.842209 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.842582 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.842759 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.855036 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.867619 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.867692 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.867703 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.867727 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 
12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.867740 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.885938 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSt
atuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.902204 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.914699 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.927321 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.939426 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.952438 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.967771 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.970379 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.970422 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.970436 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.970458 4843 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.970473 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.986015 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r2pp\" (UniqueName: \"kubernetes.io/projected/809fa601-6b32-4585-a41a-646cc883bcd6-kube-api-access-7r2pp\") pod \"node-resolver-csgs2\" (UID: \"809fa601-6b32-4585-a41a-646cc883bcd6\") " pod="openshift-dns/node-resolver-csgs2" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.986089 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/809fa601-6b32-4585-a41a-646cc883bcd6-hosts-file\") pod \"node-resolver-csgs2\" (UID: \"809fa601-6b32-4585-a41a-646cc883bcd6\") " pod="openshift-dns/node-resolver-csgs2" Mar 18 12:11:17 crc kubenswrapper[4843]: E0318 12:11:17.986018 4843 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:17 crc kubenswrapper[4843]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 12:11:17 crc kubenswrapper[4843]: set -o allexport Mar 18 12:11:17 crc kubenswrapper[4843]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 12:11:17 crc kubenswrapper[4843]: source /etc/kubernetes/apiserver-url.env Mar 18 12:11:17 crc kubenswrapper[4843]: else Mar 18 12:11:17 crc 
kubenswrapper[4843]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 12:11:17 crc kubenswrapper[4843]: exit 1 Mar 18 12:11:17 crc kubenswrapper[4843]: fi Mar 18 12:11:17 crc kubenswrapper[4843]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 12:11:17 crc kubenswrapper[4843]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:ni
l,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:17 crc kubenswrapper[4843]: > logger="UnhandledError" Mar 18 12:11:17 crc kubenswrapper[4843]: I0318 12:11:17.987459 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4843]: E0318 12:11:17.987558 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.073373 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.073422 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.073432 4843 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.073453 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.073464 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.087698 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r2pp\" (UniqueName: \"kubernetes.io/projected/809fa601-6b32-4585-a41a-646cc883bcd6-kube-api-access-7r2pp\") pod \"node-resolver-csgs2\" (UID: \"809fa601-6b32-4585-a41a-646cc883bcd6\") " pod="openshift-dns/node-resolver-csgs2" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.087893 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/809fa601-6b32-4585-a41a-646cc883bcd6-hosts-file\") pod \"node-resolver-csgs2\" (UID: \"809fa601-6b32-4585-a41a-646cc883bcd6\") " pod="openshift-dns/node-resolver-csgs2" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.088691 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/809fa601-6b32-4585-a41a-646cc883bcd6-hosts-file\") pod \"node-resolver-csgs2\" (UID: \"809fa601-6b32-4585-a41a-646cc883bcd6\") " pod="openshift-dns/node-resolver-csgs2" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.113368 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7r2pp\" (UniqueName: \"kubernetes.io/projected/809fa601-6b32-4585-a41a-646cc883bcd6-kube-api-access-7r2pp\") pod \"node-resolver-csgs2\" (UID: \"809fa601-6b32-4585-a41a-646cc883bcd6\") " pod="openshift-dns/node-resolver-csgs2" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.152352 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-csgs2" Mar 18 12:11:18 crc kubenswrapper[4843]: W0318 12:11:18.168050 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod809fa601_6b32_4585_a41a_646cc883bcd6.slice/crio-5f6956f245be5bcdec4aa3d72ba1a596f018bcc4928b4b5ac6d49b9e82c1402e WatchSource:0}: Error finding container 5f6956f245be5bcdec4aa3d72ba1a596f018bcc4928b4b5ac6d49b9e82c1402e: Status 404 returned error can't find the container with id 5f6956f245be5bcdec4aa3d72ba1a596f018bcc4928b4b5ac6d49b9e82c1402e Mar 18 12:11:18 crc kubenswrapper[4843]: E0318 12:11:18.170997 4843 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:18 crc kubenswrapper[4843]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 18 12:11:18 crc kubenswrapper[4843]: set -uo pipefail Mar 18 12:11:18 crc kubenswrapper[4843]: Mar 18 12:11:18 crc kubenswrapper[4843]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 18 12:11:18 crc kubenswrapper[4843]: Mar 18 12:11:18 crc kubenswrapper[4843]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 18 12:11:18 crc kubenswrapper[4843]: HOSTS_FILE="/etc/hosts" Mar 18 12:11:18 crc kubenswrapper[4843]: TEMP_FILE="/etc/hosts.tmp" Mar 18 12:11:18 crc kubenswrapper[4843]: Mar 18 12:11:18 crc kubenswrapper[4843]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 18 12:11:18 crc kubenswrapper[4843]: Mar 18 12:11:18 crc 
kubenswrapper[4843]: # Make a temporary file with the old hosts file's attributes. Mar 18 12:11:18 crc kubenswrapper[4843]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 18 12:11:18 crc kubenswrapper[4843]: echo "Failed to preserve hosts file. Exiting." Mar 18 12:11:18 crc kubenswrapper[4843]: exit 1 Mar 18 12:11:18 crc kubenswrapper[4843]: fi Mar 18 12:11:18 crc kubenswrapper[4843]: Mar 18 12:11:18 crc kubenswrapper[4843]: while true; do Mar 18 12:11:18 crc kubenswrapper[4843]: declare -A svc_ips Mar 18 12:11:18 crc kubenswrapper[4843]: for svc in "${services[@]}"; do Mar 18 12:11:18 crc kubenswrapper[4843]: # Fetch service IP from cluster dns if present. We make several tries Mar 18 12:11:18 crc kubenswrapper[4843]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 18 12:11:18 crc kubenswrapper[4843]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 18 12:11:18 crc kubenswrapper[4843]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 18 12:11:18 crc kubenswrapper[4843]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 12:11:18 crc kubenswrapper[4843]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 12:11:18 crc kubenswrapper[4843]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 12:11:18 crc kubenswrapper[4843]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 18 12:11:18 crc kubenswrapper[4843]: for i in ${!cmds[*]} Mar 18 12:11:18 crc kubenswrapper[4843]: do Mar 18 12:11:18 crc kubenswrapper[4843]: ips=($(eval "${cmds[i]}")) Mar 18 12:11:18 crc kubenswrapper[4843]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 18 12:11:18 crc kubenswrapper[4843]: svc_ips["${svc}"]="${ips[@]}" Mar 18 12:11:18 crc kubenswrapper[4843]: break Mar 18 12:11:18 crc kubenswrapper[4843]: fi Mar 18 12:11:18 crc kubenswrapper[4843]: done Mar 18 12:11:18 crc kubenswrapper[4843]: done Mar 18 12:11:18 crc kubenswrapper[4843]: Mar 18 12:11:18 crc kubenswrapper[4843]: # Update /etc/hosts only if we get valid service IPs Mar 18 12:11:18 crc kubenswrapper[4843]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 18 12:11:18 crc kubenswrapper[4843]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 18 12:11:18 crc kubenswrapper[4843]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 18 12:11:18 crc kubenswrapper[4843]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 18 12:11:18 crc kubenswrapper[4843]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 18 12:11:18 crc kubenswrapper[4843]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 18 12:11:18 crc kubenswrapper[4843]: sleep 60 & wait Mar 18 12:11:18 crc kubenswrapper[4843]: continue Mar 18 12:11:18 crc kubenswrapper[4843]: fi Mar 18 12:11:18 crc kubenswrapper[4843]: Mar 18 12:11:18 crc kubenswrapper[4843]: # Append resolver entries for services Mar 18 12:11:18 crc kubenswrapper[4843]: rc=0 Mar 18 12:11:18 crc kubenswrapper[4843]: for svc in "${!svc_ips[@]}"; do Mar 18 12:11:18 crc kubenswrapper[4843]: for ip in ${svc_ips[${svc}]}; do Mar 18 12:11:18 crc kubenswrapper[4843]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 18 12:11:18 crc kubenswrapper[4843]: done Mar 18 12:11:18 crc kubenswrapper[4843]: done Mar 18 12:11:18 crc kubenswrapper[4843]: if [[ $rc -ne 0 ]]; then Mar 18 12:11:18 crc kubenswrapper[4843]: sleep 60 & wait Mar 18 12:11:18 crc kubenswrapper[4843]: continue Mar 18 12:11:18 crc kubenswrapper[4843]: fi Mar 18 12:11:18 crc kubenswrapper[4843]: Mar 18 12:11:18 crc kubenswrapper[4843]: Mar 18 12:11:18 crc kubenswrapper[4843]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 18 12:11:18 crc kubenswrapper[4843]: # Replace /etc/hosts with our modified version if needed Mar 18 12:11:18 crc kubenswrapper[4843]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 18 12:11:18 crc kubenswrapper[4843]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 18 12:11:18 crc kubenswrapper[4843]: fi Mar 18 12:11:18 crc kubenswrapper[4843]: sleep 60 & wait Mar 18 12:11:18 crc kubenswrapper[4843]: unset svc_ips Mar 18 12:11:18 crc kubenswrapper[4843]: done Mar 18 12:11:18 crc kubenswrapper[4843]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7r2pp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-csgs2_openshift-dns(809fa601-6b32-4585-a41a-646cc883bcd6): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:18 crc kubenswrapper[4843]: > logger="UnhandledError" Mar 18 12:11:18 crc kubenswrapper[4843]: E0318 12:11:18.172629 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-csgs2" podUID="809fa601-6b32-4585-a41a-646cc883bcd6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.175496 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.175537 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc 
kubenswrapper[4843]: I0318 12:11:18.175550 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.175569 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.175623 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.212238 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-wstcq"] Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.212803 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-2hn9k"] Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.212822 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.213405 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.217780 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-ls8r8"] Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.220310 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.220853 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 12:11:18 crc kubenswrapper[4843]: W0318 12:11:18.220831 4843 reflector.go:561] object-"openshift-machine-config-operator"/"proxy-tls": failed to list *v1.Secret: secrets "proxy-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Mar 18 12:11:18 crc kubenswrapper[4843]: E0318 12:11:18.220955 4843 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"proxy-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"proxy-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 12:11:18 crc kubenswrapper[4843]: W0318 12:11:18.220854 4843 reflector.go:561] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": failed to list *v1.Secret: secrets "machine-config-daemon-dockercfg-r5tcq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Mar 18 12:11:18 crc kubenswrapper[4843]: E0318 12:11:18.221044 4843 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-r5tcq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-config-daemon-dockercfg-r5tcq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship 
found between node 'crc' and this object" logger="UnhandledError" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.221119 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 12:11:18 crc kubenswrapper[4843]: W0318 12:11:18.221146 4843 reflector.go:561] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": failed to list *v1.Secret: secrets "multus-ancillary-tools-dockercfg-vnmsz" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.221195 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 12:11:18 crc kubenswrapper[4843]: E0318 12:11:18.221227 4843 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vnmsz\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"multus-ancillary-tools-dockercfg-vnmsz\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 12:11:18 crc kubenswrapper[4843]: W0318 12:11:18.221303 4843 reflector.go:561] object-"openshift-multus"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.224596 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 12:11:18 crc kubenswrapper[4843]: E0318 12:11:18.224668 4843 
reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 12:11:18 crc kubenswrapper[4843]: W0318 12:11:18.224749 4843 reflector.go:561] object-"openshift-multus"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Mar 18 12:11:18 crc kubenswrapper[4843]: E0318 12:11:18.224779 4843 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 12:11:18 crc kubenswrapper[4843]: W0318 12:11:18.224942 4843 reflector.go:561] object-"openshift-multus"/"default-cni-sysctl-allowlist": failed to list *v1.ConfigMap: configmaps "default-cni-sysctl-allowlist" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Mar 18 12:11:18 crc kubenswrapper[4843]: E0318 12:11:18.224975 4843 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"default-cni-sysctl-allowlist\" is forbidden: User \"system:node:crc\" cannot list 
resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.225507 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.228000 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.241289 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.258284 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.274373 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.279294 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.279337 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.279350 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.279370 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.279387 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.288643 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.311435 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.330476 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.343760 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.382958 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.383011 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.383024 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.383045 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 
12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.383059 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.392622 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-system-cni-dir\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.392719 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-cni-binary-copy\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.392749 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-host-var-lib-cni-multus\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.392785 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-multus-conf-dir\") pod \"multus-ls8r8\" (UID: 
\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.392825 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-host-run-netns\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.392853 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-host-var-lib-kubelet\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.392878 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c88bc92-4b87-46ed-ab45-6291502efbfe-system-cni-dir\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.392903 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8c88bc92-4b87-46ed-ab45-6291502efbfe-cni-binary-copy\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.392925 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8c88bc92-4b87-46ed-ab45-6291502efbfe-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.392949 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8c88bc92-4b87-46ed-ab45-6291502efbfe-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.392978 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-multus-cni-dir\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.393038 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-hostroot\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.393091 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-etc-kubernetes\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.393120 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-host-var-lib-cni-bin\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.393172 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5a185c4-48ac-4f51-99be-0a9418d9e53f-mcd-auth-proxy-config\") pod \"machine-config-daemon-wstcq\" (UID: \"f5a185c4-48ac-4f51-99be-0a9418d9e53f\") " pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.393201 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f5a185c4-48ac-4f51-99be-0a9418d9e53f-rootfs\") pod \"machine-config-daemon-wstcq\" (UID: \"f5a185c4-48ac-4f51-99be-0a9418d9e53f\") " pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.393228 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-multus-daemon-config\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.393284 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkw6d\" (UniqueName: \"kubernetes.io/projected/f5a185c4-48ac-4f51-99be-0a9418d9e53f-kube-api-access-wkw6d\") pod \"machine-config-daemon-wstcq\" (UID: \"f5a185c4-48ac-4f51-99be-0a9418d9e53f\") " pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.393332 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8c88bc92-4b87-46ed-ab45-6291502efbfe-os-release\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.393359 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-multus-socket-dir-parent\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.393383 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-host-run-k8s-cni-cncf-io\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.393455 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8c88bc92-4b87-46ed-ab45-6291502efbfe-cnibin\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.393480 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-host-run-multus-certs\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 
12:11:18.393517 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42zs5\" (UniqueName: \"kubernetes.io/projected/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-kube-api-access-42zs5\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.393540 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mln2h\" (UniqueName: \"kubernetes.io/projected/8c88bc92-4b87-46ed-ab45-6291502efbfe-kube-api-access-mln2h\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.393585 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-cnibin\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.393616 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-os-release\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.393638 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5a185c4-48ac-4f51-99be-0a9418d9e53f-proxy-tls\") pod \"machine-config-daemon-wstcq\" (UID: \"f5a185c4-48ac-4f51-99be-0a9418d9e53f\") " pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 
12:11:18.394473 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.486527 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.486585 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.486599 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.486621 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.486639 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.494924 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-cni-binary-copy\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.494997 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-host-var-lib-cni-multus\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495019 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-host-var-lib-kubelet\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495045 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-multus-conf-dir\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495075 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-host-run-netns\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495097 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c88bc92-4b87-46ed-ab45-6291502efbfe-system-cni-dir\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495116 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8c88bc92-4b87-46ed-ab45-6291502efbfe-cni-binary-copy\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495150 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8c88bc92-4b87-46ed-ab45-6291502efbfe-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495216 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8c88bc92-4b87-46ed-ab45-6291502efbfe-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495245 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-multus-cni-dir\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495266 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-hostroot\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495292 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-etc-kubernetes\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495359 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-host-var-lib-cni-bin\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495388 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5a185c4-48ac-4f51-99be-0a9418d9e53f-mcd-auth-proxy-config\") pod \"machine-config-daemon-wstcq\" (UID: \"f5a185c4-48ac-4f51-99be-0a9418d9e53f\") " pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495414 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f5a185c4-48ac-4f51-99be-0a9418d9e53f-rootfs\") pod \"machine-config-daemon-wstcq\" (UID: \"f5a185c4-48ac-4f51-99be-0a9418d9e53f\") " pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495443 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-multus-daemon-config\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495470 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkw6d\" (UniqueName: \"kubernetes.io/projected/f5a185c4-48ac-4f51-99be-0a9418d9e53f-kube-api-access-wkw6d\") pod \"machine-config-daemon-wstcq\" (UID: \"f5a185c4-48ac-4f51-99be-0a9418d9e53f\") " pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495510 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8c88bc92-4b87-46ed-ab45-6291502efbfe-os-release\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495536 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-multus-socket-dir-parent\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495557 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-host-run-k8s-cni-cncf-io\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495577 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/8c88bc92-4b87-46ed-ab45-6291502efbfe-cnibin\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495605 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-host-run-multus-certs\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495625 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42zs5\" (UniqueName: \"kubernetes.io/projected/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-kube-api-access-42zs5\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495647 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5a185c4-48ac-4f51-99be-0a9418d9e53f-proxy-tls\") pod \"machine-config-daemon-wstcq\" (UID: \"f5a185c4-48ac-4f51-99be-0a9418d9e53f\") " pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495697 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mln2h\" (UniqueName: \"kubernetes.io/projected/8c88bc92-4b87-46ed-ab45-6291502efbfe-kube-api-access-mln2h\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495728 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-cnibin\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495750 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-os-release\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495804 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-system-cni-dir\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.495943 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-system-cni-dir\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.496881 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.497175 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-host-var-lib-kubelet\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.497144 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-host-var-lib-cni-multus\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.497249 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-multus-conf-dir\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.497280 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-host-run-netns\") pod \"multus-ls8r8\" (UID: 
\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.497312 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c88bc92-4b87-46ed-ab45-6291502efbfe-system-cni-dir\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.497110 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-cni-binary-copy\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.498304 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8c88bc92-4b87-46ed-ab45-6291502efbfe-cni-binary-copy\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.498397 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8c88bc92-4b87-46ed-ab45-6291502efbfe-os-release\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.498458 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-multus-socket-dir-parent\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " 
pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.498492 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-host-run-k8s-cni-cncf-io\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.498522 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8c88bc92-4b87-46ed-ab45-6291502efbfe-cnibin\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.498559 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-host-run-multus-certs\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.498830 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8c88bc92-4b87-46ed-ab45-6291502efbfe-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.499052 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-multus-cni-dir\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.499181 4843 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-hostroot\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.499295 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-etc-kubernetes\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.499346 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-cnibin\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.499396 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-os-release\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.499580 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f5a185c4-48ac-4f51-99be-0a9418d9e53f-rootfs\") pod \"machine-config-daemon-wstcq\" (UID: \"f5a185c4-48ac-4f51-99be-0a9418d9e53f\") " pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.500172 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5a185c4-48ac-4f51-99be-0a9418d9e53f-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-wstcq\" (UID: \"f5a185c4-48ac-4f51-99be-0a9418d9e53f\") " pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.500243 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-host-var-lib-cni-bin\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.500642 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-multus-daemon-config\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.530022 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.534334 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkw6d\" (UniqueName: \"kubernetes.io/projected/f5a185c4-48ac-4f51-99be-0a9418d9e53f-kube-api-access-wkw6d\") pod \"machine-config-daemon-wstcq\" (UID: \"f5a185c4-48ac-4f51-99be-0a9418d9e53f\") " pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.579947 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b
26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03
a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.590463 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.590535 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.590548 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.590568 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.590580 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.608942 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.617303 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bc7c6"] Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.618346 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.622455 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.622603 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.622726 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.624124 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.624417 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.624551 4843 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.624755 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.638778 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.652884 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.669095 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.686155 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.694446 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.694534 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc 
kubenswrapper[4843]: I0318 12:11:18.694554 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.694581 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.694597 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.703240 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.723546 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.737065 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-csgs2" event={"ID":"809fa601-6b32-4585-a41a-646cc883bcd6","Type":"ContainerStarted","Data":"5f6956f245be5bcdec4aa3d72ba1a596f018bcc4928b4b5ac6d49b9e82c1402e"} Mar 18 12:11:18 crc kubenswrapper[4843]: E0318 12:11:18.738856 4843 
kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:18 crc kubenswrapper[4843]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 18 12:11:18 crc kubenswrapper[4843]: set -uo pipefail Mar 18 12:11:18 crc kubenswrapper[4843]: Mar 18 12:11:18 crc kubenswrapper[4843]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 18 12:11:18 crc kubenswrapper[4843]: Mar 18 12:11:18 crc kubenswrapper[4843]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 18 12:11:18 crc kubenswrapper[4843]: HOSTS_FILE="/etc/hosts" Mar 18 12:11:18 crc kubenswrapper[4843]: TEMP_FILE="/etc/hosts.tmp" Mar 18 12:11:18 crc kubenswrapper[4843]: Mar 18 12:11:18 crc kubenswrapper[4843]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 18 12:11:18 crc kubenswrapper[4843]: Mar 18 12:11:18 crc kubenswrapper[4843]: # Make a temporary file with the old hosts file's attributes. Mar 18 12:11:18 crc kubenswrapper[4843]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 18 12:11:18 crc kubenswrapper[4843]: echo "Failed to preserve hosts file. Exiting." Mar 18 12:11:18 crc kubenswrapper[4843]: exit 1 Mar 18 12:11:18 crc kubenswrapper[4843]: fi Mar 18 12:11:18 crc kubenswrapper[4843]: Mar 18 12:11:18 crc kubenswrapper[4843]: while true; do Mar 18 12:11:18 crc kubenswrapper[4843]: declare -A svc_ips Mar 18 12:11:18 crc kubenswrapper[4843]: for svc in "${services[@]}"; do Mar 18 12:11:18 crc kubenswrapper[4843]: # Fetch service IP from cluster dns if present. We make several tries Mar 18 12:11:18 crc kubenswrapper[4843]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 18 12:11:18 crc kubenswrapper[4843]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 18 12:11:18 crc kubenswrapper[4843]: # support UDP loadbalancers and require reaching DNS through TCP. 
Mar 18 12:11:18 crc kubenswrapper[4843]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 12:11:18 crc kubenswrapper[4843]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 12:11:18 crc kubenswrapper[4843]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 12:11:18 crc kubenswrapper[4843]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 18 12:11:18 crc kubenswrapper[4843]: for i in ${!cmds[*]} Mar 18 12:11:18 crc kubenswrapper[4843]: do Mar 18 12:11:18 crc kubenswrapper[4843]: ips=($(eval "${cmds[i]}")) Mar 18 12:11:18 crc kubenswrapper[4843]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 18 12:11:18 crc kubenswrapper[4843]: svc_ips["${svc}"]="${ips[@]}" Mar 18 12:11:18 crc kubenswrapper[4843]: break Mar 18 12:11:18 crc kubenswrapper[4843]: fi Mar 18 12:11:18 crc kubenswrapper[4843]: done Mar 18 12:11:18 crc kubenswrapper[4843]: done Mar 18 12:11:18 crc kubenswrapper[4843]: Mar 18 12:11:18 crc kubenswrapper[4843]: # Update /etc/hosts only if we get valid service IPs Mar 18 12:11:18 crc kubenswrapper[4843]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 18 12:11:18 crc kubenswrapper[4843]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 18 12:11:18 crc kubenswrapper[4843]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 18 12:11:18 crc kubenswrapper[4843]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 18 12:11:18 crc kubenswrapper[4843]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 18 12:11:18 crc kubenswrapper[4843]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 18 12:11:18 crc kubenswrapper[4843]: sleep 60 & wait Mar 18 12:11:18 crc kubenswrapper[4843]: continue Mar 18 12:11:18 crc kubenswrapper[4843]: fi Mar 18 12:11:18 crc kubenswrapper[4843]: Mar 18 12:11:18 crc kubenswrapper[4843]: # Append resolver entries for services Mar 18 12:11:18 crc kubenswrapper[4843]: rc=0 Mar 18 12:11:18 crc kubenswrapper[4843]: for svc in "${!svc_ips[@]}"; do Mar 18 12:11:18 crc kubenswrapper[4843]: for ip in ${svc_ips[${svc}]}; do Mar 18 12:11:18 crc kubenswrapper[4843]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 18 12:11:18 crc kubenswrapper[4843]: done Mar 18 12:11:18 crc kubenswrapper[4843]: done Mar 18 12:11:18 crc kubenswrapper[4843]: if [[ $rc -ne 0 ]]; then Mar 18 12:11:18 crc kubenswrapper[4843]: sleep 60 & wait Mar 18 12:11:18 crc kubenswrapper[4843]: continue Mar 18 12:11:18 crc kubenswrapper[4843]: fi Mar 18 12:11:18 crc kubenswrapper[4843]: Mar 18 12:11:18 crc kubenswrapper[4843]: Mar 18 12:11:18 crc kubenswrapper[4843]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 18 12:11:18 crc kubenswrapper[4843]: # Replace /etc/hosts with our modified version if needed Mar 18 12:11:18 crc kubenswrapper[4843]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 18 12:11:18 crc kubenswrapper[4843]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 18 12:11:18 crc kubenswrapper[4843]: fi Mar 18 12:11:18 crc kubenswrapper[4843]: sleep 60 & wait Mar 18 12:11:18 crc kubenswrapper[4843]: unset svc_ips Mar 18 12:11:18 crc kubenswrapper[4843]: done Mar 18 12:11:18 crc kubenswrapper[4843]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7r2pp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-csgs2_openshift-dns(809fa601-6b32-4585-a41a-646cc883bcd6): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:18 crc kubenswrapper[4843]: > logger="UnhandledError" Mar 18 12:11:18 crc kubenswrapper[4843]: E0318 12:11:18.741240 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-csgs2" 
podUID="809fa601-6b32-4585-a41a-646cc883bcd6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.745791 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.761087 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.777626 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.798389 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.798452 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.798461 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.798481 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.798492 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc 
kubenswrapper[4843]: I0318 12:11:18.798556 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-cni-netd\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.798582 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-kubelet\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.798492 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.798669 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-run-systemd\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.798748 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-log-socket\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.798900 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-systemd-units\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.798953 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-slash\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.798975 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-run-openvswitch\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.799011 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84n7m\" (UniqueName: \"kubernetes.io/projected/45174fd5-3a94-47fe-81c3-18bd634c4fcf-kube-api-access-84n7m\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.799060 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-cni-bin\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.799118 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-var-lib-openvswitch\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.799147 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-run-ovn\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.799236 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-node-log\") pod \"ovnkube-node-bc7c6\" (UID: 
\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.799274 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-etc-openvswitch\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.799292 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/45174fd5-3a94-47fe-81c3-18bd634c4fcf-ovn-node-metrics-cert\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.799319 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-run-ovn-kubernetes\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.799341 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/45174fd5-3a94-47fe-81c3-18bd634c4fcf-env-overrides\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.799367 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/45174fd5-3a94-47fe-81c3-18bd634c4fcf-ovnkube-script-lib\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.799413 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-run-netns\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.799431 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/45174fd5-3a94-47fe-81c3-18bd634c4fcf-ovnkube-config\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.799304 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.819260 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.835481 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.852310 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.987248 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:18 crc kubenswrapper[4843]: E0318 12:11:18.987449 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.987633 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.987772 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-cni-bin\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.987937 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-cni-bin\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.987957 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:18 crc kubenswrapper[4843]: E0318 12:11:18.988026 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:18 crc kubenswrapper[4843]: E0318 12:11:18.987776 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.988169 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-var-lib-openvswitch\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.988287 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.988311 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.988321 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 
12:11:18.988351 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-run-ovn\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.988370 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.988385 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.988416 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-var-lib-openvswitch\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.988285 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-run-ovn\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.988813 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-node-log\") pod 
\"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.988886 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-etc-openvswitch\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.988914 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/45174fd5-3a94-47fe-81c3-18bd634c4fcf-ovn-node-metrics-cert\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.988949 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-run-ovn-kubernetes\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.988982 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-etc-openvswitch\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.988999 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/45174fd5-3a94-47fe-81c3-18bd634c4fcf-env-overrides\") pod \"ovnkube-node-bc7c6\" (UID: 
\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.988945 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-node-log\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989024 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/45174fd5-3a94-47fe-81c3-18bd634c4fcf-ovnkube-script-lib\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989060 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-run-ovn-kubernetes\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989113 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-run-netns\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989135 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/45174fd5-3a94-47fe-81c3-18bd634c4fcf-ovnkube-config\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989188 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989229 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-cni-netd\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989287 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-kubelet\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989308 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-run-systemd\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989383 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-log-socket\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989417 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-run-netns\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989437 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-systemd-units\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989456 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-cni-netd\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989467 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989468 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-slash\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 
12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989637 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-run-openvswitch\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989647 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-run-systemd\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989671 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84n7m\" (UniqueName: \"kubernetes.io/projected/45174fd5-3a94-47fe-81c3-18bd634c4fcf-kube-api-access-84n7m\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989694 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-log-socket\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989697 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-slash\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989717 4843 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-kubelet\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989743 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-run-openvswitch\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.989790 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-systemd-units\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.990704 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/45174fd5-3a94-47fe-81c3-18bd634c4fcf-ovnkube-script-lib\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.990843 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/45174fd5-3a94-47fe-81c3-18bd634c4fcf-env-overrides\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:18 crc kubenswrapper[4843]: I0318 12:11:18.993095 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/45174fd5-3a94-47fe-81c3-18bd634c4fcf-ovn-node-metrics-cert\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.000185 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/45174fd5-3a94-47fe-81c3-18bd634c4fcf-ovnkube-config\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.020337 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84n7m\" (UniqueName: \"kubernetes.io/projected/45174fd5-3a94-47fe-81c3-18bd634c4fcf-kube-api-access-84n7m\") pod \"ovnkube-node-bc7c6\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.023887 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.054503 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.074030 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.091184 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.091476 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.091519 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.091530 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.091550 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.091595 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:19Z","lastTransitionTime":"2026-03-18T12:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.110451 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.125438 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.142085 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.154895 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.170759 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.190384 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.194061 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.194111 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.194124 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.194145 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.194163 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:19Z","lastTransitionTime":"2026-03-18T12:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.234401 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:19 crc kubenswrapper[4843]: E0318 12:11:19.252667 4843 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:19 crc kubenswrapper[4843]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 18 12:11:19 crc kubenswrapper[4843]: apiVersion: v1 Mar 18 12:11:19 crc kubenswrapper[4843]: clusters: Mar 18 12:11:19 crc kubenswrapper[4843]: - cluster: Mar 18 12:11:19 crc kubenswrapper[4843]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 18 12:11:19 crc kubenswrapper[4843]: server: https://api-int.crc.testing:6443 Mar 18 12:11:19 crc kubenswrapper[4843]: name: default-cluster Mar 18 12:11:19 crc kubenswrapper[4843]: contexts: Mar 18 12:11:19 crc kubenswrapper[4843]: - context: Mar 18 12:11:19 crc kubenswrapper[4843]: cluster: default-cluster Mar 18 12:11:19 crc kubenswrapper[4843]: namespace: default Mar 18 12:11:19 crc kubenswrapper[4843]: user: default-auth Mar 18 12:11:19 crc kubenswrapper[4843]: name: default-context Mar 18 12:11:19 crc kubenswrapper[4843]: current-context: default-context Mar 18 12:11:19 crc kubenswrapper[4843]: kind: Config Mar 18 12:11:19 crc kubenswrapper[4843]: preferences: {} Mar 18 12:11:19 crc kubenswrapper[4843]: users: Mar 18 12:11:19 crc kubenswrapper[4843]: - name: default-auth Mar 18 12:11:19 crc kubenswrapper[4843]: user: Mar 18 12:11:19 crc kubenswrapper[4843]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 12:11:19 crc kubenswrapper[4843]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 12:11:19 crc kubenswrapper[4843]: EOF Mar 18 12:11:19 crc kubenswrapper[4843]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-84n7m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-bc7c6_openshift-ovn-kubernetes(45174fd5-3a94-47fe-81c3-18bd634c4fcf): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:19 crc kubenswrapper[4843]: > logger="UnhandledError" Mar 18 12:11:19 crc kubenswrapper[4843]: E0318 12:11:19.253864 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.297605 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.298140 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.298211 4843 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.298232 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.298245 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:19Z","lastTransitionTime":"2026-03-18T12:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.402007 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.402128 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.402144 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.402172 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.402299 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:19Z","lastTransitionTime":"2026-03-18T12:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:19 crc kubenswrapper[4843]: E0318 12:11:19.499676 4843 configmap.go:193] Couldn't get configMap openshift-multus/default-cni-sysctl-allowlist: failed to sync configmap cache: timed out waiting for the condition Mar 18 12:11:19 crc kubenswrapper[4843]: E0318 12:11:19.499703 4843 secret.go:188] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 18 12:11:19 crc kubenswrapper[4843]: E0318 12:11:19.500150 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8c88bc92-4b87-46ed-ab45-6291502efbfe-cni-sysctl-allowlist podName:8c88bc92-4b87-46ed-ab45-6291502efbfe nodeName:}" failed. No retries permitted until 2026-03-18 12:11:20.000126115 +0000 UTC m=+113.715951639 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/8c88bc92-4b87-46ed-ab45-6291502efbfe-cni-sysctl-allowlist") pod "multus-additional-cni-plugins-2hn9k" (UID: "8c88bc92-4b87-46ed-ab45-6291502efbfe") : failed to sync configmap cache: timed out waiting for the condition Mar 18 12:11:19 crc kubenswrapper[4843]: E0318 12:11:19.500369 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5a185c4-48ac-4f51-99be-0a9418d9e53f-proxy-tls podName:f5a185c4-48ac-4f51-99be-0a9418d9e53f nodeName:}" failed. No retries permitted until 2026-03-18 12:11:20.00030853 +0000 UTC m=+113.716134064 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f5a185c4-48ac-4f51-99be-0a9418d9e53f-proxy-tls") pod "machine-config-daemon-wstcq" (UID: "f5a185c4-48ac-4f51-99be-0a9418d9e53f") : failed to sync secret cache: timed out waiting for the condition Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.505035 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.505194 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.505285 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.505366 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.505455 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:19Z","lastTransitionTime":"2026-03-18T12:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:19 crc kubenswrapper[4843]: E0318 12:11:19.522425 4843 projected.go:288] Couldn't get configMap openshift-multus/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 12:11:19 crc kubenswrapper[4843]: E0318 12:11:19.526639 4843 projected.go:288] Couldn't get configMap openshift-multus/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.613968 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.614058 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.614081 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.614108 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.614131 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:19Z","lastTransitionTime":"2026-03-18T12:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.639271 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.641710 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 12:11:19 crc kubenswrapper[4843]: E0318 12:11:19.642630 4843 projected.go:194] Error preparing data for projected volume kube-api-access-mln2h for pod openshift-multus/multus-additional-cni-plugins-2hn9k: failed to sync configmap cache: timed out waiting for the condition Mar 18 12:11:19 crc kubenswrapper[4843]: E0318 12:11:19.642863 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c88bc92-4b87-46ed-ab45-6291502efbfe-kube-api-access-mln2h podName:8c88bc92-4b87-46ed-ab45-6291502efbfe nodeName:}" failed. No retries permitted until 2026-03-18 12:11:20.142839896 +0000 UTC m=+113.858665410 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mln2h" (UniqueName: "kubernetes.io/projected/8c88bc92-4b87-46ed-ab45-6291502efbfe-kube-api-access-mln2h") pod "multus-additional-cni-plugins-2hn9k" (UID: "8c88bc92-4b87-46ed-ab45-6291502efbfe") : failed to sync configmap cache: timed out waiting for the condition Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.643638 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 12:11:19 crc kubenswrapper[4843]: E0318 12:11:19.647030 4843 projected.go:194] Error preparing data for projected volume kube-api-access-42zs5 for pod openshift-multus/multus-ls8r8: failed to sync configmap cache: timed out waiting for the condition Mar 18 12:11:19 crc kubenswrapper[4843]: E0318 12:11:19.647205 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-kube-api-access-42zs5 podName:1aa16ddb-306b-4e37-a33a-b9cdce3c254e nodeName:}" failed. No retries permitted until 2026-03-18 12:11:20.147166694 +0000 UTC m=+113.862992408 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-42zs5" (UniqueName: "kubernetes.io/projected/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-kube-api-access-42zs5") pod "multus-ls8r8" (UID: "1aa16ddb-306b-4e37-a33a-b9cdce3c254e") : failed to sync configmap cache: timed out waiting for the condition Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.717812 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.717920 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.717930 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.717950 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.717961 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:19Z","lastTransitionTime":"2026-03-18T12:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.742755 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerStarted","Data":"aedfbd173c27261b62efec95c11641890eca11561d631f6c5fb684303fbdb259"} Mar 18 12:11:19 crc kubenswrapper[4843]: E0318 12:11:19.744669 4843 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:19 crc kubenswrapper[4843]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 18 12:11:19 crc kubenswrapper[4843]: apiVersion: v1 Mar 18 12:11:19 crc kubenswrapper[4843]: clusters: Mar 18 12:11:19 crc kubenswrapper[4843]: - cluster: Mar 18 12:11:19 crc kubenswrapper[4843]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 18 12:11:19 crc kubenswrapper[4843]: server: https://api-int.crc.testing:6443 Mar 18 12:11:19 crc kubenswrapper[4843]: name: default-cluster Mar 18 12:11:19 crc kubenswrapper[4843]: contexts: Mar 18 12:11:19 crc kubenswrapper[4843]: - context: Mar 18 12:11:19 crc kubenswrapper[4843]: cluster: default-cluster Mar 18 12:11:19 crc kubenswrapper[4843]: namespace: default Mar 18 12:11:19 crc kubenswrapper[4843]: user: default-auth Mar 18 12:11:19 crc kubenswrapper[4843]: name: default-context Mar 18 12:11:19 crc kubenswrapper[4843]: current-context: default-context Mar 18 12:11:19 crc kubenswrapper[4843]: kind: Config Mar 18 12:11:19 crc kubenswrapper[4843]: preferences: {} Mar 18 12:11:19 crc kubenswrapper[4843]: users: Mar 18 12:11:19 crc kubenswrapper[4843]: - name: default-auth Mar 18 12:11:19 crc kubenswrapper[4843]: user: Mar 18 12:11:19 crc kubenswrapper[4843]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 12:11:19 crc 
kubenswrapper[4843]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 12:11:19 crc kubenswrapper[4843]: EOF Mar 18 12:11:19 crc kubenswrapper[4843]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-84n7m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-bc7c6_openshift-ovn-kubernetes(45174fd5-3a94-47fe-81c3-18bd634c4fcf): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:19 crc kubenswrapper[4843]: > logger="UnhandledError" Mar 18 12:11:19 crc kubenswrapper[4843]: E0318 12:11:19.745942 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.750257 4843 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.760940 4843 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.769968 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.773384 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.784257 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.798178 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.804907 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.814956 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.820378 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.820409 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.820419 4843 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.820436 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.820447 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:19Z","lastTransitionTime":"2026-03-18T12:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.824308 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.850706 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.871500 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.888935 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.902427 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.923822 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.923877 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.923886 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.923899 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.923909 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:19Z","lastTransitionTime":"2026-03-18T12:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.928909 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.946412 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.964826 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:19 crc kubenswrapper[4843]: I0318 12:11:19.983428 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.001338 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5a185c4-48ac-4f51-99be-0a9418d9e53f-proxy-tls\") pod \"machine-config-daemon-wstcq\" (UID: \"f5a185c4-48ac-4f51-99be-0a9418d9e53f\") " pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.001443 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8c88bc92-4b87-46ed-ab45-6291502efbfe-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.002172 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8c88bc92-4b87-46ed-ab45-6291502efbfe-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.006473 4843 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5a185c4-48ac-4f51-99be-0a9418d9e53f-proxy-tls\") pod \"machine-config-daemon-wstcq\" (UID: \"f5a185c4-48ac-4f51-99be-0a9418d9e53f\") " pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.027119 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.027180 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.027193 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.027208 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.027220 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.033844 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.125482 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.125534 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.125547 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.125565 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.125582 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4843]: E0318 12:11:20.145664 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.151123 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.151199 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.151214 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.151243 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.151256 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4843]: E0318 12:11:20.167902 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.173296 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.173358 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.173371 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.173391 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.173405 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4843]: E0318 12:11:20.189009 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.199035 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.199092 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.199106 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.199131 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.199206 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.205101 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mln2h\" (UniqueName: \"kubernetes.io/projected/8c88bc92-4b87-46ed-ab45-6291502efbfe-kube-api-access-mln2h\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.205238 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42zs5\" (UniqueName: \"kubernetes.io/projected/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-kube-api-access-42zs5\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.210396 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42zs5\" (UniqueName: \"kubernetes.io/projected/1aa16ddb-306b-4e37-a33a-b9cdce3c254e-kube-api-access-42zs5\") pod \"multus-ls8r8\" (UID: \"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\") " pod="openshift-multus/multus-ls8r8" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.210887 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mln2h\" (UniqueName: \"kubernetes.io/projected/8c88bc92-4b87-46ed-ab45-6291502efbfe-kube-api-access-mln2h\") pod \"multus-additional-cni-plugins-2hn9k\" (UID: \"8c88bc92-4b87-46ed-ab45-6291502efbfe\") " pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:20 crc kubenswrapper[4843]: E0318 12:11:20.214966 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.220237 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.220294 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.220305 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.220323 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.220335 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4843]: E0318 12:11:20.235207 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:20 crc kubenswrapper[4843]: E0318 12:11:20.235382 4843 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.238976 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.239034 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.239047 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.239071 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.239085 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.342753 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.343176 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.342942 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.343188 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.343319 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.343342 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.352782 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-ls8r8" Mar 18 12:11:20 crc kubenswrapper[4843]: W0318 12:11:20.367613 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1aa16ddb_306b_4e37_a33a_b9cdce3c254e.slice/crio-722517c3367b23aa4bcced25c6af2dc6d46d94cbb79ebf5dae1fbbe77ce57e84 WatchSource:0}: Error finding container 722517c3367b23aa4bcced25c6af2dc6d46d94cbb79ebf5dae1fbbe77ce57e84: Status 404 returned error can't find the container with id 722517c3367b23aa4bcced25c6af2dc6d46d94cbb79ebf5dae1fbbe77ce57e84 Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.447346 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.447402 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.447418 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.447438 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.447453 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.550486 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.550525 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.550535 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.550561 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.550570 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.654074 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.654112 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.654122 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.654158 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.654169 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.746852 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" event={"ID":"8c88bc92-4b87-46ed-ab45-6291502efbfe","Type":"ContainerStarted","Data":"8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22"} Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.746904 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" event={"ID":"8c88bc92-4b87-46ed-ab45-6291502efbfe","Type":"ContainerStarted","Data":"c6638f3d1eb7f616a9aded8f740209333222d20b3911b7055efaf8b67655f18c"} Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.749415 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0"} Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.749512 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df70830e0f112a5a3efc115651"} Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.749528 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"746bf7ee57e542453b2702dbc9d6593347586c62a1b2affc7e891713508dbda6"} Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.750668 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ls8r8" 
event={"ID":"1aa16ddb-306b-4e37-a33a-b9cdce3c254e","Type":"ContainerStarted","Data":"ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d"} Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.750705 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ls8r8" event={"ID":"1aa16ddb-306b-4e37-a33a-b9cdce3c254e","Type":"ContainerStarted","Data":"722517c3367b23aa4bcced25c6af2dc6d46d94cbb79ebf5dae1fbbe77ce57e84"} Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.756724 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.756771 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.756781 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.756796 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.756819 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.774903 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.797072 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.814752 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.832432 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.859564 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.859610 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.859623 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.859644 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.859674 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.861498 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.878114 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.896043 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.911376 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.929263 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.944707 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.956207 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.962065 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.962130 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.962142 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.962161 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.962177 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.968845 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.983107 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:20 crc kubenswrapper[4843]: E0318 12:11:20.983233 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.983117 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:20 crc kubenswrapper[4843]: E0318 12:11:20.983483 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:20 crc kubenswrapper[4843]: I0318 12:11:20.983532 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:20 crc kubenswrapper[4843]: E0318 12:11:20.983761 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.064912 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.064961 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.064971 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.064986 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.064998 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:21Z","lastTransitionTime":"2026-03-18T12:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.167749 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.167797 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.167810 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.167829 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.167842 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:21Z","lastTransitionTime":"2026-03-18T12:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.270069 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.270098 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.270106 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.270119 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.270129 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:21Z","lastTransitionTime":"2026-03-18T12:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.316522 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.372179 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.372222 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.372234 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.372249 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.372261 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:21Z","lastTransitionTime":"2026-03-18T12:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.605261 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.605296 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.605305 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.605321 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.605330 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:21Z","lastTransitionTime":"2026-03-18T12:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.618190 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.632550 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.647750 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.672859 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0
fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\
\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.691609 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e51
13f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.707707 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.707749 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.707760 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.707778 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.707790 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:21Z","lastTransitionTime":"2026-03-18T12:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.709489 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.724684 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.745378 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.753912 4843 generic.go:334] "Generic (PLEG): container finished" podID="8c88bc92-4b87-46ed-ab45-6291502efbfe" containerID="8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22" exitCode=0 Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.753950 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" event={"ID":"8c88bc92-4b87-46ed-ab45-6291502efbfe","Type":"ContainerDied","Data":"8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22"} Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.782130 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.803826 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.810087 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.810129 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.810141 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.810157 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.810170 4843 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:21Z","lastTransitionTime":"2026-03-18T12:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.818476 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:
11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.831038 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.844952 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.855450 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.868816 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.883911 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.897833 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.911931 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.913903 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.913943 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.913957 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.913976 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.913989 4843 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:21Z","lastTransitionTime":"2026-03-18T12:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.934389 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.951493 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.970709 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4843]: I0318 12:11:21.984762 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.010104 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.018079 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.018123 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.018135 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.018154 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.018166 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:22Z","lastTransitionTime":"2026-03-18T12:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.031603 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.048857 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.071201 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.101191 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.127571 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.127634 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.127644 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.127691 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.127706 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:22Z","lastTransitionTime":"2026-03-18T12:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.373263 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.373294 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.373303 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.373316 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.373324 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:22Z","lastTransitionTime":"2026-03-18T12:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.476742 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.476780 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.476791 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.476808 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.476819 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:22Z","lastTransitionTime":"2026-03-18T12:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.580060 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.580134 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.580146 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.580171 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.580186 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:22Z","lastTransitionTime":"2026-03-18T12:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.683668 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.684235 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.684252 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.684277 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.684291 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:22Z","lastTransitionTime":"2026-03-18T12:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.758719 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" event={"ID":"8c88bc92-4b87-46ed-ab45-6291502efbfe","Type":"ContainerStarted","Data":"87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d"} Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.761075 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d"} Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.761110 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"232e09525e14d6827ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0"} Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.776780 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.786244 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.786289 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.786299 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.786334 4843 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.786345 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:22Z","lastTransitionTime":"2026-03-18T12:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.809898 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47
ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":fals
e,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.860257 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847
b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd0
3a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.881888 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e51
13f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.897949 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.898565 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.898749 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.898771 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 
12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.898789 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.898801 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:22Z","lastTransitionTime":"2026-03-18T12:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.915432 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.948695 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.971171 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.983426 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.983599 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.983843 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:22 crc kubenswrapper[4843]: E0318 12:11:22.984058 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:22 crc kubenswrapper[4843]: E0318 12:11:22.983604 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:22 crc kubenswrapper[4843]: E0318 12:11:22.983862 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:22 crc kubenswrapper[4843]: I0318 12:11:22.992439 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.004209 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.004280 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.004295 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.004315 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.004335 4843 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:23Z","lastTransitionTime":"2026-03-18T12:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.012125 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.033578 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.050133 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.072705 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.091820 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.107093 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.107130 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.107140 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.107153 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.107163 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:23Z","lastTransitionTime":"2026-03-18T12:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.114500 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.146059 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.165062 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.184190 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.199899 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.209181 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.209214 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.209226 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:23 crc 
kubenswrapper[4843]: I0318 12:11:23.209242 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.209253 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:23Z","lastTransitionTime":"2026-03-18T12:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.222500 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.244278 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.259641 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.276980 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.293718 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.311632 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.311686 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.311696 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.311708 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.311718 4843 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:23Z","lastTransitionTime":"2026-03-18T12:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.313978 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/
ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d6827ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.326341 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.414745 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.415148 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.415159 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.415179 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.415190 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:23Z","lastTransitionTime":"2026-03-18T12:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.517888 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.517934 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.517972 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.517993 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.518006 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:23Z","lastTransitionTime":"2026-03-18T12:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.620226 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.620279 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.620292 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.620320 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.620332 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:23Z","lastTransitionTime":"2026-03-18T12:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.724045 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.724089 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.724107 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.724127 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.724139 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:23Z","lastTransitionTime":"2026-03-18T12:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.765145 4843 generic.go:334] "Generic (PLEG): container finished" podID="8c88bc92-4b87-46ed-ab45-6291502efbfe" containerID="87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d" exitCode=0 Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.765192 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" event={"ID":"8c88bc92-4b87-46ed-ab45-6291502efbfe","Type":"ContainerDied","Data":"87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d"} Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.789406 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.813149 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.826741 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.826789 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.826810 4843 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.826829 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.826841 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:23Z","lastTransitionTime":"2026-03-18T12:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.833484 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b8
9c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"sta
te\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for 
mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.854906 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.871279 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.902013 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.929092 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.929135 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.929148 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.929166 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.929179 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:23Z","lastTransitionTime":"2026-03-18T12:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.929442 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.947987 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.964119 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.982878 4843 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:23 crc kubenswrapper[4843]: I0318 12:11:23.999443 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:23Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.022062 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d6827ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:24Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.031836 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.031864 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.031872 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.031884 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.031893 4843 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.036971 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:24Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.134094 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.134149 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.134161 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.134176 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.134188 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.237291 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.237333 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.237342 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.237356 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.237369 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.340235 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.340277 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.340286 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.340298 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.340308 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.442616 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.442707 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.442722 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.442743 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.442756 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.546102 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.546134 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.546146 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.546162 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.546175 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.649151 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.649205 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.649219 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.649239 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.649252 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.751966 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.752003 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.752012 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.752026 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.752036 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.769326 4843 generic.go:334] "Generic (PLEG): container finished" podID="8c88bc92-4b87-46ed-ab45-6291502efbfe" containerID="ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b" exitCode=0 Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.769384 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" event={"ID":"8c88bc92-4b87-46ed-ab45-6291502efbfe","Type":"ContainerDied","Data":"ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b"} Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.790459 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:24Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.811060 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d6827ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:24Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.826688 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:24Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.845723 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:24Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.854393 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.854431 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.854442 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc 
kubenswrapper[4843]: I0318 12:11:24.854457 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.854467 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.865034 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9f
d79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:24Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.893249 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9e
fd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:24Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.912516 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e51
13f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:24Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.929190 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:24Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.944040 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:24Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.957068 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.957107 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.957117 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc 
kubenswrapper[4843]: I0318 12:11:24.957133 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.957145 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.967017 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:24Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.983241 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:24Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.983290 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.983360 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:24 crc kubenswrapper[4843]: I0318 12:11:24.983316 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:24 crc kubenswrapper[4843]: E0318 12:11:24.983461 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:24 crc kubenswrapper[4843]: E0318 12:11:24.983519 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:24 crc kubenswrapper[4843]: E0318 12:11:24.983573 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.000258 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:24Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.020880 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.063317 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc 
kubenswrapper[4843]: I0318 12:11:25.063694 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.063838 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.063960 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.064048 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.166992 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.167037 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.167050 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.167068 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.167079 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.270756 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.271252 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.271264 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.271287 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.271305 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.374195 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.374240 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.374252 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.374268 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.374281 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.451033 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-tk5kw"] Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.451989 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-tk5kw" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.455298 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.456793 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.456858 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.457264 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.472920 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.477013 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.477056 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.477070 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 
12:11:25.477089 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.477104 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.492530 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.515838 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.527008 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/8e2b55bb-76e9-4e93-95fd-063957b71f6b-serviceca\") pod \"node-ca-tk5kw\" (UID: \"8e2b55bb-76e9-4e93-95fd-063957b71f6b\") " pod="openshift-image-registry/node-ca-tk5kw" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.527061 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtlpv\" (UniqueName: \"kubernetes.io/projected/8e2b55bb-76e9-4e93-95fd-063957b71f6b-kube-api-access-vtlpv\") pod \"node-ca-tk5kw\" (UID: \"8e2b55bb-76e9-4e93-95fd-063957b71f6b\") " pod="openshift-image-registry/node-ca-tk5kw" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.527223 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e2b55bb-76e9-4e93-95fd-063957b71f6b-host\") pod \"node-ca-tk5kw\" (UID: \"8e2b55bb-76e9-4e93-95fd-063957b71f6b\") " pod="openshift-image-registry/node-ca-tk5kw" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.533224 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d6827ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.547570 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.563425 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.579990 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.580041 4843 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.580052 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.580067 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.580078 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.584125 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.601065 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 
12:11:25.616988 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.628182 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e2b55bb-76e9-4e93-95fd-063957b71f6b-host\") pod \"node-ca-tk5kw\" (UID: \"8e2b55bb-76e9-4e93-95fd-063957b71f6b\") " pod="openshift-image-registry/node-ca-tk5kw" Mar 18 12:11:25 crc 
kubenswrapper[4843]: I0318 12:11:25.628266 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8e2b55bb-76e9-4e93-95fd-063957b71f6b-serviceca\") pod \"node-ca-tk5kw\" (UID: \"8e2b55bb-76e9-4e93-95fd-063957b71f6b\") " pod="openshift-image-registry/node-ca-tk5kw" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.628304 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtlpv\" (UniqueName: \"kubernetes.io/projected/8e2b55bb-76e9-4e93-95fd-063957b71f6b-kube-api-access-vtlpv\") pod \"node-ca-tk5kw\" (UID: \"8e2b55bb-76e9-4e93-95fd-063957b71f6b\") " pod="openshift-image-registry/node-ca-tk5kw" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.629579 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8e2b55bb-76e9-4e93-95fd-063957b71f6b-serviceca\") pod \"node-ca-tk5kw\" (UID: \"8e2b55bb-76e9-4e93-95fd-063957b71f6b\") " pod="openshift-image-registry/node-ca-tk5kw" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.629703 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e2b55bb-76e9-4e93-95fd-063957b71f6b-host\") pod \"node-ca-tk5kw\" (UID: \"8e2b55bb-76e9-4e93-95fd-063957b71f6b\") " pod="openshift-image-registry/node-ca-tk5kw" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.663131 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.684225 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.684297 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.684312 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.684329 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.684342 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.685171 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtlpv\" (UniqueName: \"kubernetes.io/projected/8e2b55bb-76e9-4e93-95fd-063957b71f6b-kube-api-access-vtlpv\") pod \"node-ca-tk5kw\" (UID: \"8e2b55bb-76e9-4e93-95fd-063957b71f6b\") " pod="openshift-image-registry/node-ca-tk5kw" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.693328 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.727450 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.751063 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.771946 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-tk5kw" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.772358 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.777522 4843 generic.go:334] "Generic (PLEG): container finished" podID="8c88bc92-4b87-46ed-ab45-6291502efbfe" containerID="c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec" exitCode=0 Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.777591 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" event={"ID":"8c88bc92-4b87-46ed-ab45-6291502efbfe","Type":"ContainerDied","Data":"c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec"} Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.786585 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.786630 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 
18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.786641 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.786676 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.786690 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4843]: W0318 12:11:25.787423 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e2b55bb_76e9_4e93_95fd_063957b71f6b.slice/crio-13c9c18dd3f9560d34b798b2e6ac3536550ed68b48d42fbad84c123ffcf42727 WatchSource:0}: Error finding container 13c9c18dd3f9560d34b798b2e6ac3536550ed68b48d42fbad84c123ffcf42727: Status 404 returned error can't find the container with id 13c9c18dd3f9560d34b798b2e6ac3536550ed68b48d42fbad84c123ffcf42727 Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.807316 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.820768 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.866152 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.885271 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.889015 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.889041 4843 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.889048 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.889064 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.889073 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.903611 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.919229 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.940784 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.958016 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.979575 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.992474 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc 
kubenswrapper[4843]: I0318 12:11:25.992551 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.992565 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.992587 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.992603 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4843]: I0318 12:11:25.997550 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.014617 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:26Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.034714 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d6827ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:26Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.053487 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:26Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.076991 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:26Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.096026 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.096059 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.096070 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.096088 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.096103 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:26Z","lastTransitionTime":"2026-03-18T12:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.198384 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.198432 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.198444 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.198460 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.198471 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:26Z","lastTransitionTime":"2026-03-18T12:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.301481 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.301520 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.301537 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.301554 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.301566 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:26Z","lastTransitionTime":"2026-03-18T12:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.405490 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.405523 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.405533 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.405549 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.405558 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:26Z","lastTransitionTime":"2026-03-18T12:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.507987 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.508037 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.508048 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.508066 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.508078 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:26Z","lastTransitionTime":"2026-03-18T12:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.611293 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.611371 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.611385 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.611407 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.611421 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:26Z","lastTransitionTime":"2026-03-18T12:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.714778 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.714815 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.714823 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.714836 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.714846 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:26Z","lastTransitionTime":"2026-03-18T12:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.783052 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tk5kw" event={"ID":"8e2b55bb-76e9-4e93-95fd-063957b71f6b","Type":"ContainerStarted","Data":"0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2"} Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.783124 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tk5kw" event={"ID":"8e2b55bb-76e9-4e93-95fd-063957b71f6b","Type":"ContainerStarted","Data":"13c9c18dd3f9560d34b798b2e6ac3536550ed68b48d42fbad84c123ffcf42727"} Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.785467 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f"} Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.788908 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" event={"ID":"8c88bc92-4b87-46ed-ab45-6291502efbfe","Type":"ContainerStarted","Data":"be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d"} Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.801566 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:26Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.823865 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.823925 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.823937 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.823958 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.823968 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:26Z","lastTransitionTime":"2026-03-18T12:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.826533 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:26Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.846578 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:26Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.866296 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:26Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.889771 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d6827ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:26Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.908125 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:26Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:26 crc kubenswrapper[4843]: E0318 12:11:26.924204 4843 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.927584 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:26Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.949406 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:26Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.952029 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.952162 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.952207 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.952244 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.952281 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:26 crc kubenswrapper[4843]: E0318 12:11:26.952480 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:26 crc kubenswrapper[4843]: E0318 12:11:26.952514 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:26 crc kubenswrapper[4843]: E0318 12:11:26.952529 4843 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:26 crc kubenswrapper[4843]: E0318 12:11:26.952563 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:26 crc kubenswrapper[4843]: E0318 12:11:26.952595 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-18 12:11:58.952568766 +0000 UTC m=+152.668394290 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:26 crc kubenswrapper[4843]: E0318 12:11:26.952607 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:26 crc kubenswrapper[4843]: E0318 12:11:26.952604 4843 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:26 crc kubenswrapper[4843]: E0318 12:11:26.952627 4843 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:26 crc kubenswrapper[4843]: E0318 12:11:26.952791 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:58.952753341 +0000 UTC m=+152.668579025 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:26 crc kubenswrapper[4843]: E0318 12:11:26.952573 4843 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:26 crc kubenswrapper[4843]: E0318 12:11:26.952862 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:58.952834103 +0000 UTC m=+152.668659787 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:26 crc kubenswrapper[4843]: E0318 12:11:26.952953 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:11:58.952882725 +0000 UTC m=+152.668708259 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:11:26 crc kubenswrapper[4843]: E0318 12:11:26.953033 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:58.953022979 +0000 UTC m=+152.668848503 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.981326 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:26Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.983489 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:26 crc kubenswrapper[4843]: E0318 12:11:26.983673 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.984007 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:26 crc kubenswrapper[4843]: I0318 12:11:26.984149 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:26 crc kubenswrapper[4843]: E0318 12:11:26.984157 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:26 crc kubenswrapper[4843]: E0318 12:11:26.984382 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.001967 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e51
13f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:26Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.020982 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.039671 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: E0318 12:11:27.057007 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.077874 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.093583 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.112421 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.130074 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.143694 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.165688 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.182717 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.202714 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.220112 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.247388 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.264218 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.294985 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.320887 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.340032 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.358109 4843 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.379405 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.395859 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.412976 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.434396 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.453461 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.474952 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln
2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.502977 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687744
1ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9b
e8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.522386 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e51
13f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.538051 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.554472 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.578947 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.595401 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.612314 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.629738 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.646410 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.806373 4843 generic.go:334] "Generic (PLEG): container finished" podID="8c88bc92-4b87-46ed-ab45-6291502efbfe" 
containerID="be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d" exitCode=0 Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.806461 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" event={"ID":"8c88bc92-4b87-46ed-ab45-6291502efbfe","Type":"ContainerDied","Data":"be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d"} Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.826533 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-scr
ipt\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.849842 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d6827ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 
2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.864830 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.882419 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.902587 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.929637 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{
\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.950990 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e51
13f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.970244 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:27 crc kubenswrapper[4843]: I0318 12:11:27.988937 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.016819 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:28Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.035087 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:28Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.053870 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:28Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.075459 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:28Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.094098 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:28Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.813596 4843 generic.go:334] "Generic (PLEG): container finished" podID="8c88bc92-4b87-46ed-ab45-6291502efbfe" 
containerID="595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3" exitCode=0 Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.813665 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" event={"ID":"8c88bc92-4b87-46ed-ab45-6291502efbfe","Type":"ContainerDied","Data":"595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3"} Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.832769 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:28Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.850876 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:28Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.869927 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:28Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.886456 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d6827ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:28Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.901034 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:28Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.914964 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:11:28Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.932160 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:28Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.952531 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9f
d79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:28Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.968545 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:28Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.983608 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.983672 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.983629 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:28 crc kubenswrapper[4843]: E0318 12:11:28.983772 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:28 crc kubenswrapper[4843]: E0318 12:11:28.983838 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:28 crc kubenswrapper[4843]: E0318 12:11:28.983945 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:28 crc kubenswrapper[4843]: I0318 12:11:28.990240 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:28Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:29 crc kubenswrapper[4843]: I0318 12:11:29.003361 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:29Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:29 crc kubenswrapper[4843]: I0318 12:11:29.028842 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:29Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:29 crc kubenswrapper[4843]: I0318 12:11:29.045121 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:29Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:29 crc kubenswrapper[4843]: I0318 12:11:29.060643 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:29Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:29 crc kubenswrapper[4843]: I0318 12:11:29.818412 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" event={"ID":"8c88bc92-4b87-46ed-ab45-6291502efbfe","Type":"ContainerStarted","Data":"75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029"} Mar 18 12:11:29 crc kubenswrapper[4843]: I0318 12:11:29.838008 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:29Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:29 crc kubenswrapper[4843]: I0318 12:11:29.859424 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:29Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:29 crc kubenswrapper[4843]: I0318 12:11:29.876153 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:29Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:29 crc kubenswrapper[4843]: I0318 12:11:29.892048 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T12:11:29Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:29 crc kubenswrapper[4843]: I0318 12:11:29.908327 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/232e09525e14d6827ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:29Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:29 crc kubenswrapper[4843]: I0318 12:11:29.921916 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:29Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:29 crc kubenswrapper[4843]: I0318 12:11:29.938242 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:29Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:29 crc kubenswrapper[4843]: I0318 12:11:29.959147 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:29Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:29 crc kubenswrapper[4843]: I0318 12:11:29.973352 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:29Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:29 crc kubenswrapper[4843]: I0318 12:11:29.994577 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:29.999982 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:29Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.020675 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.043141 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.058456 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.078846 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.402064 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.402115 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.402125 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.402140 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.402149 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4843]: E0318 12:11:30.417945 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.422920 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.422957 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.422966 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.422979 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.422989 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4843]: E0318 12:11:30.435239 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.439946 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.439996 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.440005 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.440020 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.440031 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4843]: E0318 12:11:30.454455 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.458575 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.458664 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.458676 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.458693 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.458703 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4843]: E0318 12:11:30.471695 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.476436 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.476484 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.476496 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.476513 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.476525 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4843]: E0318 12:11:30.491146 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:30 crc kubenswrapper[4843]: E0318 12:11:30.491298 4843 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.983750 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.984047 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:30 crc kubenswrapper[4843]: I0318 12:11:30.984108 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:30 crc kubenswrapper[4843]: E0318 12:11:30.984158 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:30 crc kubenswrapper[4843]: E0318 12:11:30.984252 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:30 crc kubenswrapper[4843]: E0318 12:11:30.984389 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.485436 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt"] Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.486085 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.488515 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.488828 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.497441 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20c29405-fb3c-4f34-b5c1-ff6d4745d0d6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-f46tt\" (UID: \"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.497637 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20c29405-fb3c-4f34-b5c1-ff6d4745d0d6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-f46tt\" (UID: \"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.497788 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9vfd\" (UniqueName: \"kubernetes.io/projected/20c29405-fb3c-4f34-b5c1-ff6d4745d0d6-kube-api-access-c9vfd\") pod \"ovnkube-control-plane-749d76644c-f46tt\" (UID: \"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.497918 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20c29405-fb3c-4f34-b5c1-ff6d4745d0d6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-f46tt\" (UID: \"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.505911 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.524722 4843 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\"
:\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.539949 4843 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.566537 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.581206 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.597435 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.598717 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20c29405-fb3c-4f34-b5c1-ff6d4745d0d6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-f46tt\" (UID: \"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.598814 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/20c29405-fb3c-4f34-b5c1-ff6d4745d0d6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-f46tt\" (UID: \"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.598857 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9vfd\" (UniqueName: \"kubernetes.io/projected/20c29405-fb3c-4f34-b5c1-ff6d4745d0d6-kube-api-access-c9vfd\") pod \"ovnkube-control-plane-749d76644c-f46tt\" (UID: \"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.598906 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20c29405-fb3c-4f34-b5c1-ff6d4745d0d6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-f46tt\" (UID: \"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.599790 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20c29405-fb3c-4f34-b5c1-ff6d4745d0d6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-f46tt\" (UID: \"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.599812 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20c29405-fb3c-4f34-b5c1-ff6d4745d0d6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-f46tt\" (UID: \"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" Mar 18 12:11:31 crc 
kubenswrapper[4843]: I0318 12:11:31.606940 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20c29405-fb3c-4f34-b5c1-ff6d4745d0d6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-f46tt\" (UID: \"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.612868 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.614935 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9vfd\" (UniqueName: \"kubernetes.io/projected/20c29405-fb3c-4f34-b5c1-ff6d4745d0d6-kube-api-access-c9vfd\") pod \"ovnkube-control-plane-749d76644c-f46tt\" (UID: \"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 
12:11:31.626241 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.642783 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.663271 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.681242 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.695440 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.709111 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.729260 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.742992 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.762685 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.797272 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" Mar 18 12:11:31 crc kubenswrapper[4843]: W0318 12:11:31.810321 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20c29405_fb3c_4f34_b5c1_ff6d4745d0d6.slice/crio-42c1da89407aea7ba1b17e5972e2e829fab5bbba0650d10216bfc72e89433669 WatchSource:0}: Error finding container 42c1da89407aea7ba1b17e5972e2e829fab5bbba0650d10216bfc72e89433669: Status 404 returned error can't find the container with id 42c1da89407aea7ba1b17e5972e2e829fab5bbba0650d10216bfc72e89433669 Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.825538 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" event={"ID":"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6","Type":"ContainerStarted","Data":"42c1da89407aea7ba1b17e5972e2e829fab5bbba0650d10216bfc72e89433669"} Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.826946 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-csgs2" 
event={"ID":"809fa601-6b32-4585-a41a-646cc883bcd6","Type":"ContainerStarted","Data":"1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97"} Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.863359 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.879066 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.901553 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.918525 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.935925 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.951087 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.965937 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:31 crc kubenswrapper[4843]: I0318 12:11:31.983565 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.000280 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:31Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.014051 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.028631 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.045985 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff22
6b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: E0318 12:11:32.060163 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.061541 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.077733 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d6827ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.092988 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.106981 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.247716 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-sn986"] Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.248204 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:32 crc kubenswrapper[4843]: E0318 12:11:32.248267 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.266518 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.282366 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.301083 4843 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.306834 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbszd\" (UniqueName: \"kubernetes.io/projected/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-kube-api-access-bbszd\") pod \"network-metrics-daemon-sn986\" (UID: \"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\") " pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.306889 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs\") pod \"network-metrics-daemon-sn986\" (UID: \"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\") " pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.322485 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.339675 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.355506 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.368530 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.382612 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.401252 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.407338 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbszd\" (UniqueName: \"kubernetes.io/projected/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-kube-api-access-bbszd\") pod \"network-metrics-daemon-sn986\" (UID: \"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\") " pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.407387 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs\") pod \"network-metrics-daemon-sn986\" (UID: \"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\") " pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:32 crc kubenswrapper[4843]: E0318 12:11:32.407490 4843 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:32 crc kubenswrapper[4843]: E0318 
12:11:32.407531 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs podName:62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c nodeName:}" failed. No retries permitted until 2026-03-18 12:11:32.907518193 +0000 UTC m=+126.623343717 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs") pod "network-metrics-daemon-sn986" (UID: "62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.419225 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413
bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7e
cd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.426170 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbszd\" (UniqueName: \"kubernetes.io/projected/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-kube-api-access-bbszd\") pod \"network-metrics-daemon-sn986\" (UID: \"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\") " pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.435514 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e51
13f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.451600 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.466080 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.488175 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.500418 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.534078 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc 
kubenswrapper[4843]: I0318 12:11:32.577209 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.832788 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" event={"ID":"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6","Type":"ContainerStarted","Data":"aef99cca747828d660873dde102ff7922276527644b13a0dab51c6e3d4892266"} Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.832880 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" event={"ID":"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6","Type":"ContainerStarted","Data":"24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa"} Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.850165 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.869908 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.884538 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.900855 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca7
47828d660873dde102ff7922276527644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.913971 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs\") pod \"network-metrics-daemon-sn986\" (UID: \"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\") " pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:32 crc 
kubenswrapper[4843]: E0318 12:11:32.914196 4843 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:32 crc kubenswrapper[4843]: E0318 12:11:32.914328 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs podName:62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c nodeName:}" failed. No retries permitted until 2026-03-18 12:11:33.914296056 +0000 UTC m=+127.630121610 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs") pod "network-metrics-daemon-sn986" (UID: "62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.918507 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.940337 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.958831 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.973577 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.983712 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.983842 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:32 crc kubenswrapper[4843]: E0318 12:11:32.983938 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.983858 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:32 crc kubenswrapper[4843]: E0318 12:11:32.984044 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:32 crc kubenswrapper[4843]: E0318 12:11:32.984164 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:32 crc kubenswrapper[4843]: I0318 12:11:32.990522 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:33 crc kubenswrapper[4843]: I0318 12:11:33.014820 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:33Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:33 crc kubenswrapper[4843]: I0318 12:11:33.030706 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:33Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:33 crc kubenswrapper[4843]: I0318 12:11:33.044968 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:33Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:33 crc kubenswrapper[4843]: I0318 12:11:33.072534 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ad
e409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:33Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:33 crc kubenswrapper[4843]: I0318 12:11:33.093947 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e51
13f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:33Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:33 crc kubenswrapper[4843]: I0318 12:11:33.111342 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:33Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:33 crc kubenswrapper[4843]: I0318 12:11:33.127783 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:33Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:33 crc kubenswrapper[4843]: I0318 12:11:33.149700 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:33Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:33 crc kubenswrapper[4843]: I0318 12:11:33.836973 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2"} Mar 18 12:11:33 crc kubenswrapper[4843]: I0318 12:11:33.857118 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:33Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:33 crc kubenswrapper[4843]: I0318 12:11:33.874477 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:33Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:33 crc kubenswrapper[4843]: I0318 12:11:33.890756 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:33Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:33 crc kubenswrapper[4843]: I0318 12:11:33.905890 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:33Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:33 crc kubenswrapper[4843]: I0318 12:11:33.922094 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:33Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:33 crc kubenswrapper[4843]: I0318 12:11:33.923285 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs\") pod \"network-metrics-daemon-sn986\" (UID: \"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\") " pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:33 crc kubenswrapper[4843]: E0318 12:11:33.923431 4843 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:33 crc kubenswrapper[4843]: E0318 12:11:33.923489 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs podName:62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c nodeName:}" failed. No retries permitted until 2026-03-18 12:11:35.923471951 +0000 UTC m=+129.639297475 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs") pod "network-metrics-daemon-sn986" (UID: "62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:33 crc kubenswrapper[4843]: I0318 12:11:33.945528 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:33Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:33 crc kubenswrapper[4843]: I0318 12:11:33.961472 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:33Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:33 crc kubenswrapper[4843]: I0318 12:11:33.975827 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:33Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:33 crc 
kubenswrapper[4843]: I0318 12:11:33.982735 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:33 crc kubenswrapper[4843]: E0318 12:11:33.983153 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.003460 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\
":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b
7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:34Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.031436 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:34Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.051323 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T12:11:34Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.069901 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:34Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.087790 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:34Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.104947 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:11:34Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.121812 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:34Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.133081 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:34Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.148903 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff7922276527644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:34Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.842986 4843 generic.go:334] "Generic (PLEG): container finished" podID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerID="741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545" exitCode=0 Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.843031 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerDied","Data":"741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545"} Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.866444 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:34Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.882209 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:11:34Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.902531 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:34Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.916993 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:34Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.930487 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff7922276527644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:34Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.947118 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:34Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.966988 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:34Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.985244 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.985365 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:34 crc kubenswrapper[4843]: E0318 12:11:34.985472 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.985257 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:34 crc kubenswrapper[4843]: E0318 12:11:34.985601 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:34 crc kubenswrapper[4843]: E0318 12:11:34.985889 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:34 crc kubenswrapper[4843]: I0318 12:11:34.991082 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\
\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:34Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:35 crc kubenswrapper[4843]: I0318 12:11:35.011230 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e51
13f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:35 crc kubenswrapper[4843]: I0318 12:11:35.028510 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:35 crc kubenswrapper[4843]: I0318 12:11:35.044512 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:35 crc kubenswrapper[4843]: I0318 12:11:35.069183 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:35 crc kubenswrapper[4843]: I0318 12:11:35.085069 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:35 crc kubenswrapper[4843]: I0318 12:11:35.103520 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:35 crc 
kubenswrapper[4843]: I0318 12:11:35.119153 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:35 crc kubenswrapper[4843]: I0318 12:11:35.135330 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:35 crc kubenswrapper[4843]: I0318 12:11:35.156572 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T12:11:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:35 crc kubenswrapper[4843]: I0318 12:11:35.855663 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerStarted","Data":"38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e"} Mar 18 12:11:35 crc kubenswrapper[4843]: I0318 12:11:35.855740 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerStarted","Data":"d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce"} Mar 18 12:11:35 crc kubenswrapper[4843]: I0318 12:11:35.855759 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerStarted","Data":"ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101"} Mar 18 12:11:35 crc kubenswrapper[4843]: I0318 12:11:35.855784 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerStarted","Data":"5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c"} Mar 18 12:11:35 crc kubenswrapper[4843]: I0318 12:11:35.855813 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerStarted","Data":"0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f"} Mar 18 12:11:35 crc kubenswrapper[4843]: I0318 12:11:35.855826 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" 
event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerStarted","Data":"98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4"} Mar 18 12:11:35 crc kubenswrapper[4843]: I0318 12:11:35.947135 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs\") pod \"network-metrics-daemon-sn986\" (UID: \"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\") " pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:35 crc kubenswrapper[4843]: E0318 12:11:35.947337 4843 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:35 crc kubenswrapper[4843]: E0318 12:11:35.947465 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs podName:62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c nodeName:}" failed. No retries permitted until 2026-03-18 12:11:39.947439687 +0000 UTC m=+133.663265291 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs") pod "network-metrics-daemon-sn986" (UID: "62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:35 crc kubenswrapper[4843]: I0318 12:11:35.983266 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:35 crc kubenswrapper[4843]: E0318 12:11:35.983481 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:11:36 crc kubenswrapper[4843]: I0318 12:11:36.982740 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:36 crc kubenswrapper[4843]: I0318 12:11:36.983146 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:36 crc kubenswrapper[4843]: I0318 12:11:36.983223 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:36 crc kubenswrapper[4843]: E0318 12:11:36.983249 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:36 crc kubenswrapper[4843]: E0318 12:11:36.983355 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:36 crc kubenswrapper[4843]: E0318 12:11:36.983416 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:36 crc kubenswrapper[4843]: I0318 12:11:36.998059 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\
\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:36Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:37 crc kubenswrapper[4843]: I0318 12:11:37.014122 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:11:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:37 crc kubenswrapper[4843]: I0318 12:11:37.032018 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:37 crc kubenswrapper[4843]: I0318 12:11:37.046733 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:37 crc kubenswrapper[4843]: E0318 12:11:37.061120 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:11:37 crc kubenswrapper[4843]: I0318 12:11:37.061355 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff79222765
27644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:37 crc kubenswrapper[4843]: I0318 12:11:37.082413 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:37 crc kubenswrapper[4843]: I0318 12:11:37.098209 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:37 crc kubenswrapper[4843]: I0318 12:11:37.119147 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:37 crc kubenswrapper[4843]: I0318 12:11:37.143488 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:37 crc kubenswrapper[4843]: I0318 12:11:37.168694 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:37 crc kubenswrapper[4843]: I0318 12:11:37.219743 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:37 crc kubenswrapper[4843]: I0318 12:11:37.233359 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:37 crc kubenswrapper[4843]: I0318 12:11:37.245969 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:37 crc 
kubenswrapper[4843]: I0318 12:11:37.266514 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:37 crc kubenswrapper[4843]: I0318 12:11:37.281450 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:37 crc kubenswrapper[4843]: I0318 12:11:37.299195 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T12:11:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:37 crc kubenswrapper[4843]: I0318 12:11:37.316194 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:37 crc kubenswrapper[4843]: I0318 12:11:37.870114 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerStarted","Data":"22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94"} Mar 18 12:11:37 crc kubenswrapper[4843]: I0318 12:11:37.983553 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:37 crc kubenswrapper[4843]: E0318 12:11:37.983806 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:11:38 crc kubenswrapper[4843]: I0318 12:11:38.983686 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:38 crc kubenswrapper[4843]: I0318 12:11:38.983863 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:38 crc kubenswrapper[4843]: I0318 12:11:38.984044 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:38 crc kubenswrapper[4843]: E0318 12:11:38.984032 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:38 crc kubenswrapper[4843]: E0318 12:11:38.984226 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:38 crc kubenswrapper[4843]: E0318 12:11:38.984343 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:39 crc kubenswrapper[4843]: I0318 12:11:39.881897 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerStarted","Data":"ce3c8643410faefd821c5deae5404c626b2fe9e8ef91b283fc4ed8eed056e149"} Mar 18 12:11:39 crc kubenswrapper[4843]: I0318 12:11:39.883075 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:39 crc kubenswrapper[4843]: I0318 12:11:39.883159 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:39 crc kubenswrapper[4843]: I0318 12:11:39.883254 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:39 crc kubenswrapper[4843]: I0318 12:11:39.902248 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:39Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:39 crc kubenswrapper[4843]: I0318 12:11:39.916607 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:39 crc kubenswrapper[4843]: I0318 12:11:39.924211 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:39Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:39 crc kubenswrapper[4843]: I0318 12:11:39.931441 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:39 crc kubenswrapper[4843]: I0318 12:11:39.955068 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/
log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"container
ID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a564
6fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:39Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:39 crc kubenswrapper[4843]: I0318 12:11:39.972505 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e51
13f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:39Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:39 crc kubenswrapper[4843]: I0318 12:11:39.983484 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:39 crc kubenswrapper[4843]: E0318 12:11:39.983780 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:11:39 crc kubenswrapper[4843]: I0318 12:11:39.990064 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:39Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.003001 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs\") pod \"network-metrics-daemon-sn986\" (UID: \"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\") " pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:40 crc kubenswrapper[4843]: E0318 12:11:40.003194 4843 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:40 crc kubenswrapper[4843]: E0318 12:11:40.003276 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs podName:62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c nodeName:}" failed. 
No retries permitted until 2026-03-18 12:11:48.003255378 +0000 UTC m=+141.719080902 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs") pod "network-metrics-daemon-sn986" (UID: "62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.005185 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.030081 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3c8643410faefd821c5deae5404c626b2fe9e8ef91b283fc4ed8eed056e149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.046499 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.061755 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc 
kubenswrapper[4843]: I0318 12:11:40.079371 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.099312 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.117542 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.135360 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.152343 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.173292 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.188232 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.203809 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff7922276527644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.259592 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.283738 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.303715 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.326298 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3c8643410faefd821c5deae5404c626b2fe9e8ef91b283fc4ed8eed056e149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.337662 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.349762 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc 
kubenswrapper[4843]: I0318 12:11:40.386115 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.405536 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.423505 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.443280 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.469166 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.494806 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.515635 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.531151 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.547468 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff7922276527644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.564772 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.585126 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.895074 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.895131 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.895144 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.895164 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.895181 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4843]: E0318 12:11:40.915466 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.920576 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.920626 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.920639 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.920674 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.920685 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4843]: E0318 12:11:40.936382 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.942272 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.942311 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.942323 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.942342 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.942356 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4843]: E0318 12:11:40.958825 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.964959 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.965032 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.965046 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.965069 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.965083 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.983357 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.983423 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.983446 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:40 crc kubenswrapper[4843]: E0318 12:11:40.983556 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:40 crc kubenswrapper[4843]: E0318 12:11:40.983690 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:40 crc kubenswrapper[4843]: E0318 12:11:40.983756 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:40 crc kubenswrapper[4843]: E0318 12:11:40.984240 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:40Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.990223 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.990276 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.990290 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.990311 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4843]: I0318 12:11:40.990325 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:41 crc kubenswrapper[4843]: E0318 12:11:41.004055 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:41Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:41 crc kubenswrapper[4843]: E0318 12:11:41.004181 4843 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:11:41 crc kubenswrapper[4843]: I0318 12:11:41.983593 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:41 crc kubenswrapper[4843]: E0318 12:11:41.983852 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:11:42 crc kubenswrapper[4843]: E0318 12:11:42.063011 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:11:42 crc kubenswrapper[4843]: I0318 12:11:42.982930 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:42 crc kubenswrapper[4843]: I0318 12:11:42.983071 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:42 crc kubenswrapper[4843]: E0318 12:11:42.983174 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:42 crc kubenswrapper[4843]: E0318 12:11:42.983292 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:42 crc kubenswrapper[4843]: I0318 12:11:42.983630 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:42 crc kubenswrapper[4843]: E0318 12:11:42.983791 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:43 crc kubenswrapper[4843]: I0318 12:11:43.900816 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovnkube-controller/0.log" Mar 18 12:11:43 crc kubenswrapper[4843]: I0318 12:11:43.904313 4843 generic.go:334] "Generic (PLEG): container finished" podID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerID="ce3c8643410faefd821c5deae5404c626b2fe9e8ef91b283fc4ed8eed056e149" exitCode=1 Mar 18 12:11:43 crc kubenswrapper[4843]: I0318 12:11:43.904389 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerDied","Data":"ce3c8643410faefd821c5deae5404c626b2fe9e8ef91b283fc4ed8eed056e149"} Mar 18 12:11:43 crc kubenswrapper[4843]: I0318 12:11:43.905574 4843 scope.go:117] "RemoveContainer" containerID="ce3c8643410faefd821c5deae5404c626b2fe9e8ef91b283fc4ed8eed056e149" Mar 18 12:11:43 crc kubenswrapper[4843]: I0318 12:11:43.982844 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:43 crc kubenswrapper[4843]: E0318 12:11:43.983099 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:11:43 crc kubenswrapper[4843]: I0318 12:11:43.993551 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:43Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.013587 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e51
13f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.034444 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.054159 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.081192 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3c8643410faefd821c5deae5404c626b2fe9e8ef91b283fc4ed8eed056e149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3c8643410faefd821c5deae5404c626b2fe9e8ef91b283fc4ed8eed056e149\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:11:43Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:11:43.294284 6814 reflector.go:311] 
Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:43.294406 6814 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:11:43.294461 6814 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:43.294818 6814 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:43.295855 6814 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:43.296105 6814 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:11:43.296532 6814 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env
-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.095602 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.110206 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:44 crc 
kubenswrapper[4843]: I0318 12:11:44.125761 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.142802 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.162805 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T12:11:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.182325 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.199372 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:11:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.218382 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.237735 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.253949 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff7922276527644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.267484 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.284716 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.911569 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovnkube-controller/0.log" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.915619 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerStarted","Data":"b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929"} Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.916345 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.942253 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.969635 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.982941 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:44 crc kubenswrapper[4843]: E0318 12:11:44.983434 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.983114 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:44 crc kubenswrapper[4843]: E0318 12:11:44.983898 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.982991 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:44 crc kubenswrapper[4843]: E0318 12:11:44.984061 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:44 crc kubenswrapper[4843]: I0318 12:11:44.994688 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192
.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.013620 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.033385 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:11:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.055336 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.073819 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.092540 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff7922276527644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.112450 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.136850 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.166714 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ad
e409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.189803 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e51
13f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.211361 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.230767 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.257703 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3c8643410faefd821c5deae5404c626b2fe9e8ef91b283fc4ed8eed056e149\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:11:43Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:11:43.294284 6814 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:43.294406 6814 reflector.go:311] 
Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:11:43.294461 6814 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:43.294818 6814 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:43.295855 6814 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:43.296105 6814 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:11:43.296532 6814 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\
"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.274080 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.293307 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.922766 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovnkube-controller/1.log" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.924423 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovnkube-controller/0.log" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.928892 4843 generic.go:334] "Generic (PLEG): container finished" podID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerID="b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929" exitCode=1 Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.929001 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerDied","Data":"b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929"} Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.929100 4843 scope.go:117] "RemoveContainer" containerID="ce3c8643410faefd821c5deae5404c626b2fe9e8ef91b283fc4ed8eed056e149" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.930212 4843 scope.go:117] "RemoveContainer" containerID="b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929" Mar 18 12:11:45 crc kubenswrapper[4843]: E0318 12:11:45.930617 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bc7c6_openshift-ovn-kubernetes(45174fd5-3a94-47fe-81c3-18bd634c4fcf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.959215 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.979414 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808
b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running
\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:45 crc kubenswrapper[4843]: I0318 12:11:45.983941 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:45 crc kubenswrapper[4843]: E0318 12:11:45.984352 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.004610 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.008040 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.027755 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.046859 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:11:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.066463 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.083484 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.102477 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff7922276527644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.120056 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.143618 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.162733 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.181266 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.197507 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.221362 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce3c8643410faefd821c5deae5404c626b2fe9e8ef91b283fc4ed8eed056e149\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:11:43Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:11:43.294284 6814 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:43.294406 6814 reflector.go:311] 
Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:11:43.294461 6814 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:43.294818 6814 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:43.295855 6814 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:43.296105 6814 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:11:43.296532 6814 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\" 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.184643 6969 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.185038 6969 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.185701 6969 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 12:11:45.185641 6969 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:11:45.187762 6969 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 12:11:45.187794 6969 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 12:11:45.187828 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 12:11:45.187872 6969 factory.go:656] Stopping watch factory\\\\nI0318 12:11:45.187896 6969 ovnkube.go:599] Stopped ovnkube\\\\nI0318 12:11:45.187938 6969 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 12:11:45.187952 6969 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 12:11:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"nam
e\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.238146 4843 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.255855 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:46 crc 
kubenswrapper[4843]: I0318 12:11:46.282119 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.936402 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovnkube-controller/1.log" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.946122 4843 scope.go:117] "RemoveContainer" containerID="b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929" Mar 18 12:11:46 crc kubenswrapper[4843]: E0318 12:11:46.946377 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bc7c6_openshift-ovn-kubernetes(45174fd5-3a94-47fe-81c3-18bd634c4fcf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.961065 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.982967 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.983021 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:46 crc kubenswrapper[4843]: E0318 12:11:46.983134 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.983359 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:46 crc kubenswrapper[4843]: E0318 12:11:46.983425 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:46 crc kubenswrapper[4843]: E0318 12:11:46.983606 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:46 crc kubenswrapper[4843]: I0318 12:11:46.995459 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc 
kubenswrapper[4843]: I0318 12:11:47.025705 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.043876 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.061886 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: E0318 12:11:47.063996 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.082075 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.105851 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\" 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.184643 6969 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.185038 6969 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.185701 6969 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 12:11:45.185641 6969 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:11:45.187762 6969 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 12:11:45.187794 6969 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 12:11:45.187828 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 12:11:45.187872 6969 factory.go:656] Stopping watch factory\\\\nI0318 12:11:45.187896 6969 ovnkube.go:599] Stopped ovnkube\\\\nI0318 12:11:45.187938 6969 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 12:11:45.187952 6969 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 12:11:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bc7c6_openshift-ovn-kubernetes(45174fd5-3a94-47fe-81c3-18bd634c4fcf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb43
4e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.123783 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.141433 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.156635 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.170776 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metr
ics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff7922276527644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.186987 4843 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd54ffb3-185b-435b-8816-0e68777c3d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c302996fedfc87cb3490362b296791c75b12a3745aaca6f412242139cc1b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4594141b60aa4fc27e9141ee0e03ce44500fc87853300a481b7538bfe47775e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:25Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 12:09:55.993473 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:09:55.994204 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:09:55.995250 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:09:55.996008 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 12:10:25.662059 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 12:10:25.662183 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07fbcf3bceb61d275b47d4d252105653eb996aa8158391b964025a10ba90a75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530d48dac592d94c55bd319d7710b45b9b7b0c0503a95fdbc2617047899996a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5619a8ca389c080d33e727a262f73509b406e3fadd6794735ac7d709b38a52e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.200396 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.215372 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.232179 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.250520 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.266138 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.286256 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.304023 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd54ffb3-185b-435b-8816-0e68777c3d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c302996fedfc87cb3490362b296791c75b12a3745aaca6f412242139cc1b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4594141b60aa4fc27e9141ee0e03ce44500fc87853300a481b7538bfe47775e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:25Z\\\",\\\"message\\\
":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 12:09:55.993473 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:09:55.994204 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:09:55.995250 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:09:55.996008 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 12:10:25.662059 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 12:10:25.662183 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07fbcf3bceb61d275b47d4d252105653eb996aa8158391b964025a10ba90a75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530d48dac592d94c55bd319d7710b45b9b7b0c0503a95fdbc2617047899996a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5619a8ca389c080d33e727a262f73509b406e3fadd6794735ac7d709b38a52e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.319328 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.335073 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.350701 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.368191 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.398232 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff7922276527644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.428967 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.456141 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.481419 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc 
kubenswrapper[4843]: I0318 12:11:47.516805 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.539632 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.557996 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.575415 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.602504 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\" 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.184643 6969 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.185038 6969 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.185701 6969 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 12:11:45.185641 6969 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:11:45.187762 6969 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 12:11:45.187794 6969 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 12:11:45.187828 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 12:11:45.187872 6969 factory.go:656] Stopping watch factory\\\\nI0318 12:11:45.187896 6969 ovnkube.go:599] Stopped ovnkube\\\\nI0318 12:11:45.187938 6969 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 12:11:45.187952 6969 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 12:11:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bc7c6_openshift-ovn-kubernetes(45174fd5-3a94-47fe-81c3-18bd634c4fcf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb43
4e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.618413 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.637329 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.656051 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.674284 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T12:11:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:47 crc kubenswrapper[4843]: I0318 12:11:47.982901 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:47 crc kubenswrapper[4843]: E0318 12:11:47.983107 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:11:48 crc kubenswrapper[4843]: I0318 12:11:48.023189 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs\") pod \"network-metrics-daemon-sn986\" (UID: \"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\") " pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:48 crc kubenswrapper[4843]: E0318 12:11:48.023371 4843 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:48 crc kubenswrapper[4843]: E0318 12:11:48.023449 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs podName:62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c nodeName:}" failed. No retries permitted until 2026-03-18 12:12:04.023429627 +0000 UTC m=+157.739255151 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs") pod "network-metrics-daemon-sn986" (UID: "62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:48 crc kubenswrapper[4843]: I0318 12:11:48.982935 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:48 crc kubenswrapper[4843]: I0318 12:11:48.983102 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:48 crc kubenswrapper[4843]: I0318 12:11:48.983219 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:48 crc kubenswrapper[4843]: E0318 12:11:48.983204 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:48 crc kubenswrapper[4843]: E0318 12:11:48.983363 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:48 crc kubenswrapper[4843]: E0318 12:11:48.983530 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:49 crc kubenswrapper[4843]: I0318 12:11:49.983128 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:49 crc kubenswrapper[4843]: E0318 12:11:49.983448 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:11:50 crc kubenswrapper[4843]: I0318 12:11:50.983523 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:50 crc kubenswrapper[4843]: I0318 12:11:50.983601 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:50 crc kubenswrapper[4843]: E0318 12:11:50.983771 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:50 crc kubenswrapper[4843]: E0318 12:11:50.983929 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:50 crc kubenswrapper[4843]: I0318 12:11:50.983996 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:50 crc kubenswrapper[4843]: E0318 12:11:50.984075 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.118723 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.118794 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.118807 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.118829 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.118844 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:51Z","lastTransitionTime":"2026-03-18T12:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:51 crc kubenswrapper[4843]: E0318 12:11:51.137120 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:51Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.143160 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.143215 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.143253 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.143278 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.143296 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:51Z","lastTransitionTime":"2026-03-18T12:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:51 crc kubenswrapper[4843]: E0318 12:11:51.159759 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:51Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.166505 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.166549 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.166560 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.166577 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.166588 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:51Z","lastTransitionTime":"2026-03-18T12:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:51 crc kubenswrapper[4843]: E0318 12:11:51.184678 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:51Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.191861 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.191930 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.191951 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.191975 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.191989 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:51Z","lastTransitionTime":"2026-03-18T12:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:51 crc kubenswrapper[4843]: E0318 12:11:51.209045 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:51Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.215078 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.215121 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.215133 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.215169 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.215183 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:51Z","lastTransitionTime":"2026-03-18T12:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:51 crc kubenswrapper[4843]: E0318 12:11:51.230898 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:51Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:51 crc kubenswrapper[4843]: E0318 12:11:51.231078 4843 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:11:51 crc kubenswrapper[4843]: I0318 12:11:51.982964 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:51 crc kubenswrapper[4843]: E0318 12:11:51.983157 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:11:52 crc kubenswrapper[4843]: E0318 12:11:52.065532 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:11:52 crc kubenswrapper[4843]: I0318 12:11:52.982873 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:52 crc kubenswrapper[4843]: I0318 12:11:52.982965 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:52 crc kubenswrapper[4843]: I0318 12:11:52.982898 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:52 crc kubenswrapper[4843]: E0318 12:11:52.983066 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:52 crc kubenswrapper[4843]: E0318 12:11:52.983141 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:52 crc kubenswrapper[4843]: E0318 12:11:52.983322 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:53 crc kubenswrapper[4843]: I0318 12:11:53.983043 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:53 crc kubenswrapper[4843]: E0318 12:11:53.983325 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:11:54 crc kubenswrapper[4843]: I0318 12:11:54.986375 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:54 crc kubenswrapper[4843]: E0318 12:11:54.986590 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:54 crc kubenswrapper[4843]: I0318 12:11:54.986864 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:54 crc kubenswrapper[4843]: I0318 12:11:54.986919 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:54 crc kubenswrapper[4843]: E0318 12:11:54.987068 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:54 crc kubenswrapper[4843]: E0318 12:11:54.987275 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:55 crc kubenswrapper[4843]: I0318 12:11:55.982807 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:55 crc kubenswrapper[4843]: E0318 12:11:55.982989 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:11:56 crc kubenswrapper[4843]: I0318 12:11:56.983393 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:56 crc kubenswrapper[4843]: I0318 12:11:56.983401 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:56 crc kubenswrapper[4843]: E0318 12:11:56.983606 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:56 crc kubenswrapper[4843]: E0318 12:11:56.983758 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:56 crc kubenswrapper[4843]: I0318 12:11:56.983400 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:56 crc kubenswrapper[4843]: E0318 12:11:56.983886 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:57 crc kubenswrapper[4843]: I0318 12:11:57.009735 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\" 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.184643 6969 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.185038 6969 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.185701 6969 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 12:11:45.185641 6969 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:11:45.187762 6969 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 12:11:45.187794 6969 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 12:11:45.187828 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 12:11:45.187872 6969 factory.go:656] Stopping watch factory\\\\nI0318 12:11:45.187896 6969 ovnkube.go:599] Stopped ovnkube\\\\nI0318 12:11:45.187938 6969 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 12:11:45.187952 6969 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 12:11:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bc7c6_openshift-ovn-kubernetes(45174fd5-3a94-47fe-81c3-18bd634c4fcf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb43
4e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4843]: I0318 12:11:57.022732 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4843]: I0318 12:11:57.036763 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc 
kubenswrapper[4843]: I0318 12:11:57.063242 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4843]: E0318 12:11:57.066340 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 12:11:57 crc kubenswrapper[4843]: I0318 12:11:57.086328 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"l
astState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' 
detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4843]: I0318 12:11:57.102917 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4843]: I0318 12:11:57.118916 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4843]: I0318 12:11:57.206995 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4843]: I0318 12:11:57.230494 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4843]: I0318 12:11:57.249788 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4843]: I0318 12:11:57.263804 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4843]: I0318 12:11:57.279048 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1a
a526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff7922276527644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\
"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4843]: I0318 12:11:57.297546 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd54ffb3-185b-435b-8816-0e68777c3d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c302996fedfc87cb3490362b296791c75b12a3745aaca6f412242139cc1b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4594141b60aa4fc27e9141ee0e03ce44500fc87853300a481b7538bfe47775e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:25Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 12:09:55.993473 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:09:55.994204 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:09:55.995250 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:09:55.996008 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 12:10:25.662059 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 12:10:25.662183 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07fbcf3bceb61d275b47d4d252105653eb996aa8158391b964025a10ba90a75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530d48dac592d94c55bd319d7710b45b9b7b0c0503a95fdbc2617047899996a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5619a8ca389c080d33e727a262f73509b406e3fadd6794735ac7d709b38a52e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4843]: I0318 12:11:57.317138 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4843]: I0318 12:11:57.333806 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4843]: I0318 12:11:57.353412 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4843]: I0318 12:11:57.370988 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4843]: I0318 12:11:57.390224 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4843]: I0318 12:11:57.983215 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:57 crc kubenswrapper[4843]: E0318 12:11:57.983371 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:11:57 crc kubenswrapper[4843]: I0318 12:11:57.999423 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 18 12:11:58 crc kubenswrapper[4843]: I0318 12:11:58.983279 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:58 crc kubenswrapper[4843]: I0318 12:11:58.983344 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:58 crc kubenswrapper[4843]: I0318 12:11:58.983394 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:58 crc kubenswrapper[4843]: E0318 12:11:58.983519 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:58 crc kubenswrapper[4843]: E0318 12:11:58.983732 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:58 crc kubenswrapper[4843]: E0318 12:11:58.983894 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:59 crc kubenswrapper[4843]: I0318 12:11:59.022362 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:59 crc kubenswrapper[4843]: E0318 12:11:59.022512 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:03.022479282 +0000 UTC m=+216.738304806 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:11:59 crc kubenswrapper[4843]: I0318 12:11:59.022625 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:59 crc kubenswrapper[4843]: I0318 12:11:59.022709 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:59 crc kubenswrapper[4843]: I0318 12:11:59.022741 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:59 crc kubenswrapper[4843]: I0318 12:11:59.022775 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:59 crc kubenswrapper[4843]: E0318 12:11:59.022857 4843 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:59 crc kubenswrapper[4843]: E0318 12:11:59.022890 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:59 crc kubenswrapper[4843]: E0318 12:11:59.022917 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:59 crc kubenswrapper[4843]: E0318 12:11:59.022934 4843 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:59 crc kubenswrapper[4843]: E0318 12:11:59.022918 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:13:03.022907673 +0000 UTC m=+216.738733207 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:59 crc kubenswrapper[4843]: E0318 12:11:59.022984 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:13:03.022972255 +0000 UTC m=+216.738797779 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:59 crc kubenswrapper[4843]: E0318 12:11:59.023126 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:59 crc kubenswrapper[4843]: E0318 12:11:59.023189 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:59 crc kubenswrapper[4843]: E0318 12:11:59.023211 4843 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:59 crc kubenswrapper[4843]: E0318 12:11:59.023126 4843 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:59 crc kubenswrapper[4843]: E0318 12:11:59.023333 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:13:03.023309735 +0000 UTC m=+216.739135259 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:59 crc kubenswrapper[4843]: E0318 12:11:59.023376 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:13:03.023354596 +0000 UTC m=+216.739180120 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:59 crc kubenswrapper[4843]: I0318 12:11:59.983026 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:11:59 crc kubenswrapper[4843]: E0318 12:11:59.983721 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:00 crc kubenswrapper[4843]: I0318 12:12:00.982897 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:00 crc kubenswrapper[4843]: I0318 12:12:00.982897 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:00 crc kubenswrapper[4843]: I0318 12:12:00.983117 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:00 crc kubenswrapper[4843]: E0318 12:12:00.983155 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:00 crc kubenswrapper[4843]: E0318 12:12:00.983357 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:00 crc kubenswrapper[4843]: E0318 12:12:00.983939 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:00 crc kubenswrapper[4843]: I0318 12:12:00.984314 4843 scope.go:117] "RemoveContainer" containerID="b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.568428 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.568473 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.568483 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.568517 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.568531 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:01Z","lastTransitionTime":"2026-03-18T12:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:01 crc kubenswrapper[4843]: E0318 12:12:01.585728 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.591013 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.591086 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.591100 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.591147 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.591163 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:01Z","lastTransitionTime":"2026-03-18T12:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:01 crc kubenswrapper[4843]: E0318 12:12:01.607265 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.612490 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.612532 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.612561 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.612580 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.612594 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:01Z","lastTransitionTime":"2026-03-18T12:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:01 crc kubenswrapper[4843]: E0318 12:12:01.629719 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.635176 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.635224 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.635234 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.635254 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.635266 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:01Z","lastTransitionTime":"2026-03-18T12:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:01 crc kubenswrapper[4843]: E0318 12:12:01.652003 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"...\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.658057 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.658112 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.658122 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.658142 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.658153 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:01Z","lastTransitionTime":"2026-03-18T12:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:01 crc kubenswrapper[4843]: E0318 12:12:01.674428 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"...\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4843]: E0318 12:12:01.674680 4843 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:12:01 crc kubenswrapper[4843]: I0318 12:12:01.983432 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:01 crc kubenswrapper[4843]: E0318 12:12:01.983669 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.011581 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovnkube-controller/2.log" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.012179 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovnkube-controller/1.log" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.015565 4843 generic.go:334] "Generic (PLEG): container finished" podID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerID="410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15" exitCode=1 Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.015617 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerDied","Data":"410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15"} Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.015739 4843 scope.go:117] "RemoveContainer" containerID="b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.016711 4843 scope.go:117] "RemoveContainer" containerID="410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15" Mar 18 12:12:02 crc kubenswrapper[4843]: E0318 12:12:02.017229 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bc7c6_openshift-ovn-kubernetes(45174fd5-3a94-47fe-81c3-18bd634c4fcf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 
12:12:02.033648 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.052010 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.064222 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cba39ba-f80c-4083-8d84-a1c2a9cc7255\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d775ddd85f583bb387bee13d76b0cbbf7e4e045db6976fbade6d2e66bfdc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4843]: E0318 12:12:02.067827 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.080393 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\
\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e
27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.095889 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.109792 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.134823 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\" 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.184643 6969 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.185038 6969 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.185701 6969 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 12:11:45.185641 6969 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:11:45.187762 6969 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 12:11:45.187794 6969 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 12:11:45.187828 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 12:11:45.187872 6969 factory.go:656] Stopping watch factory\\\\nI0318 12:11:45.187896 6969 ovnkube.go:599] Stopped ovnkube\\\\nI0318 12:11:45.187938 6969 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 12:11:45.187952 6969 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 12:11:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"ndler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is 
after 2025-08-24T17:21:41Z]\\\\nI0318 12:12:01.938416 7170 services_controller.go:434] Service openshift-etcd/etcd retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{etcd openshift-etcd ad0a4b9d-2a7b-4f3f-9020-0c45d515459d 4800 0 2025-02-23 05:11:51 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:etcd] map[operator.openshift.io/spec-hash:0685cfaa0976bfb7ba58513629369c20bf05f4fba36949e982bdb43af328f0e1 prometheus.io/scheme:https prometheus.io/scrape:true service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:etcd,Protocol:TCP,Port:2379,TargetPort:{0 2379 },NodePort:0,AppProtocol:nil\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"
name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.150254 4843 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.172246 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc 
kubenswrapper[4843]: I0318 12:12:02.196185 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.212441 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.228077 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.242958 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.258911 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.275873 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.292680 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.306977 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.324015 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff7922276527644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.338459 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd54ffb3-185b-435b-8816-0e68777c3d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c302996fedfc87cb3490362b296791c75b12a3745aaca6f412242139cc1b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4594141b60aa4fc27e9141ee0e03ce44500fc87853300a481b7538bfe47775e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:25Z\\\",\\\"message\\\":\\\"+ timeout 
3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 12:09:55.993473 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:09:55.994204 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:09:55.995250 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:09:55.996008 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 12:10:25.662059 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 12:10:25.662183 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07fbcf3bceb61d275b47d4d252105653eb996aa8158391b964025a10ba90a75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530d48dac592d94c55bd319d7710b45b9b7b0c0503a95fdbc2617047899996a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5619a8ca389c080d33e727a262f73509b406e3fadd6794735ac7d709b38a52e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.983336 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:02 crc kubenswrapper[4843]: E0318 12:12:02.983551 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.983860 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:02 crc kubenswrapper[4843]: E0318 12:12:02.983922 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:02 crc kubenswrapper[4843]: I0318 12:12:02.984269 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:02 crc kubenswrapper[4843]: E0318 12:12:02.984469 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:03 crc kubenswrapper[4843]: I0318 12:12:03.021231 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovnkube-controller/2.log" Mar 18 12:12:03 crc kubenswrapper[4843]: I0318 12:12:03.982700 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:03 crc kubenswrapper[4843]: E0318 12:12:03.982875 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:04 crc kubenswrapper[4843]: I0318 12:12:04.084070 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs\") pod \"network-metrics-daemon-sn986\" (UID: \"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\") " pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:04 crc kubenswrapper[4843]: E0318 12:12:04.084307 4843 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:12:04 crc kubenswrapper[4843]: E0318 12:12:04.084409 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs podName:62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c nodeName:}" failed. No retries permitted until 2026-03-18 12:12:36.084377636 +0000 UTC m=+189.800203180 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs") pod "network-metrics-daemon-sn986" (UID: "62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:12:04 crc kubenswrapper[4843]: I0318 12:12:04.983463 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:04 crc kubenswrapper[4843]: I0318 12:12:04.983463 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:04 crc kubenswrapper[4843]: I0318 12:12:04.983489 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:04 crc kubenswrapper[4843]: E0318 12:12:04.983680 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:04 crc kubenswrapper[4843]: E0318 12:12:04.983762 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:04 crc kubenswrapper[4843]: E0318 12:12:04.983947 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:05 crc kubenswrapper[4843]: I0318 12:12:05.983374 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:05 crc kubenswrapper[4843]: E0318 12:12:05.983612 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:06 crc kubenswrapper[4843]: I0318 12:12:06.983068 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:06 crc kubenswrapper[4843]: I0318 12:12:06.983140 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:06 crc kubenswrapper[4843]: E0318 12:12:06.983235 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:06 crc kubenswrapper[4843]: I0318 12:12:06.983392 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:06 crc kubenswrapper[4843]: E0318 12:12:06.983505 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:06 crc kubenswrapper[4843]: E0318 12:12:06.983699 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:07 crc kubenswrapper[4843]: I0318 12:12:07.000791 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4843]: I0318 12:12:07.016904 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPa
th\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d6827ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4843]: I0318 12:12:07.031226 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4843]: I0318 12:12:07.047409 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff7922276527644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4843]: I0318 12:12:07.064671 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd54ffb3-185b-435b-8816-0e68777c3d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c302996fedfc87cb3490362b296791c75b12a3745aaca6f412242139cc1b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4594141b60aa4fc27e9141ee0e03ce44500fc87853300a481b7538bfe47775e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:25Z\\\",\\\"message\\\":\\\"+ timeout 
3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 12:09:55.993473 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:09:55.994204 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:09:55.995250 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:09:55.996008 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 12:10:25.662059 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 12:10:25.662183 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07fbcf3bceb61d275b47d4d252105653eb996aa8158391b964025a10ba90a75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530d48dac592d94c55bd319d7710b45b9b7b0c0503a95fdbc2617047899996a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5619a8ca389c080d33e727a262f73509b406e3fadd6794735ac7d709b38a52e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4843]: E0318 12:12:07.068592 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 18 12:12:07 crc kubenswrapper[4843]: I0318 12:12:07.084558 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b85359630615
13e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4843]: I0318 12:12:07.102527 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4843]: I0318 12:12:07.116539 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cba39ba-f80c-4083-8d84-a1c2a9cc7255\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d775ddd85f583bb387bee13d76b0cbbf7e4e045db6976fbade6d2e66bfdc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4843]: I0318 12:12:07.132379 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4843]: I0318 12:12:07.147153 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4843]: I0318 12:12:07.162169 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4843]: I0318 12:12:07.189221 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\" 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.184643 6969 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.185038 6969 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.185701 6969 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 12:11:45.185641 6969 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:11:45.187762 6969 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 12:11:45.187794 6969 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 12:11:45.187828 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 12:11:45.187872 6969 factory.go:656] Stopping watch factory\\\\nI0318 12:11:45.187896 6969 ovnkube.go:599] Stopped ovnkube\\\\nI0318 12:11:45.187938 6969 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 12:11:45.187952 6969 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 12:11:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"ndler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is 
after 2025-08-24T17:21:41Z]\\\\nI0318 12:12:01.938416 7170 services_controller.go:434] Service openshift-etcd/etcd retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{etcd openshift-etcd ad0a4b9d-2a7b-4f3f-9020-0c45d515459d 4800 0 2025-02-23 05:11:51 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:etcd] map[operator.openshift.io/spec-hash:0685cfaa0976bfb7ba58513629369c20bf05f4fba36949e982bdb43af328f0e1 prometheus.io/scheme:https prometheus.io/scrape:true service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:etcd,Protocol:TCP,Port:2379,TargetPort:{0 2379 },NodePort:0,AppProtocol:nil\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"
name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4843]: I0318 12:12:07.203395 4843 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4843]: I0318 12:12:07.217217 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc 
kubenswrapper[4843]: I0318 12:12:07.241512 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4843]: I0318 12:12:07.260900 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4843]: I0318 12:12:07.275298 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4843]: I0318 12:12:07.290012 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4843]: I0318 12:12:07.309065 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z"
Mar 18 12:12:07 crc kubenswrapper[4843]: I0318 12:12:07.983408 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986"
Mar 18 12:12:07 crc kubenswrapper[4843]: E0318 12:12:07.984147 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c"
Mar 18 12:12:08 crc kubenswrapper[4843]: I0318 12:12:08.982988 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 12:12:08 crc kubenswrapper[4843]: E0318 12:12:08.983560 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 12:12:08 crc kubenswrapper[4843]: I0318 12:12:08.983152 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 12:12:08 crc kubenswrapper[4843]: E0318 12:12:08.983966 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 12:12:08 crc kubenswrapper[4843]: I0318 12:12:08.983114 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 12:12:08 crc kubenswrapper[4843]: E0318 12:12:08.985007 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.047632 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ls8r8_1aa16ddb-306b-4e37-a33a-b9cdce3c254e/kube-multus/0.log"
Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.047738 4843 generic.go:334] "Generic (PLEG): container finished" podID="1aa16ddb-306b-4e37-a33a-b9cdce3c254e" containerID="ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d" exitCode=1
Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.047787 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ls8r8" event={"ID":"1aa16ddb-306b-4e37-a33a-b9cdce3c254e","Type":"ContainerDied","Data":"ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d"}
Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.048266 4843 scope.go:117] "RemoveContainer" containerID="ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d"
Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.064884 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc 
kubenswrapper[4843]: I0318 12:12:09.098167 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.118151 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.139446 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.160478 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.187714 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\" 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.184643 6969 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.185038 6969 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.185701 6969 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 12:11:45.185641 6969 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:11:45.187762 6969 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 12:11:45.187794 6969 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 12:11:45.187828 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 12:11:45.187872 6969 factory.go:656] Stopping watch factory\\\\nI0318 12:11:45.187896 6969 ovnkube.go:599] Stopped ovnkube\\\\nI0318 12:11:45.187938 6969 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 12:11:45.187952 6969 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 12:11:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"ndler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is 
after 2025-08-24T17:21:41Z]\\\\nI0318 12:12:01.938416 7170 services_controller.go:434] Service openshift-etcd/etcd retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{etcd openshift-etcd ad0a4b9d-2a7b-4f3f-9020-0c45d515459d 4800 0 2025-02-23 05:11:51 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:etcd] map[operator.openshift.io/spec-hash:0685cfaa0976bfb7ba58513629369c20bf05f4fba36949e982bdb43af328f0e1 prometheus.io/scheme:https prometheus.io/scrape:true service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:etcd,Protocol:TCP,Port:2379,TargetPort:{0 2379 },NodePort:0,AppProtocol:nil\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"
name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.204927 4843 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.224543 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.244639 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.265309 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"2026-03-18T12:11:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ceb4b718-46d1-4639-b14c-b5e0b66e4e74\\\\n2026-03-18T12:11:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb4b718-46d1-4639-b14c-b5e0b66e4e74 to /host/opt/cni/bin/\\\\n2026-03-18T12:11:23Z [verbose] multus-daemon started\\\\n2026-03-18T12:11:23Z [verbose] Readiness Indicator file check\\\\n2026-03-18T12:12:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.285501 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd54ffb3-185b-435b-8816-0e68777c3d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c302996fedfc87cb3490362b296791c75b12a3745aaca6f412242139cc1b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4594141b60aa4fc27e9141ee0e03ce44500fc
87853300a481b7538bfe47775e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:25Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 12:09:55.993473 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:09:55.994204 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:09:55.995250 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:09:55.996008 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 12:10:25.662059 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 12:10:25.662183 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07fbcf3bceb61d275b47d4d252105653eb996aa8158391b964025a10ba90a75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530d48dac592d94c55bd319d7710b45b9b7b0c0503a95fdbc2617047899996a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5619a8ca389c080d33e727a262f73509b406e3fadd6794735ac7d709b38a52e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.304089 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.322985 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.341620 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.355791 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.371173 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff7922276527644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.386739 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cba39ba-f80c-4083-8d84-a1c2a9cc7255\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d775ddd85f583bb387bee13d76b0cbbf7e4e045db6976fbade6d2e66bfdc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.406798 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.441143 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4843]: I0318 12:12:09.982949 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:09 crc kubenswrapper[4843]: E0318 12:12:09.983142 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.054215 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ls8r8_1aa16ddb-306b-4e37-a33a-b9cdce3c254e/kube-multus/0.log" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.054300 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ls8r8" event={"ID":"1aa16ddb-306b-4e37-a33a-b9cdce3c254e","Type":"ContainerStarted","Data":"b99a8424a67312606a51f09b6fe9197761584e8c306318cdb3f765a2aafc3223"} Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.069950 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cba39ba-f80c-4083-8d84-a1c2a9cc7255\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d775ddd85f583bb387bee13d76b0cbbf7e4e045db6976fbade6d2e66bfdc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.087734 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.105784 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.119536 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc 
kubenswrapper[4843]: I0318 12:12:10.144242 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.164168 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.179637 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.195351 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.219752 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7077c34c4a8cec3d7d7588401edcab3d3107eb9591856e8b1752d183d02c929\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\" 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.184643 6969 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.185038 6969 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:11:45.185701 6969 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0318 12:11:45.185641 6969 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:11:45.187762 6969 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0318 12:11:45.187794 6969 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0318 12:11:45.187828 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 12:11:45.187872 6969 factory.go:656] Stopping watch factory\\\\nI0318 12:11:45.187896 6969 ovnkube.go:599] Stopped ovnkube\\\\nI0318 12:11:45.187938 6969 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0318 12:11:45.187952 6969 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0318 12:11:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"ndler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is 
after 2025-08-24T17:21:41Z]\\\\nI0318 12:12:01.938416 7170 services_controller.go:434] Service openshift-etcd/etcd retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{etcd openshift-etcd ad0a4b9d-2a7b-4f3f-9020-0c45d515459d 4800 0 2025-02-23 05:11:51 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:etcd] map[operator.openshift.io/spec-hash:0685cfaa0976bfb7ba58513629369c20bf05f4fba36949e982bdb43af328f0e1 prometheus.io/scheme:https prometheus.io/scrape:true service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:etcd,Protocol:TCP,Port:2379,TargetPort:{0 2379 },NodePort:0,AppProtocol:nil\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"
name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.237551 4843 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.254362 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.271419 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.287876 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b99a8424a67312606a51f09b6fe9197761584e8c306318cdb3f765a2aafc3223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"2026-03-18T12:11:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_ceb4b718-46d1-4639-b14c-b5e0b66e4e74\\\\n2026-03-18T12:11:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb4b718-46d1-4639-b14c-b5e0b66e4e74 to /host/opt/cni/bin/\\\\n2026-03-18T12:11:23Z [verbose] multus-daemon started\\\\n2026-03-18T12:11:23Z [verbose] Readiness Indicator file check\\\\n2026-03-18T12:12:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.310189 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd54ffb3-185b-435b-8816-0e68777c3d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c302996fedfc87cb3490362b296791c75b12a3745aaca6f412242139cc1b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4594141b60aa4fc27e9141ee0e03ce44500fc87853300a481b7538bfe47775e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:25Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 12:09:55.993473 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:09:55.994204 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:09:55.995250 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:09:55.996008 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 12:10:25.662059 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 12:10:25.662183 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07fbcf3bceb61d275b47d4d252105653eb996aa8158391b964025a10ba90a75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530d48dac592d94c55bd319d7710b45b9b7b0c0503a95fdbc2617047899996a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5619a8ca389c080d33e727a262f73509b406e3fadd6794735ac7d709b38a52e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.327376 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.342398 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.361682 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.379271 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.396458 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff7922276527644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.983537 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:10 crc kubenswrapper[4843]: E0318 12:12:10.983817 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.984128 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:10 crc kubenswrapper[4843]: E0318 12:12:10.984205 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:10 crc kubenswrapper[4843]: I0318 12:12:10.984464 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:10 crc kubenswrapper[4843]: E0318 12:12:10.984527 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.766372 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.766452 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.766472 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.766502 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.766527 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:11Z","lastTransitionTime":"2026-03-18T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:11 crc kubenswrapper[4843]: E0318 12:12:11.785627 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.791200 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.791287 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.791308 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.791332 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.791349 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:11Z","lastTransitionTime":"2026-03-18T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:11 crc kubenswrapper[4843]: E0318 12:12:11.808708 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.814740 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.814803 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.814816 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.814838 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.814852 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:11Z","lastTransitionTime":"2026-03-18T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:11 crc kubenswrapper[4843]: E0318 12:12:11.830247 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.835177 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.835639 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.835675 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.835697 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.835709 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:11Z","lastTransitionTime":"2026-03-18T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:11 crc kubenswrapper[4843]: E0318 12:12:11.849546 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.854150 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.854199 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.854220 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.854246 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.854261 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:11Z","lastTransitionTime":"2026-03-18T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:11 crc kubenswrapper[4843]: E0318 12:12:11.868354 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4843]: E0318 12:12:11.868478 4843 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:12:11 crc kubenswrapper[4843]: I0318 12:12:11.983343 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:11 crc kubenswrapper[4843]: E0318 12:12:11.983570 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:12 crc kubenswrapper[4843]: E0318 12:12:12.070397 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:12:12 crc kubenswrapper[4843]: I0318 12:12:12.983313 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:12 crc kubenswrapper[4843]: I0318 12:12:12.983491 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:12 crc kubenswrapper[4843]: I0318 12:12:12.983520 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:12 crc kubenswrapper[4843]: E0318 12:12:12.983516 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:12 crc kubenswrapper[4843]: E0318 12:12:12.983802 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:12 crc kubenswrapper[4843]: E0318 12:12:12.984371 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:12 crc kubenswrapper[4843]: I0318 12:12:12.984518 4843 scope.go:117] "RemoveContainer" containerID="410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15" Mar 18 12:12:12 crc kubenswrapper[4843]: E0318 12:12:12.984792 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bc7c6_openshift-ovn-kubernetes(45174fd5-3a94-47fe-81c3-18bd634c4fcf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.007889 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:13Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.025301 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:13Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.043176 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b99a8424a67312606a51f09b6fe9197761584e8c306318cdb3f765a2aafc3223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"2026-03-18T12:11:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_ceb4b718-46d1-4639-b14c-b5e0b66e4e74\\\\n2026-03-18T12:11:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb4b718-46d1-4639-b14c-b5e0b66e4e74 to /host/opt/cni/bin/\\\\n2026-03-18T12:11:23Z [verbose] multus-daemon started\\\\n2026-03-18T12:11:23Z [verbose] Readiness Indicator file check\\\\n2026-03-18T12:12:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:13Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.057865 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff79222765
27644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:13Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.076007 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd54ffb3-185b-435b-8816-0e68777c3d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c302996fedfc87cb3490362b296791c75b12a3745aaca6f412242139cc1b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4594141b60aa4fc27e9141ee0e03ce44500fc87853300a481b7538bfe47775e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:25Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 12:09:55.993473 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:09:55.994204 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:09:55.995250 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:09:55.996008 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 12:10:25.662059 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 12:10:25.662183 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07fbcf3bceb61d275b47d4d252105653eb996aa8158391b964025a10ba90a75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530d48dac592d94c55bd319d7710b45b9b7b0c0503a95fdbc2617047899996a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5619a8ca389c080d33e727a262f73509b406e3fadd6794735ac7d709b38a52e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:13Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.089965 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:13Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.103330 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:12:13Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.117385 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:13Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.130563 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:13Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.144167 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cba39ba-f80c-4083-8d84-a1c2a9cc7255\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d775ddd85f583bb387bee13d76b0cbbf7e4e045db6976fbade6d2e66bfdc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:13Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.159969 4843 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:13Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.177741 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:13Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.189891 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:13Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.203977 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:13Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.226884 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ad
e409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:13Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.241891 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e51
13f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:13Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.254594 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:13Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.269376 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:13Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.290786 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"ndler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call 
webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z]\\\\nI0318 12:12:01.938416 7170 services_controller.go:434] Service openshift-etcd/etcd retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{etcd openshift-etcd ad0a4b9d-2a7b-4f3f-9020-0c45d515459d 4800 0 2025-02-23 05:11:51 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:etcd] map[operator.openshift.io/spec-hash:0685cfaa0976bfb7ba58513629369c20bf05f4fba36949e982bdb43af328f0e1 prometheus.io/scheme:https prometheus.io/scrape:true service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:etcd,Protocol:TCP,Port:2379,TargetPort:{0 2379 },NodePort:0,AppProtocol:nil\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bc7c6_openshift-ovn-kubernetes(45174fd5-3a94-47fe-81c3-18bd634c4fcf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb43
4e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:13Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:13 crc kubenswrapper[4843]: I0318 12:12:13.983236 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:13 crc kubenswrapper[4843]: E0318 12:12:13.983441 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:14 crc kubenswrapper[4843]: I0318 12:12:14.983990 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:14 crc kubenswrapper[4843]: I0318 12:12:14.984055 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:14 crc kubenswrapper[4843]: I0318 12:12:14.984264 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:14 crc kubenswrapper[4843]: E0318 12:12:14.984429 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:14 crc kubenswrapper[4843]: E0318 12:12:14.984616 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:14 crc kubenswrapper[4843]: E0318 12:12:14.984778 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:15 crc kubenswrapper[4843]: I0318 12:12:15.983143 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:15 crc kubenswrapper[4843]: E0318 12:12:15.983370 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:16 crc kubenswrapper[4843]: I0318 12:12:16.984115 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:16 crc kubenswrapper[4843]: I0318 12:12:16.984182 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:16 crc kubenswrapper[4843]: I0318 12:12:16.984262 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:16 crc kubenswrapper[4843]: E0318 12:12:16.984366 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:16 crc kubenswrapper[4843]: E0318 12:12:16.984727 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:16 crc kubenswrapper[4843]: E0318 12:12:16.984843 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:17 crc kubenswrapper[4843]: I0318 12:12:17.006743 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:17Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:17 crc kubenswrapper[4843]: I0318 12:12:17.024588 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:17Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:17 crc kubenswrapper[4843]: I0318 12:12:17.041010 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b99a8424a67312606a51f09b6fe9197761584e8c306318cdb3f765a2aafc3223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"2026-03-18T12:11:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_ceb4b718-46d1-4639-b14c-b5e0b66e4e74\\\\n2026-03-18T12:11:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb4b718-46d1-4639-b14c-b5e0b66e4e74 to /host/opt/cni/bin/\\\\n2026-03-18T12:11:23Z [verbose] multus-daemon started\\\\n2026-03-18T12:11:23Z [verbose] Readiness Indicator file check\\\\n2026-03-18T12:12:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:17Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:17 crc kubenswrapper[4843]: I0318 12:12:17.058607 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd54ffb3-185b-435b-8816-0e68777c3d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c302996fedfc87cb3490362b296791c75b12a3745aaca6f412242139cc1b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4594141b60aa4fc27e9141ee0e03ce44500fc87853300a481b7538bfe47775e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:25Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 12:09:55.993473 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:09:55.994204 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:09:55.995250 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:09:55.996008 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 12:10:25.662059 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 12:10:25.662183 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07fbcf3bceb61d275b47d4d252105653eb996aa8158391b964025a10ba90a75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530d48dac592d94c55bd319d7710b45b9b7b0c0503a95fdbc2617047899996a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5619a8ca389c080d33e727a262f73509b406e3fadd6794735ac7d709b38a52e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:17Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:17 crc kubenswrapper[4843]: E0318 12:12:17.071607 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 12:12:17 crc kubenswrapper[4843]: I0318 12:12:17.077303 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:17Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:17 crc kubenswrapper[4843]: I0318 12:12:17.098409 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:12:17Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:17 crc kubenswrapper[4843]: I0318 12:12:17.113588 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:17Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:17 crc kubenswrapper[4843]: I0318 12:12:17.126262 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:17Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:17 crc kubenswrapper[4843]: I0318 12:12:17.141893 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff7922276527644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:17Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:17 crc kubenswrapper[4843]: I0318 12:12:17.155057 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cba39ba-f80c-4083-8d84-a1c2a9cc7255\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d775ddd85f583bb387bee13d76b0cbbf7e4e045db6976fbade6d2e66bfdc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:17Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:17 crc kubenswrapper[4843]: I0318 12:12:17.170239 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:17Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:17 crc kubenswrapper[4843]: I0318 12:12:17.192689 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:17Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:17 crc kubenswrapper[4843]: I0318 12:12:17.208349 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:17Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:17 crc 
kubenswrapper[4843]: I0318 12:12:17.230807 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:17Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:17 crc kubenswrapper[4843]: I0318 12:12:17.250358 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:17Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:17 crc kubenswrapper[4843]: I0318 12:12:17.266724 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:17Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:17 crc kubenswrapper[4843]: I0318 12:12:17.281900 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:17Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:17 crc kubenswrapper[4843]: I0318 12:12:17.363090 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"ndler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call 
webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z]\\\\nI0318 12:12:01.938416 7170 services_controller.go:434] Service openshift-etcd/etcd retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{etcd openshift-etcd ad0a4b9d-2a7b-4f3f-9020-0c45d515459d 4800 0 2025-02-23 05:11:51 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:etcd] map[operator.openshift.io/spec-hash:0685cfaa0976bfb7ba58513629369c20bf05f4fba36949e982bdb43af328f0e1 prometheus.io/scheme:https prometheus.io/scrape:true service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:etcd,Protocol:TCP,Port:2379,TargetPort:{0 2379 },NodePort:0,AppProtocol:nil\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bc7c6_openshift-ovn-kubernetes(45174fd5-3a94-47fe-81c3-18bd634c4fcf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb43
4e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:17Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:17 crc kubenswrapper[4843]: I0318 12:12:17.377126 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:17Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:17 crc kubenswrapper[4843]: I0318 12:12:17.983561 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:17 crc kubenswrapper[4843]: E0318 12:12:17.983785 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:18 crc kubenswrapper[4843]: I0318 12:12:18.984034 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:18 crc kubenswrapper[4843]: I0318 12:12:18.984156 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:18 crc kubenswrapper[4843]: I0318 12:12:18.984073 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:18 crc kubenswrapper[4843]: E0318 12:12:18.984315 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:18 crc kubenswrapper[4843]: E0318 12:12:18.984421 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:18 crc kubenswrapper[4843]: E0318 12:12:18.984574 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:19 crc kubenswrapper[4843]: I0318 12:12:19.235159 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:12:19 crc kubenswrapper[4843]: I0318 12:12:19.237797 4843 scope.go:117] "RemoveContainer" containerID="410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15" Mar 18 12:12:19 crc kubenswrapper[4843]: E0318 12:12:19.238390 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bc7c6_openshift-ovn-kubernetes(45174fd5-3a94-47fe-81c3-18bd634c4fcf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" Mar 18 12:12:19 crc kubenswrapper[4843]: I0318 12:12:19.983561 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:19 crc kubenswrapper[4843]: E0318 12:12:19.983794 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:20 crc kubenswrapper[4843]: I0318 12:12:20.983221 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:20 crc kubenswrapper[4843]: I0318 12:12:20.983221 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:20 crc kubenswrapper[4843]: E0318 12:12:20.983417 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:20 crc kubenswrapper[4843]: I0318 12:12:20.983246 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:20 crc kubenswrapper[4843]: E0318 12:12:20.983469 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:20 crc kubenswrapper[4843]: E0318 12:12:20.983558 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.920840 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.920891 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.920901 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.920919 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.920940 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:21Z","lastTransitionTime":"2026-03-18T12:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:21 crc kubenswrapper[4843]: E0318 12:12:21.939044 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.945540 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.945578 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.945588 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.945617 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.945630 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:21Z","lastTransitionTime":"2026-03-18T12:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:21 crc kubenswrapper[4843]: E0318 12:12:21.960967 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.968195 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.968246 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.968256 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.968274 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.968290 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:21Z","lastTransitionTime":"2026-03-18T12:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.982916 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:21 crc kubenswrapper[4843]: E0318 12:12:21.983073 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:21 crc kubenswrapper[4843]: E0318 12:12:21.984507 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.989771 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.989799 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.989810 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.989827 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:21 crc kubenswrapper[4843]: I0318 12:12:21.989843 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:21Z","lastTransitionTime":"2026-03-18T12:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:22 crc kubenswrapper[4843]: E0318 12:12:22.006450 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4843]: I0318 12:12:22.010559 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:22 crc kubenswrapper[4843]: I0318 12:12:22.010586 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:22 crc kubenswrapper[4843]: I0318 12:12:22.010595 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:22 crc kubenswrapper[4843]: I0318 12:12:22.010611 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:22 crc kubenswrapper[4843]: I0318 12:12:22.010623 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:22Z","lastTransitionTime":"2026-03-18T12:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:22 crc kubenswrapper[4843]: E0318 12:12:22.025945 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4843]: E0318 12:12:22.026107 4843 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:12:22 crc kubenswrapper[4843]: E0318 12:12:22.073170 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:12:22 crc kubenswrapper[4843]: I0318 12:12:22.983211 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:22 crc kubenswrapper[4843]: I0318 12:12:22.983217 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:22 crc kubenswrapper[4843]: E0318 12:12:22.983380 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:22 crc kubenswrapper[4843]: I0318 12:12:22.983211 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:22 crc kubenswrapper[4843]: E0318 12:12:22.983456 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:22 crc kubenswrapper[4843]: E0318 12:12:22.983515 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:23 crc kubenswrapper[4843]: I0318 12:12:23.983634 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:23 crc kubenswrapper[4843]: E0318 12:12:23.984230 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:24 crc kubenswrapper[4843]: I0318 12:12:24.983819 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:24 crc kubenswrapper[4843]: I0318 12:12:24.983885 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:24 crc kubenswrapper[4843]: E0318 12:12:24.984043 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:24 crc kubenswrapper[4843]: E0318 12:12:24.984125 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:24 crc kubenswrapper[4843]: I0318 12:12:24.983865 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:24 crc kubenswrapper[4843]: E0318 12:12:24.984237 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:25 crc kubenswrapper[4843]: I0318 12:12:25.983311 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:25 crc kubenswrapper[4843]: E0318 12:12:25.983575 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:26 crc kubenswrapper[4843]: I0318 12:12:26.983648 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:26 crc kubenswrapper[4843]: E0318 12:12:26.983862 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:26 crc kubenswrapper[4843]: I0318 12:12:26.983878 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:26 crc kubenswrapper[4843]: I0318 12:12:26.983947 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:26 crc kubenswrapper[4843]: E0318 12:12:26.983966 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:26 crc kubenswrapper[4843]: E0318 12:12:26.984163 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:27 crc kubenswrapper[4843]: I0318 12:12:27.009094 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:27 crc kubenswrapper[4843]: I0318 12:12:27.024927 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:27 crc kubenswrapper[4843]: I0318 12:12:27.039789 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:27 crc kubenswrapper[4843]: I0318 12:12:27.055042 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:27 crc kubenswrapper[4843]: E0318 12:12:27.074090 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 12:12:27 crc kubenswrapper[4843]: I0318 12:12:27.085626 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"ndler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call 
webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z]\\\\nI0318 12:12:01.938416 7170 services_controller.go:434] Service openshift-etcd/etcd retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{etcd openshift-etcd ad0a4b9d-2a7b-4f3f-9020-0c45d515459d 4800 0 2025-02-23 05:11:51 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:etcd] map[operator.openshift.io/spec-hash:0685cfaa0976bfb7ba58513629369c20bf05f4fba36949e982bdb43af328f0e1 prometheus.io/scheme:https prometheus.io/scrape:true service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:etcd,Protocol:TCP,Port:2379,TargetPort:{0 2379 },NodePort:0,AppProtocol:nil\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bc7c6_openshift-ovn-kubernetes(45174fd5-3a94-47fe-81c3-18bd634c4fcf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb43
4e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:27 crc kubenswrapper[4843]: I0318 12:12:27.101973 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:27 crc kubenswrapper[4843]: I0318 12:12:27.116165 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:27 crc 
kubenswrapper[4843]: I0318 12:12:27.134099 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:27 crc kubenswrapper[4843]: I0318 12:12:27.149736 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:27 crc kubenswrapper[4843]: I0318 12:12:27.165106 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b99a8424a67312606a51f09b6fe9197761584e8c306318cdb3f765a2aafc3223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"2026-03-18T12:11:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_ceb4b718-46d1-4639-b14c-b5e0b66e4e74\\\\n2026-03-18T12:11:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb4b718-46d1-4639-b14c-b5e0b66e4e74 to /host/opt/cni/bin/\\\\n2026-03-18T12:11:23Z [verbose] multus-daemon started\\\\n2026-03-18T12:11:23Z [verbose] Readiness Indicator file check\\\\n2026-03-18T12:12:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:27 crc kubenswrapper[4843]: I0318 12:12:27.180009 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd54ffb3-185b-435b-8816-0e68777c3d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c302996fedfc87cb3490362b296791c75b12a3745aaca6f412242139cc1b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4594141b60aa4fc27e9141ee0e03ce44500fc87853300a481b7538bfe47775e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:25Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 12:09:55.993473 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:09:55.994204 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:09:55.995250 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:09:55.996008 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 12:10:25.662059 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 12:10:25.662183 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07fbcf3bceb61d275b47d4d252105653eb996aa8158391b964025a10ba90a75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530d48dac592d94c55bd319d7710b45b9b7b0c0503a95fdbc2617047899996a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5619a8ca389c080d33e727a262f73509b406e3fadd6794735ac7d709b38a52e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:27 crc kubenswrapper[4843]: I0318 12:12:27.193196 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:27 crc kubenswrapper[4843]: I0318 12:12:27.205449 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:12:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:27 crc kubenswrapper[4843]: I0318 12:12:27.219464 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:27 crc kubenswrapper[4843]: I0318 12:12:27.230357 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:27 crc kubenswrapper[4843]: I0318 12:12:27.242895 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff7922276527644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:27 crc kubenswrapper[4843]: I0318 12:12:27.254126 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cba39ba-f80c-4083-8d84-a1c2a9cc7255\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d775ddd85f583bb387bee13d76b0cbbf7e4e045db6976fbade6d2e66bfdc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:27 crc kubenswrapper[4843]: I0318 12:12:27.266615 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:27 crc kubenswrapper[4843]: I0318 12:12:27.282609 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:27Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:27 crc kubenswrapper[4843]: I0318 12:12:27.983357 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:27 crc kubenswrapper[4843]: E0318 12:12:27.983557 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:28 crc kubenswrapper[4843]: I0318 12:12:28.983361 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:28 crc kubenswrapper[4843]: I0318 12:12:28.983401 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:28 crc kubenswrapper[4843]: E0318 12:12:28.983622 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:28 crc kubenswrapper[4843]: I0318 12:12:28.983394 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:28 crc kubenswrapper[4843]: E0318 12:12:28.983821 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:28 crc kubenswrapper[4843]: E0318 12:12:28.984046 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:29 crc kubenswrapper[4843]: I0318 12:12:29.983364 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:29 crc kubenswrapper[4843]: E0318 12:12:29.984025 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:29 crc kubenswrapper[4843]: I0318 12:12:29.984793 4843 scope.go:117] "RemoveContainer" containerID="410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.142208 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovnkube-controller/2.log" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.148342 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerStarted","Data":"3bdbcb1a9e9191c9d4a7ac1e7144ee2cdd03aaa1197ee2b2842d23d84dcfaa77"} Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.149143 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.171981 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.190976 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.207298 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.220054 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.233116 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff7922276527644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.248042 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd54ffb3-185b-435b-8816-0e68777c3d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c302996fedfc87cb3490362b296791c75b12a3745aaca6f412242139cc1b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4594141b60aa4fc27e9141ee0e03ce44500fc87853300a481b7538bfe47775e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:25Z\\\",\\\"message\\\":\\\"+ timeout 
3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 12:09:55.993473 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:09:55.994204 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:09:55.995250 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:09:55.996008 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 12:10:25.662059 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 12:10:25.662183 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07fbcf3bceb61d275b47d4d252105653eb996aa8158391b964025a10ba90a75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530d48dac592d94c55bd319d7710b45b9b7b0c0503a95fdbc2617047899996a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5619a8ca389c080d33e727a262f73509b406e3fadd6794735ac7d709b38a52e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.262149 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.279282 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.292959 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cba39ba-f80c-4083-8d84-a1c2a9cc7255\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d775ddd85f583bb387bee13d76b0cbbf7e4e045db6976fbade6d2e66bfdc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.308967 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e51
13f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.326694 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.341439 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.366243 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bdbcb1a9e9191c9d4a7ac1e7144ee2cdd03aaa1197ee2b2842d23d84dcfaa77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"ndler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call 
webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z]\\\\nI0318 12:12:01.938416 7170 services_controller.go:434] Service openshift-etcd/etcd retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{etcd openshift-etcd ad0a4b9d-2a7b-4f3f-9020-0c45d515459d 4800 0 2025-02-23 05:11:51 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:etcd] map[operator.openshift.io/spec-hash:0685cfaa0976bfb7ba58513629369c20bf05f4fba36949e982bdb43af328f0e1 prometheus.io/scheme:https prometheus.io/scrape:true service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:etcd,Protocol:TCP,Port:2379,TargetPort:{0 2379 
},NodePort:0,AppProtocol:nil\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.384195 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.407309 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc 
kubenswrapper[4843]: I0318 12:12:30.431396 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.449556 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.471485 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b99a8424a67312606a51f09b6fe9197761584e8c306318cdb3f765a2aafc3223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"2026-03-18T12:11:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_ceb4b718-46d1-4639-b14c-b5e0b66e4e74\\\\n2026-03-18T12:11:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb4b718-46d1-4639-b14c-b5e0b66e4e74 to /host/opt/cni/bin/\\\\n2026-03-18T12:11:23Z [verbose] multus-daemon started\\\\n2026-03-18T12:11:23Z [verbose] Readiness Indicator file check\\\\n2026-03-18T12:12:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.501260 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.982994 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.983117 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:30 crc kubenswrapper[4843]: E0318 12:12:30.983179 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:30 crc kubenswrapper[4843]: I0318 12:12:30.983235 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:30 crc kubenswrapper[4843]: E0318 12:12:30.983333 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:30 crc kubenswrapper[4843]: E0318 12:12:30.983462 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:31 crc kubenswrapper[4843]: I0318 12:12:31.983101 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:31 crc kubenswrapper[4843]: E0318 12:12:31.983325 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:32 crc kubenswrapper[4843]: E0318 12:12:32.076029 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.365494 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.365550 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.365560 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.365584 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.365611 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:32Z","lastTransitionTime":"2026-03-18T12:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:32 crc kubenswrapper[4843]: E0318 12:12:32.383113 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.387564 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.387600 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.387613 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.387638 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.387667 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:32Z","lastTransitionTime":"2026-03-18T12:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:32 crc kubenswrapper[4843]: E0318 12:12:32.403686 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.409052 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.409115 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.409134 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.409156 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.409170 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:32Z","lastTransitionTime":"2026-03-18T12:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:32 crc kubenswrapper[4843]: E0318 12:12:32.425058 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.429526 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.429577 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.429587 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.429608 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.429625 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:32Z","lastTransitionTime":"2026-03-18T12:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:32 crc kubenswrapper[4843]: E0318 12:12:32.443547 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.447735 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.447771 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.447781 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.447800 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.447812 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:32Z","lastTransitionTime":"2026-03-18T12:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:32 crc kubenswrapper[4843]: E0318 12:12:32.465327 4843 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0efd8cb9-5707-44e3-a74f-91b5a38b13a0\\\",\\\"systemUUID\\\":\\\"b2f7e1f8-18f3-4a63-825d-e1bf5bd4e89e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: E0318 12:12:32.465459 4843 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.519918 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovnkube-controller/3.log" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.520848 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovnkube-controller/2.log" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.524024 4843 generic.go:334] "Generic (PLEG): container finished" podID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerID="3bdbcb1a9e9191c9d4a7ac1e7144ee2cdd03aaa1197ee2b2842d23d84dcfaa77" exitCode=1 Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.524082 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerDied","Data":"3bdbcb1a9e9191c9d4a7ac1e7144ee2cdd03aaa1197ee2b2842d23d84dcfaa77"} Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.524173 4843 scope.go:117] "RemoveContainer" containerID="410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.525890 4843 scope.go:117] "RemoveContainer" containerID="3bdbcb1a9e9191c9d4a7ac1e7144ee2cdd03aaa1197ee2b2842d23d84dcfaa77" Mar 18 12:12:32 crc kubenswrapper[4843]: E0318 12:12:32.529020 4843 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bc7c6_openshift-ovn-kubernetes(45174fd5-3a94-47fe-81c3-18bd634c4fcf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.541431 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.555311 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc 
kubenswrapper[4843]: I0318 12:12:32.578927 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ade409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.595400 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.611152 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.629779 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.649526 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bdbcb1a9e9191c9d4a7ac1e7144ee2cdd03aaa1197ee2b2842d23d84dcfaa77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"ndler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call 
webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z]\\\\nI0318 12:12:01.938416 7170 services_controller.go:434] Service openshift-etcd/etcd retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{etcd openshift-etcd ad0a4b9d-2a7b-4f3f-9020-0c45d515459d 4800 0 2025-02-23 05:11:51 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:etcd] map[operator.openshift.io/spec-hash:0685cfaa0976bfb7ba58513629369c20bf05f4fba36949e982bdb43af328f0e1 prometheus.io/scheme:https prometheus.io/scrape:true service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:etcd,Protocol:TCP,Port:2379,TargetPort:{0 2379 },NodePort:0,AppProtocol:nil\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bdbcb1a9e9191c9d4a7ac1e7144ee2cdd03aaa1197ee2b2842d23d84dcfaa77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:31Z\\\",\\\"message\\\":\\\"\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 12:12:31.754467 7515 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column 
_uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 12:12:31.752979 7515 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0318 12:12:31.754510 7515 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0318 12:12:31.754518 7515 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0318 12:12:31.753227 7515 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 12:12:31.754524 7515 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0318 12:12:31.754531 7515 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0318 12:12:31.754532 7515 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath
\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 
crc kubenswrapper[4843]: I0318 12:12:32.662012 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.676111 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.689197 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b99a8424a67312606a51f09b6fe9197761584e8c306318cdb3f765a2aafc3223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"2026-03-18T12:11:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_ceb4b718-46d1-4639-b14c-b5e0b66e4e74\\\\n2026-03-18T12:11:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb4b718-46d1-4639-b14c-b5e0b66e4e74 to /host/opt/cni/bin/\\\\n2026-03-18T12:11:23Z [verbose] multus-daemon started\\\\n2026-03-18T12:11:23Z [verbose] Readiness Indicator file check\\\\n2026-03-18T12:12:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.701520 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff79222765
27644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.716825 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd54ffb3-185b-435b-8816-0e68777c3d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c302996fedfc87cb3490362b296791c75b12a3745aaca6f412242139cc1b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4594141b60aa4fc27e9141ee0e03ce44500fc87853300a481b7538bfe47775e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:25Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 12:09:55.993473 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:09:55.994204 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:09:55.995250 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:09:55.996008 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 12:10:25.662059 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 12:10:25.662183 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07fbcf3bceb61d275b47d4d252105653eb996aa8158391b964025a10ba90a75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530d48dac592d94c55bd319d7710b45b9b7b0c0503a95fdbc2617047899996a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5619a8ca389c080d33e727a262f73509b406e3fadd6794735ac7d709b38a52e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.729794 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.742914 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.756427 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d68
27ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.767074 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.777839 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cba39ba-f80c-4083-8d84-a1c2a9cc7255\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d775ddd85f583bb387bee13d76b0cbbf7e4e045db6976fbade6d2e66bfdc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.793449 4843 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.809203 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:32Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.983114 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.983369 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:32 crc kubenswrapper[4843]: I0318 12:12:32.983401 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:32 crc kubenswrapper[4843]: E0318 12:12:32.983543 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:32 crc kubenswrapper[4843]: E0318 12:12:32.983875 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:32 crc kubenswrapper[4843]: E0318 12:12:32.984020 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:33 crc kubenswrapper[4843]: I0318 12:12:33.528707 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovnkube-controller/3.log" Mar 18 12:12:33 crc kubenswrapper[4843]: I0318 12:12:33.983561 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:33 crc kubenswrapper[4843]: E0318 12:12:33.983879 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:34 crc kubenswrapper[4843]: I0318 12:12:34.983056 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:34 crc kubenswrapper[4843]: I0318 12:12:34.983112 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:34 crc kubenswrapper[4843]: I0318 12:12:34.983164 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:34 crc kubenswrapper[4843]: E0318 12:12:34.983434 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:34 crc kubenswrapper[4843]: E0318 12:12:34.983562 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:34 crc kubenswrapper[4843]: E0318 12:12:34.983808 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:35 crc kubenswrapper[4843]: I0318 12:12:35.982919 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:35 crc kubenswrapper[4843]: E0318 12:12:35.983195 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:36 crc kubenswrapper[4843]: I0318 12:12:36.128818 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs\") pod \"network-metrics-daemon-sn986\" (UID: \"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\") " pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:36 crc kubenswrapper[4843]: E0318 12:12:36.129044 4843 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:12:36 crc kubenswrapper[4843]: E0318 12:12:36.129131 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs podName:62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c nodeName:}" failed. No retries permitted until 2026-03-18 12:13:40.129107091 +0000 UTC m=+253.844932615 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs") pod "network-metrics-daemon-sn986" (UID: "62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:12:36 crc kubenswrapper[4843]: I0318 12:12:36.983628 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:36 crc kubenswrapper[4843]: I0318 12:12:36.983710 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:36 crc kubenswrapper[4843]: I0318 12:12:36.983723 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:36 crc kubenswrapper[4843]: E0318 12:12:36.984234 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:36 crc kubenswrapper[4843]: E0318 12:12:36.984322 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:36 crc kubenswrapper[4843]: E0318 12:12:36.984174 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:37 crc kubenswrapper[4843]: I0318 12:12:37.003282 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:37 crc kubenswrapper[4843]: I0318 12:12:37.018588 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beebb9a6b23a3b73a81b13710b9d7d5e84c93c1c375f0ae82e0636d3467311e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:37 crc kubenswrapper[4843]: I0318 12:12:37.033330 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ls8r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa16ddb-306b-4e37-a33a-b9cdce3c254e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b99a8424a67312606a51f09b6fe9197761584e8c306318cdb3f765a2aafc3223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"2026-03-18T12:11:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_ceb4b718-46d1-4639-b14c-b5e0b66e4e74\\\\n2026-03-18T12:11:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb4b718-46d1-4639-b14c-b5e0b66e4e74 to /host/opt/cni/bin/\\\\n2026-03-18T12:11:23Z [verbose] multus-daemon started\\\\n2026-03-18T12:11:23Z [verbose] Readiness Indicator file check\\\\n2026-03-18T12:12:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42zs5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ls8r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:37 crc kubenswrapper[4843]: I0318 12:12:37.048624 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd54ffb3-185b-435b-8816-0e68777c3d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c302996fedfc87cb3490362b296791c75b12a3745aaca6f412242139cc1b650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4594141b60aa4fc27e9141ee0e03ce44500fc87853300a481b7538bfe47775e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:25Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 12:09:55.993473 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:09:55.994204 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:09:55.995250 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:09:55.996008 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 12:10:25.662059 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 12:10:25.662183 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:55Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07fbcf3bceb61d275b47d4d252105653eb996aa8158391b964025a10ba90a75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530d48dac592d94c55bd319d7710b45b9b7b0c0503a95fdbc2617047899996a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5619a8ca389c080d33e727a262f73509b406e3fadd6794735ac7d709b38a52e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:37 crc kubenswrapper[4843]: I0318 12:12:37.062606 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"365333ab-ed4e-45c0-b03d-7c342fe43cfd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb2a76846b272d785e4c9bbdd20d39046e2e32553b135950822d7b01b06a929d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9b5c1000d578a0a686b8535963061513e543d030b77e8d2988ad160c366ff96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d02a10fed73dae14e13cc894ca1d1f8f55f8964c4ff88027aa60d344650a2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7824ca88638f949a38cdfaea5651ff226b86f84519bb8e01296f5cb6374de2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:37 crc kubenswrapper[4843]: E0318 12:12:37.076735 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 18 12:12:37 crc kubenswrapper[4843]: I0318 12:12:37.080818 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc57899f90899e409d621b14c6a396ddac85eb80f4d9c78fe9a14ca631f08a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:37 crc kubenswrapper[4843]: I0318 12:12:37.096686 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f7cb654987bb38e3695d3fcfb6ef6c50e883c65a74cc2c599020b521d48985d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://232e09525e14d6827ed7ea29c87c5c0e1e6906b86722b1fa735d75b910e2cdc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:37 crc kubenswrapper[4843]: I0318 12:12:37.109619 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-csgs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"809fa601-6b32-4585-a41a-646cc883bcd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f00e2d7c9d04af9a939ac0d61ea623e360fb3424f64c4e327fddc75bbac3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7r2pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-csgs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:37 crc kubenswrapper[4843]: I0318 12:12:37.121623 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20c29405-fb3c-4f34-b5c1-ff6d4745d0d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24442ddc75a1aa526ccf1f10e240fcfbe35b53547a87b40fb3a81fbad74248aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef99cca747828d660873dde102ff7922276527644b13a0dab51c6e3d4892266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9vfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f46tt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:37 crc kubenswrapper[4843]: I0318 12:12:37.133311 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cba39ba-f80c-4083-8d84-a1c2a9cc7255\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a66d775ddd85f583bb387bee13d76b0cbbf7e4e045db6976fbade6d2e66bfdc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e70475738802558290a41be0c03750793afa9727b762b69f0f1b95920493b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:37 crc kubenswrapper[4843]: I0318 12:12:37.147357 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:37 crc kubenswrapper[4843]: I0318 12:12:37.165074 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c88bc92-4b87-46ed-ab45-6291502efbfe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b211e790ebf41b3e31329f9e7bc2745ff904a7a0d184a2cff2c5f6a0e33029\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fa3e936c6702a3ea3a57eb317bff7e26fd5bc526c52006849e160b9bebced22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87729c97185df648e4b273943bae980467b65e14be7df1fff16615948e714e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab21727fd0ca3cda1bcc51806d6d6cf231f9fd79c19f5580a5f9362fbbc3658b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82bd
c13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82bdc13252e88cb262266e31a1a62812376337ffe2af94cf7a45ce414d765ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be0c21e73e6beb64df1eb39bf630972529e67789b801f4eddac9c8d36c53144d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595ca001cb275ad673520c7ecd6de0f9a5afe01c85143bc1c1af10c95b17d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mln2h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2hn9k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:37 crc kubenswrapper[4843]: I0318 12:12:37.189635 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d645946-8ad4-4d8e-ac16-7e0fb6acfd56\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e084239b9843bf9a932e043addd565654276ed62b499934c69695ce4ea8a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab997aed0454845d43777898b8c9a4373bfb2928163e3d033bd4f6e6b4031c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6dd674ae84674c6192b6015aedf931bfaa2e03252bf30e489269fdde50dd59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b379853d7abffb1ad
e409fa9c1dbe18097de0e92aa6b37739ee8b99565ed9fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf6b48e16a5ca094a7eeeb3ff149345f9465ceeec34169d2c3c9c8ee103ccf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://561f608209593ebc6de5b6387f2e6271942475cd1f5617e6fa54c0222baf66b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21fd0a5c388096df7060bfbcb1bfb8763934cd20cf11f3fbc0f513e0e7523ede\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dae9efd03a433337f45d13813921c155c70ec417a69dd135d173166c6543bb30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:37 crc kubenswrapper[4843]: I0318 12:12:37.204991 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f49b56c-dbaf-4a37-9508-d8e894da9149\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:27Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:10:26.884922 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:26.885144 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:26.886290 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-837834987/tls.crt::/tmp/serving-cert-837834987/tls.key\\\\\\\"\\\\nI0318 12:10:27.163690 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:27.169854 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:27.169893 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:27.169923 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:27.169930 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:27.177636 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:27.177698 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177706 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:27.177713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:27.177717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:27.177721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:27.177726 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:27.178050 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:10:27.202679 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2237133ebd4da97cc22fe3d945ec85e51
13f33d066d1fa81663723ab9ee6c8c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:37 crc kubenswrapper[4843]: I0318 12:12:37.219977 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:37 crc kubenswrapper[4843]: I0318 12:12:37.233097 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5a185c4-48ac-4f51-99be-0a9418d9e53f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137ac24b3e6a4c2c2280b323d5d8715646954929b969e04d4bd118bcfcfffbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df
70830e0f112a5a3efc115651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkw6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wstcq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:37 crc kubenswrapper[4843]: I0318 12:12:37.253808 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45174fd5-3a94-47fe-81c3-18bd634c4fcf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bdbcb1a9e9191c9d4a7ac1e7144ee2cdd03aaa1197ee2b2842d23d84dcfaa77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410ca9d0f0ba26ee79860e7b3ef64921b741a4f1b1299fe5336191009dae2d15\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"ndler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call 
webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z]\\\\nI0318 12:12:01.938416 7170 services_controller.go:434] Service openshift-etcd/etcd retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{etcd openshift-etcd ad0a4b9d-2a7b-4f3f-9020-0c45d515459d 4800 0 2025-02-23 05:11:51 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:etcd] map[operator.openshift.io/spec-hash:0685cfaa0976bfb7ba58513629369c20bf05f4fba36949e982bdb43af328f0e1 prometheus.io/scheme:https prometheus.io/scrape:true service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:etcd,Protocol:TCP,Port:2379,TargetPort:{0 2379 },NodePort:0,AppProtocol:nil\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bdbcb1a9e9191c9d4a7ac1e7144ee2cdd03aaa1197ee2b2842d23d84dcfaa77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:31Z\\\",\\\"message\\\":\\\"\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 12:12:31.754467 7515 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column 
_uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 12:12:31.752979 7515 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0318 12:12:31.754510 7515 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0318 12:12:31.754518 7515 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0318 12:12:31.753227 7515 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 12:12:31.754524 7515 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0318 12:12:31.754531 7515 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0318 12:12:31.754532 7515 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath
\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84n7m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bc7c6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:37 
crc kubenswrapper[4843]: I0318 12:12:37.265820 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tk5kw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e2b55bb-76e9-4e93-95fd-063957b71f6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ad65cb00e8e670bc91b9ea0daf8c22848b5abe8f01fc5f211458d166759fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acce
ss-vtlpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tk5kw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:37Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:37 crc kubenswrapper[4843]: I0318 12:12:37.280185 4843 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sn986" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbszd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:32Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sn986\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:37Z is after 2025-08-24T17:21:41Z" Mar 
18 12:12:37 crc kubenswrapper[4843]: I0318 12:12:37.983254 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986"
Mar 18 12:12:37 crc kubenswrapper[4843]: E0318 12:12:37.984054 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c"
Mar 18 12:12:38 crc kubenswrapper[4843]: I0318 12:12:38.982857 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 12:12:38 crc kubenswrapper[4843]: I0318 12:12:38.982901 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 12:12:38 crc kubenswrapper[4843]: E0318 12:12:38.983013 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 12:12:38 crc kubenswrapper[4843]: I0318 12:12:38.983077 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 12:12:38 crc kubenswrapper[4843]: E0318 12:12:38.983698 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 12:12:38 crc kubenswrapper[4843]: E0318 12:12:38.983781 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 12:12:39 crc kubenswrapper[4843]: I0318 12:12:39.983741 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986"
Mar 18 12:12:39 crc kubenswrapper[4843]: E0318 12:12:39.983972 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c"
Mar 18 12:12:40 crc kubenswrapper[4843]: I0318 12:12:40.982890 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 12:12:40 crc kubenswrapper[4843]: I0318 12:12:40.982910 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 12:12:40 crc kubenswrapper[4843]: I0318 12:12:40.982935 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 12:12:40 crc kubenswrapper[4843]: E0318 12:12:40.983218 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 12:12:40 crc kubenswrapper[4843]: E0318 12:12:40.983142 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 12:12:40 crc kubenswrapper[4843]: E0318 12:12:40.983313 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 12:12:41 crc kubenswrapper[4843]: I0318 12:12:41.983088 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986"
Mar 18 12:12:41 crc kubenswrapper[4843]: E0318 12:12:41.983259 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c"
Mar 18 12:12:42 crc kubenswrapper[4843]: E0318 12:12:42.078468 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 18 12:12:42 crc kubenswrapper[4843]: I0318 12:12:42.792210 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:12:42 crc kubenswrapper[4843]: I0318 12:12:42.792290 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:12:42 crc kubenswrapper[4843]: I0318 12:12:42.792301 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:12:42 crc kubenswrapper[4843]: I0318 12:12:42.792319 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:12:42 crc kubenswrapper[4843]: I0318 12:12:42.792331 4843 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:42Z","lastTransitionTime":"2026-03-18T12:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 12:12:42 crc kubenswrapper[4843]: I0318 12:12:42.854714 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4"]
Mar 18 12:12:42 crc kubenswrapper[4843]: I0318 12:12:42.855315 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4"
Mar 18 12:12:42 crc kubenswrapper[4843]: I0318 12:12:42.857635 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 18 12:12:42 crc kubenswrapper[4843]: I0318 12:12:42.857971 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 18 12:12:42 crc kubenswrapper[4843]: I0318 12:12:42.858073 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 18 12:12:42 crc kubenswrapper[4843]: I0318 12:12:42.859506 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 18 12:12:42 crc kubenswrapper[4843]: I0318 12:12:42.913045 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f46tt" podStartSLOduration=132.913017766 podStartE2EDuration="2m12.913017766s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:42.910875807 +0000 UTC m=+196.626701351" watchObservedRunningTime="2026-03-18 12:12:42.913017766 +0000 UTC m=+196.628843290"
Mar 18 12:12:42 crc kubenswrapper[4843]: I0318 12:12:42.913250 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-csgs2" podStartSLOduration=133.913246402 podStartE2EDuration="2m13.913246402s" podCreationTimestamp="2026-03-18 12:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:42.895571803 +0000 UTC m=+196.611397337" watchObservedRunningTime="2026-03-18 12:12:42.913246402 +0000 UTC 
m=+196.629071926"
Mar 18 12:12:42 crc kubenswrapper[4843]: I0318 12:12:42.932025 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=57.931997202 podStartE2EDuration="57.931997202s" podCreationTimestamp="2026-03-18 12:11:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:42.931635182 +0000 UTC m=+196.647460726" watchObservedRunningTime="2026-03-18 12:12:42.931997202 +0000 UTC m=+196.647822726"
Mar 18 12:12:42 crc kubenswrapper[4843]: I0318 12:12:42.962488 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=73.962464235 podStartE2EDuration="1m13.962464235s" podCreationTimestamp="2026-03-18 12:11:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:42.955221795 +0000 UTC m=+196.671047339" watchObservedRunningTime="2026-03-18 12:12:42.962464235 +0000 UTC m=+196.678289769"
Mar 18 12:12:42 crc kubenswrapper[4843]: I0318 12:12:42.983496 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 12:12:42 crc kubenswrapper[4843]: I0318 12:12:42.983586 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 12:12:42 crc kubenswrapper[4843]: I0318 12:12:42.983603 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 12:12:42 crc kubenswrapper[4843]: E0318 12:12:42.983929 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 12:12:42 crc kubenswrapper[4843]: E0318 12:12:42.984106 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 12:12:42 crc kubenswrapper[4843]: E0318 12:12:42.984253 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 12:12:42 crc kubenswrapper[4843]: I0318 12:12:42.984287 4843 scope.go:117] "RemoveContainer" containerID="3bdbcb1a9e9191c9d4a7ac1e7144ee2cdd03aaa1197ee2b2842d23d84dcfaa77"
Mar 18 12:12:42 crc kubenswrapper[4843]: E0318 12:12:42.984453 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bc7c6_openshift-ovn-kubernetes(45174fd5-3a94-47fe-81c3-18bd634c4fcf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf"
Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.011283 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=46.011262667 podStartE2EDuration="46.011262667s" podCreationTimestamp="2026-03-18 12:11:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:43.010384173 +0000 UTC m=+196.726209697" watchObservedRunningTime="2026-03-18 12:12:43.011262667 +0000 UTC m=+196.727088191"
Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.011331 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6923078-b23e-4be4-99d9-258f7e32b75a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7wxh4\" (UID: \"f6923078-b23e-4be4-99d9-258f7e32b75a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4"
Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.011396 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/f6923078-b23e-4be4-99d9-258f7e32b75a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7wxh4\" (UID: \"f6923078-b23e-4be4-99d9-258f7e32b75a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4"
Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.011425 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f6923078-b23e-4be4-99d9-258f7e32b75a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7wxh4\" (UID: \"f6923078-b23e-4be4-99d9-258f7e32b75a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4"
Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.011453 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f6923078-b23e-4be4-99d9-258f7e32b75a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7wxh4\" (UID: \"f6923078-b23e-4be4-99d9-258f7e32b75a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4"
Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.011503 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6923078-b23e-4be4-99d9-258f7e32b75a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7wxh4\" (UID: \"f6923078-b23e-4be4-99d9-258f7e32b75a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4"
Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.076004 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podStartSLOduration=134.07597629 podStartE2EDuration="2m14.07597629s" podCreationTimestamp="2026-03-18 12:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:43.074003025 +0000 UTC m=+196.789828549" watchObservedRunningTime="2026-03-18 12:12:43.07597629 +0000 UTC m=+196.791801814"
Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.076126 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2hn9k" podStartSLOduration=134.076122504 podStartE2EDuration="2m14.076122504s" podCreationTimestamp="2026-03-18 12:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:43.047977194 +0000 UTC m=+196.763802718" watchObservedRunningTime="2026-03-18 12:12:43.076122504 +0000 UTC m=+196.791948028"
Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.112399 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6923078-b23e-4be4-99d9-258f7e32b75a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7wxh4\" (UID: \"f6923078-b23e-4be4-99d9-258f7e32b75a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4"
Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.112472 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6923078-b23e-4be4-99d9-258f7e32b75a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7wxh4\" (UID: \"f6923078-b23e-4be4-99d9-258f7e32b75a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4"
Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.112496 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f6923078-b23e-4be4-99d9-258f7e32b75a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7wxh4\" (UID: 
\"f6923078-b23e-4be4-99d9-258f7e32b75a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4" Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.112587 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f6923078-b23e-4be4-99d9-258f7e32b75a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7wxh4\" (UID: \"f6923078-b23e-4be4-99d9-258f7e32b75a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4" Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.112909 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f6923078-b23e-4be4-99d9-258f7e32b75a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7wxh4\" (UID: \"f6923078-b23e-4be4-99d9-258f7e32b75a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4" Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.113089 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f6923078-b23e-4be4-99d9-258f7e32b75a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7wxh4\" (UID: \"f6923078-b23e-4be4-99d9-258f7e32b75a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4" Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.113171 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6923078-b23e-4be4-99d9-258f7e32b75a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7wxh4\" (UID: \"f6923078-b23e-4be4-99d9-258f7e32b75a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4" Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.113336 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/f6923078-b23e-4be4-99d9-258f7e32b75a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7wxh4\" (UID: \"f6923078-b23e-4be4-99d9-258f7e32b75a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4" Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.121283 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6923078-b23e-4be4-99d9-258f7e32b75a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7wxh4\" (UID: \"f6923078-b23e-4be4-99d9-258f7e32b75a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4" Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.140827 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6923078-b23e-4be4-99d9-258f7e32b75a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7wxh4\" (UID: \"f6923078-b23e-4be4-99d9-258f7e32b75a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4" Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.173840 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4" Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.207738 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tk5kw" podStartSLOduration=134.207716809 podStartE2EDuration="2m14.207716809s" podCreationTimestamp="2026-03-18 12:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:43.207135243 +0000 UTC m=+196.922960767" watchObservedRunningTime="2026-03-18 12:12:43.207716809 +0000 UTC m=+196.923542343" Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.260331 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=101.260314076 podStartE2EDuration="1m41.260314076s" podCreationTimestamp="2026-03-18 12:11:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:43.259199465 +0000 UTC m=+196.975024989" watchObservedRunningTime="2026-03-18 12:12:43.260314076 +0000 UTC m=+196.976139600" Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.282645 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=106.282625124 podStartE2EDuration="1m46.282625124s" podCreationTimestamp="2026-03-18 12:10:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:43.281610276 +0000 UTC m=+196.997435800" watchObservedRunningTime="2026-03-18 12:12:43.282625124 +0000 UTC m=+196.998450668" Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.385795 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ls8r8" 
podStartSLOduration=134.385772441 podStartE2EDuration="2m14.385772441s" podCreationTimestamp="2026-03-18 12:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:43.35505393 +0000 UTC m=+197.070879454" watchObservedRunningTime="2026-03-18 12:12:43.385772441 +0000 UTC m=+197.101597965" Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.518405 4843 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.529711 4843 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.566547 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4" event={"ID":"f6923078-b23e-4be4-99d9-258f7e32b75a","Type":"ContainerStarted","Data":"6881e06b5c31f2a298fc7b02fda234d348e13bb33c5c95fbf1713c4829b6a967"} Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.566616 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4" event={"ID":"f6923078-b23e-4be4-99d9-258f7e32b75a","Type":"ContainerStarted","Data":"a0c3ab448e63c30274005367f998894bcdedd65297b2800ed5088cbc51dbbb41"} Mar 18 12:12:43 crc kubenswrapper[4843]: I0318 12:12:43.983868 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:43 crc kubenswrapper[4843]: E0318 12:12:43.984094 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:44 crc kubenswrapper[4843]: I0318 12:12:44.983616 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:44 crc kubenswrapper[4843]: I0318 12:12:44.983694 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:44 crc kubenswrapper[4843]: E0318 12:12:44.983798 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:44 crc kubenswrapper[4843]: I0318 12:12:44.983862 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:44 crc kubenswrapper[4843]: E0318 12:12:44.984023 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:44 crc kubenswrapper[4843]: E0318 12:12:44.984236 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:45 crc kubenswrapper[4843]: I0318 12:12:45.983545 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:45 crc kubenswrapper[4843]: E0318 12:12:45.984324 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:46 crc kubenswrapper[4843]: I0318 12:12:46.983481 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:46 crc kubenswrapper[4843]: E0318 12:12:46.984725 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:46 crc kubenswrapper[4843]: I0318 12:12:46.984807 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:46 crc kubenswrapper[4843]: E0318 12:12:46.985101 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:46 crc kubenswrapper[4843]: I0318 12:12:46.984802 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:46 crc kubenswrapper[4843]: E0318 12:12:46.985174 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:47 crc kubenswrapper[4843]: E0318 12:12:47.079114 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:12:47 crc kubenswrapper[4843]: I0318 12:12:47.982642 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:47 crc kubenswrapper[4843]: E0318 12:12:47.983002 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:48 crc kubenswrapper[4843]: I0318 12:12:48.983613 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:48 crc kubenswrapper[4843]: I0318 12:12:48.983627 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:48 crc kubenswrapper[4843]: E0318 12:12:48.984918 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:48 crc kubenswrapper[4843]: E0318 12:12:48.984977 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:48 crc kubenswrapper[4843]: I0318 12:12:48.983848 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:48 crc kubenswrapper[4843]: E0318 12:12:48.985042 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:49 crc kubenswrapper[4843]: I0318 12:12:49.983699 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:49 crc kubenswrapper[4843]: E0318 12:12:49.983982 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:50 crc kubenswrapper[4843]: I0318 12:12:50.983122 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:50 crc kubenswrapper[4843]: I0318 12:12:50.983153 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:50 crc kubenswrapper[4843]: I0318 12:12:50.983184 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:50 crc kubenswrapper[4843]: E0318 12:12:50.983867 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:50 crc kubenswrapper[4843]: E0318 12:12:50.983921 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:50 crc kubenswrapper[4843]: E0318 12:12:50.983972 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:51 crc kubenswrapper[4843]: I0318 12:12:51.983561 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:51 crc kubenswrapper[4843]: E0318 12:12:51.983803 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:52 crc kubenswrapper[4843]: E0318 12:12:52.080340 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:12:52 crc kubenswrapper[4843]: I0318 12:12:52.983118 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:52 crc kubenswrapper[4843]: I0318 12:12:52.983224 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:52 crc kubenswrapper[4843]: I0318 12:12:52.983292 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:52 crc kubenswrapper[4843]: E0318 12:12:52.983337 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:52 crc kubenswrapper[4843]: E0318 12:12:52.983368 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:52 crc kubenswrapper[4843]: E0318 12:12:52.983465 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:53 crc kubenswrapper[4843]: I0318 12:12:53.983765 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:53 crc kubenswrapper[4843]: E0318 12:12:53.983998 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:54 crc kubenswrapper[4843]: I0318 12:12:54.983030 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:54 crc kubenswrapper[4843]: I0318 12:12:54.983180 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:54 crc kubenswrapper[4843]: E0318 12:12:54.983236 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:54 crc kubenswrapper[4843]: E0318 12:12:54.983490 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:54 crc kubenswrapper[4843]: I0318 12:12:54.983676 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:54 crc kubenswrapper[4843]: E0318 12:12:54.983744 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:55 crc kubenswrapper[4843]: I0318 12:12:55.609029 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ls8r8_1aa16ddb-306b-4e37-a33a-b9cdce3c254e/kube-multus/1.log" Mar 18 12:12:55 crc kubenswrapper[4843]: I0318 12:12:55.609544 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ls8r8_1aa16ddb-306b-4e37-a33a-b9cdce3c254e/kube-multus/0.log" Mar 18 12:12:55 crc kubenswrapper[4843]: I0318 12:12:55.609618 4843 generic.go:334] "Generic (PLEG): container finished" podID="1aa16ddb-306b-4e37-a33a-b9cdce3c254e" containerID="b99a8424a67312606a51f09b6fe9197761584e8c306318cdb3f765a2aafc3223" exitCode=1 Mar 18 12:12:55 crc kubenswrapper[4843]: I0318 12:12:55.609685 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ls8r8" event={"ID":"1aa16ddb-306b-4e37-a33a-b9cdce3c254e","Type":"ContainerDied","Data":"b99a8424a67312606a51f09b6fe9197761584e8c306318cdb3f765a2aafc3223"} Mar 18 12:12:55 crc kubenswrapper[4843]: I0318 12:12:55.610464 4843 scope.go:117] "RemoveContainer" containerID="ae18736c42959756112b13bd2eb28d8d0820a4c54f4098d22db434e0be4a529d" Mar 18 12:12:55 crc kubenswrapper[4843]: I0318 12:12:55.610616 4843 scope.go:117] "RemoveContainer" containerID="b99a8424a67312606a51f09b6fe9197761584e8c306318cdb3f765a2aafc3223" Mar 18 12:12:55 crc kubenswrapper[4843]: E0318 12:12:55.611081 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-ls8r8_openshift-multus(1aa16ddb-306b-4e37-a33a-b9cdce3c254e)\"" pod="openshift-multus/multus-ls8r8" podUID="1aa16ddb-306b-4e37-a33a-b9cdce3c254e" Mar 18 12:12:55 crc kubenswrapper[4843]: I0318 12:12:55.636126 4843 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7wxh4" podStartSLOduration=146.636093408 podStartE2EDuration="2m26.636093408s" podCreationTimestamp="2026-03-18 12:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:43.584488556 +0000 UTC m=+197.300314080" watchObservedRunningTime="2026-03-18 12:12:55.636093408 +0000 UTC m=+209.351918982" Mar 18 12:12:55 crc kubenswrapper[4843]: I0318 12:12:55.983686 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:55 crc kubenswrapper[4843]: E0318 12:12:55.984256 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:56 crc kubenswrapper[4843]: I0318 12:12:56.615169 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ls8r8_1aa16ddb-306b-4e37-a33a-b9cdce3c254e/kube-multus/1.log" Mar 18 12:12:56 crc kubenswrapper[4843]: I0318 12:12:56.984097 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:56 crc kubenswrapper[4843]: I0318 12:12:56.985383 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:56 crc kubenswrapper[4843]: E0318 12:12:56.985369 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:56 crc kubenswrapper[4843]: I0318 12:12:56.985491 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:56 crc kubenswrapper[4843]: E0318 12:12:56.985986 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:56 crc kubenswrapper[4843]: E0318 12:12:56.986122 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:57 crc kubenswrapper[4843]: E0318 12:12:57.081327 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:12:57 crc kubenswrapper[4843]: I0318 12:12:57.983822 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:57 crc kubenswrapper[4843]: E0318 12:12:57.984019 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:12:57 crc kubenswrapper[4843]: I0318 12:12:57.985737 4843 scope.go:117] "RemoveContainer" containerID="3bdbcb1a9e9191c9d4a7ac1e7144ee2cdd03aaa1197ee2b2842d23d84dcfaa77" Mar 18 12:12:57 crc kubenswrapper[4843]: E0318 12:12:57.986023 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bc7c6_openshift-ovn-kubernetes(45174fd5-3a94-47fe-81c3-18bd634c4fcf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" Mar 18 12:12:58 crc kubenswrapper[4843]: I0318 12:12:58.983156 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:58 crc kubenswrapper[4843]: I0318 12:12:58.983207 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:58 crc kubenswrapper[4843]: I0318 12:12:58.983261 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:58 crc kubenswrapper[4843]: E0318 12:12:58.983726 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:58 crc kubenswrapper[4843]: E0318 12:12:58.983873 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:58 crc kubenswrapper[4843]: E0318 12:12:58.984004 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:59 crc kubenswrapper[4843]: I0318 12:12:59.982946 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:12:59 crc kubenswrapper[4843]: E0318 12:12:59.983135 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:13:00 crc kubenswrapper[4843]: I0318 12:13:00.983504 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:00 crc kubenswrapper[4843]: E0318 12:13:00.983672 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:00 crc kubenswrapper[4843]: I0318 12:13:00.983699 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:00 crc kubenswrapper[4843]: E0318 12:13:00.983824 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:00 crc kubenswrapper[4843]: I0318 12:13:00.984310 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:00 crc kubenswrapper[4843]: E0318 12:13:00.984505 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:01 crc kubenswrapper[4843]: I0318 12:13:01.983281 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:13:01 crc kubenswrapper[4843]: E0318 12:13:01.983452 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:13:02 crc kubenswrapper[4843]: E0318 12:13:02.082817 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:13:02 crc kubenswrapper[4843]: I0318 12:13:02.983433 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:02 crc kubenswrapper[4843]: I0318 12:13:02.983627 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:02 crc kubenswrapper[4843]: I0318 12:13:02.983705 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:02 crc kubenswrapper[4843]: E0318 12:13:02.983777 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:02 crc kubenswrapper[4843]: E0318 12:13:02.983950 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:02 crc kubenswrapper[4843]: E0318 12:13:02.983980 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:03 crc kubenswrapper[4843]: I0318 12:13:03.109766 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:03 crc kubenswrapper[4843]: I0318 12:13:03.109924 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:03 crc kubenswrapper[4843]: E0318 12:13:03.109964 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:15:05.109931111 +0000 UTC m=+338.825756655 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:03 crc kubenswrapper[4843]: I0318 12:13:03.110018 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:03 crc kubenswrapper[4843]: I0318 12:13:03.110057 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:03 crc kubenswrapper[4843]: E0318 12:13:03.110099 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:13:03 crc kubenswrapper[4843]: E0318 12:13:03.110127 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:13:03 crc kubenswrapper[4843]: E0318 12:13:03.110146 4843 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:13:03 crc kubenswrapper[4843]: E0318 12:13:03.110185 4843 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:13:03 crc kubenswrapper[4843]: E0318 12:13:03.110185 4843 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:13:03 crc kubenswrapper[4843]: I0318 12:13:03.110104 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:03 crc kubenswrapper[4843]: E0318 12:13:03.110185 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:13:03 crc kubenswrapper[4843]: E0318 12:13:03.110258 4843 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:13:03 crc kubenswrapper[4843]: E0318 12:13:03.110272 4843 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:13:03 
crc kubenswrapper[4843]: E0318 12:13:03.110198 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:15:05.110183938 +0000 UTC m=+338.826009482 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:13:03 crc kubenswrapper[4843]: E0318 12:13:03.110326 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:15:05.110289671 +0000 UTC m=+338.826115215 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:13:03 crc kubenswrapper[4843]: E0318 12:13:03.110349 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:15:05.110338103 +0000 UTC m=+338.826163647 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:13:03 crc kubenswrapper[4843]: E0318 12:13:03.110368 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:15:05.110359433 +0000 UTC m=+338.826184977 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:13:03 crc kubenswrapper[4843]: I0318 12:13:03.983063 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:13:03 crc kubenswrapper[4843]: E0318 12:13:03.983265 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:13:04 crc kubenswrapper[4843]: I0318 12:13:04.983587 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:04 crc kubenswrapper[4843]: E0318 12:13:04.983815 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:04 crc kubenswrapper[4843]: I0318 12:13:04.984111 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:04 crc kubenswrapper[4843]: I0318 12:13:04.984208 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:04 crc kubenswrapper[4843]: E0318 12:13:04.984326 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:04 crc kubenswrapper[4843]: E0318 12:13:04.984597 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:05 crc kubenswrapper[4843]: I0318 12:13:05.983220 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:13:05 crc kubenswrapper[4843]: E0318 12:13:05.983439 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:13:06 crc kubenswrapper[4843]: I0318 12:13:06.983527 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:06 crc kubenswrapper[4843]: I0318 12:13:06.983567 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:06 crc kubenswrapper[4843]: I0318 12:13:06.983527 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:06 crc kubenswrapper[4843]: E0318 12:13:06.984837 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:06 crc kubenswrapper[4843]: E0318 12:13:06.985086 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:06 crc kubenswrapper[4843]: E0318 12:13:06.985259 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:07 crc kubenswrapper[4843]: E0318 12:13:07.083397 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:13:07 crc kubenswrapper[4843]: I0318 12:13:07.982912 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:13:07 crc kubenswrapper[4843]: E0318 12:13:07.983099 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:13:08 crc kubenswrapper[4843]: I0318 12:13:08.983779 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:08 crc kubenswrapper[4843]: I0318 12:13:08.983899 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:08 crc kubenswrapper[4843]: I0318 12:13:08.984016 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:08 crc kubenswrapper[4843]: E0318 12:13:08.983945 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:08 crc kubenswrapper[4843]: E0318 12:13:08.984159 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:08 crc kubenswrapper[4843]: E0318 12:13:08.984266 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:09 crc kubenswrapper[4843]: I0318 12:13:09.983021 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:13:09 crc kubenswrapper[4843]: E0318 12:13:09.983146 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:13:10 crc kubenswrapper[4843]: I0318 12:13:10.983320 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:10 crc kubenswrapper[4843]: I0318 12:13:10.983509 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:10 crc kubenswrapper[4843]: I0318 12:13:10.983624 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:10 crc kubenswrapper[4843]: E0318 12:13:10.983628 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:10 crc kubenswrapper[4843]: E0318 12:13:10.983726 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:10 crc kubenswrapper[4843]: E0318 12:13:10.984043 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:10 crc kubenswrapper[4843]: I0318 12:13:10.984282 4843 scope.go:117] "RemoveContainer" containerID="b99a8424a67312606a51f09b6fe9197761584e8c306318cdb3f765a2aafc3223" Mar 18 12:13:11 crc kubenswrapper[4843]: I0318 12:13:11.669693 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ls8r8_1aa16ddb-306b-4e37-a33a-b9cdce3c254e/kube-multus/1.log" Mar 18 12:13:11 crc kubenswrapper[4843]: I0318 12:13:11.669793 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ls8r8" event={"ID":"1aa16ddb-306b-4e37-a33a-b9cdce3c254e","Type":"ContainerStarted","Data":"bd2f541f29cc54ab425776fa1e30b681a6fbe2865d995ebbfc75b4407a9e2df0"} Mar 18 12:13:11 crc kubenswrapper[4843]: I0318 12:13:11.982742 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:13:11 crc kubenswrapper[4843]: E0318 12:13:11.982901 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c" Mar 18 12:13:12 crc kubenswrapper[4843]: E0318 12:13:12.084923 4843 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:13:12 crc kubenswrapper[4843]: I0318 12:13:12.983325 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:12 crc kubenswrapper[4843]: I0318 12:13:12.983372 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:12 crc kubenswrapper[4843]: E0318 12:13:12.983465 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:12 crc kubenswrapper[4843]: I0318 12:13:12.983589 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:12 crc kubenswrapper[4843]: E0318 12:13:12.983823 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:12 crc kubenswrapper[4843]: E0318 12:13:12.984437 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:12 crc kubenswrapper[4843]: I0318 12:13:12.985121 4843 scope.go:117] "RemoveContainer" containerID="3bdbcb1a9e9191c9d4a7ac1e7144ee2cdd03aaa1197ee2b2842d23d84dcfaa77" Mar 18 12:13:13 crc kubenswrapper[4843]: I0318 12:13:13.677509 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovnkube-controller/3.log" Mar 18 12:13:13 crc kubenswrapper[4843]: I0318 12:13:13.680137 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerStarted","Data":"39bf6650b8bad5f54492fd2e7d7f7e4e2087d3fd5dec7dcae16607953e5712af"} Mar 18 12:13:13 crc kubenswrapper[4843]: I0318 12:13:13.680635 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:13:13 crc kubenswrapper[4843]: I0318 12:13:13.714221 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" podStartSLOduration=164.714201095 podStartE2EDuration="2m44.714201095s" podCreationTimestamp="2026-03-18 12:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:13.713799194 +0000 UTC m=+227.429624718" watchObservedRunningTime="2026-03-18 12:13:13.714201095 +0000 UTC m=+227.430026619" Mar 18 12:13:13 crc kubenswrapper[4843]: I0318 12:13:13.843143 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sn986"] Mar 18 12:13:13 crc kubenswrapper[4843]: I0318 12:13:13.843280 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986"
Mar 18 12:13:13 crc kubenswrapper[4843]: E0318 12:13:13.843394 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c"
Mar 18 12:13:14 crc kubenswrapper[4843]: I0318 12:13:14.983469 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 12:13:14 crc kubenswrapper[4843]: I0318 12:13:14.983466 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 12:13:14 crc kubenswrapper[4843]: E0318 12:13:14.983962 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 12:13:14 crc kubenswrapper[4843]: I0318 12:13:14.983528 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 12:13:14 crc kubenswrapper[4843]: E0318 12:13:14.984059 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 12:13:14 crc kubenswrapper[4843]: I0318 12:13:14.983509 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986"
Mar 18 12:13:14 crc kubenswrapper[4843]: E0318 12:13:14.983988 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 12:13:14 crc kubenswrapper[4843]: E0318 12:13:14.984141 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c"
Mar 18 12:13:16 crc kubenswrapper[4843]: I0318 12:13:16.982874 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 12:13:16 crc kubenswrapper[4843]: I0318 12:13:16.982904 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 12:13:16 crc kubenswrapper[4843]: I0318 12:13:16.982944 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986"
Mar 18 12:13:16 crc kubenswrapper[4843]: E0318 12:13:16.984012 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 12:13:16 crc kubenswrapper[4843]: I0318 12:13:16.984036 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 12:13:16 crc kubenswrapper[4843]: E0318 12:13:16.984111 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sn986" podUID="62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c"
Mar 18 12:13:16 crc kubenswrapper[4843]: E0318 12:13:16.984160 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 12:13:16 crc kubenswrapper[4843]: E0318 12:13:16.984207 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 12:13:18 crc kubenswrapper[4843]: I0318 12:13:18.982771 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 12:13:18 crc kubenswrapper[4843]: I0318 12:13:18.982834 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986"
Mar 18 12:13:18 crc kubenswrapper[4843]: I0318 12:13:18.982738 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 12:13:18 crc kubenswrapper[4843]: I0318 12:13:18.982798 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 12:13:18 crc kubenswrapper[4843]: I0318 12:13:18.985861 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 18 12:13:18 crc kubenswrapper[4843]: I0318 12:13:18.986834 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 18 12:13:18 crc kubenswrapper[4843]: I0318 12:13:18.987052 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 18 12:13:18 crc kubenswrapper[4843]: I0318 12:13:18.987188 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 18 12:13:18 crc kubenswrapper[4843]: I0318 12:13:18.987384 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 18 12:13:18 crc kubenswrapper[4843]: I0318 12:13:18.989442 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 18 12:13:19 crc kubenswrapper[4843]: I0318 12:13:19.247898 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6"
Mar 18 12:13:20 crc kubenswrapper[4843]: I0318 12:13:20.035257 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:13:20 crc kubenswrapper[4843]: I0318 12:13:20.036159 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.609433 4843 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.655206 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.655708 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.657610 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.658012 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.661097 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.661422 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.661513 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.664745 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.665174 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.665266 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.665191 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-ppqpl"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.665619 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ppqpl"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.667787 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k6c87"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.668333 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.668771 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hqt5c"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.669396 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hqt5c"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.669889 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rs252"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.670359 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-rs252"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.670986 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-g7fkz"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.671560 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.689125 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.689572 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.689626 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.689584 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.690353 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.690680 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.690810 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.691489 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-whd7l"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.691546 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.691619 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.690095 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.692088 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-whd7l"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.692215 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.692343 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.692486 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.692587 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.692619 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.694055 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.694163 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.694250 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.694340 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.694440 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.694556 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.694573 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.694768 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.694825 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.694770 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.694998 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.695151 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.695298 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.695438 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.695570 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.695692 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.695800 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.695921 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.696043 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.696144 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.697241 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.697629 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.697743 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.697755 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-d9qdf"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.697844 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.698206 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.698368 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-d9qdf"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.698710 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g7h64"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.699423 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g7h64"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.699466 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-zsx6v"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.699976 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zsx6v"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.700206 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv7v7"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.700750 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv7v7"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.701696 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-8lfcz"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.702305 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-8lfcz"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.702416 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kv474"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.702831 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.704371 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hj6cb"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.705021 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-hj6cb"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.705162 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hllcc"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.705851 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hllcc"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.720407 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7wm5b"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.723255 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.728345 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.736497 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.740620 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.740924 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.741361 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.742239 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.742285 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.742587 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.743229 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.743352 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.743457 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.743555 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.743692 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.743861 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.744283 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.744439 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.744464 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.744604 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.744784 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.744927 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.745068 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.745085 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.745171 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.745318 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.745750 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.745902 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.746030 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.746138 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.746239 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.746412 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.746515 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.749287 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.749476 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.749704 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.749825 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.749936 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.750049 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.750150 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.750258 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.750371 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.750578 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.746633 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.751483 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.751508 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zfh7x"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.752755 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.758374 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.758733 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jm8sq"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.758975 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.759155 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9g67x"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.759332 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zfh7x"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.759535 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-csz7z"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.759965 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.760221 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jm8sq"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.760491 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.760500 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-sxdqp"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.760570 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.760627 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-csz7z"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.761522 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fd878"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.761905 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bqfhf"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.762314 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bqfhf"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.762610 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sxdqp"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.762863 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fd878"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.771789 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-t28gc"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.772685 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t28gc"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.776579 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.777750 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.788292 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.788620 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.788860 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.788884 4843 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.789863 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.791791 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.796724 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.802837 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.803155 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.808167 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gf5bj"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.810589 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gf5bj" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.822541 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pmcbk"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.844449 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.844536 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.846157 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.846609 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/93fe4e7a-e8dc-4b64-9ff1-dc50ce085621-auth-proxy-config\") pod \"machine-approver-56656f9798-zsx6v\" (UID: \"93fe4e7a-e8dc-4b64-9ff1-dc50ce085621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zsx6v" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.846737 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93fe4e7a-e8dc-4b64-9ff1-dc50ce085621-config\") pod \"machine-approver-56656f9798-zsx6v\" (UID: \"93fe4e7a-e8dc-4b64-9ff1-dc50ce085621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zsx6v" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.846780 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.846806 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-audit-dir\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.846899 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/82377b06-486d-46d6-b28a-0df0d6c86531-encryption-config\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.846920 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af6b48ca-8e58-4194-aa93-59825a221fbc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-g7h64\" (UID: \"af6b48ca-8e58-4194-aa93-59825a221fbc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g7h64" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.846936 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgmjf\" (UniqueName: \"kubernetes.io/projected/7ab6a392-4347-4101-88e2-00ec7b9aecf5-kube-api-access-sgmjf\") pod \"machine-api-operator-5694c8668f-hj6cb\" (UID: \"7ab6a392-4347-4101-88e2-00ec7b9aecf5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hj6cb" Mar 18 
12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.846951 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-audit-policies\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.846977 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ab6a392-4347-4101-88e2-00ec7b9aecf5-config\") pod \"machine-api-operator-5694c8668f-hj6cb\" (UID: \"7ab6a392-4347-4101-88e2-00ec7b9aecf5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hj6cb" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.846993 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9gz2\" (UniqueName: \"kubernetes.io/projected/07a3ed5e-9b84-4a42-9f3f-272159105861-kube-api-access-b9gz2\") pod \"dns-operator-744455d44c-rs252\" (UID: \"07a3ed5e-9b84-4a42-9f3f-272159105861\") " pod="openshift-dns-operator/dns-operator-744455d44c-rs252" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847011 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02fe1da6-6389-4153-9405-8a7a5f49fbde-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kv474\" (UID: \"02fe1da6-6389-4153-9405-8a7a5f49fbde\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847031 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phv8d\" (UniqueName: 
\"kubernetes.io/projected/a4489a46-ae0b-4ac9-82c2-24fed2c70f7d-kube-api-access-phv8d\") pod \"etcd-operator-b45778765-7wm5b\" (UID: \"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847049 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82377b06-486d-46d6-b28a-0df0d6c86531-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847066 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/061ecf07-8167-4652-8182-5779e5502bbf-client-ca\") pod \"controller-manager-879f6c89f-g7fkz\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847083 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a0529e7-aab2-41b1-95e1-1cf8154430ca-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-whd7l\" (UID: \"9a0529e7-aab2-41b1-95e1-1cf8154430ca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-whd7l" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847100 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-serving-cert\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc 
kubenswrapper[4843]: I0318 12:13:23.847116 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847133 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/061ecf07-8167-4652-8182-5779e5502bbf-serving-cert\") pod \"controller-manager-879f6c89f-g7fkz\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847151 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02fe1da6-6389-4153-9405-8a7a5f49fbde-config\") pod \"authentication-operator-69f744f599-kv474\" (UID: \"02fe1da6-6389-4153-9405-8a7a5f49fbde\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847172 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j94s4\" (UniqueName: \"kubernetes.io/projected/68d7cbd5-5cc5-4648-b94f-f256e12ae7d3-kube-api-access-j94s4\") pod \"openshift-config-operator-7777fb866f-k6c87\" (UID: \"68d7cbd5-5cc5-4648-b94f-f256e12ae7d3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847190 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/82377b06-486d-46d6-b28a-0df0d6c86531-etcd-client\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847212 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ab6a392-4347-4101-88e2-00ec7b9aecf5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hj6cb\" (UID: \"7ab6a392-4347-4101-88e2-00ec7b9aecf5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hj6cb" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847262 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b999a5c0-f4e8-499b-8f81-283c3a2cf495-console-serving-cert\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847340 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kwkg\" (UniqueName: \"kubernetes.io/projected/93fe4e7a-e8dc-4b64-9ff1-dc50ce085621-kube-api-access-9kwkg\") pod \"machine-approver-56656f9798-zsx6v\" (UID: \"93fe4e7a-e8dc-4b64-9ff1-dc50ce085621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zsx6v" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847395 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847422 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d23cfc00-6762-41fb-bf10-e8aa0eda250b-audit-dir\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847448 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b999a5c0-f4e8-499b-8f81-283c3a2cf495-console-oauth-config\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847466 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02fe1da6-6389-4153-9405-8a7a5f49fbde-serving-cert\") pod \"authentication-operator-69f744f599-kv474\" (UID: \"02fe1da6-6389-4153-9405-8a7a5f49fbde\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847489 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4vbl\" (UniqueName: \"kubernetes.io/projected/4e348d26-340c-4888-9c04-5112f6d56b05-kube-api-access-q4vbl\") pod \"console-operator-58897d9998-d9qdf\" (UID: \"4e348d26-340c-4888-9c04-5112f6d56b05\") " pod="openshift-console-operator/console-operator-58897d9998-d9qdf" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847507 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-etcd-serving-ca\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847579 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847599 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/82377b06-486d-46d6-b28a-0df0d6c86531-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.847790 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.848234 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-spcss"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.849464 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dzg6s"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.849979 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dzg6s" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850046 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-service-ca\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850089 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850115 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4489a46-ae0b-4ac9-82c2-24fed2c70f7d-config\") pod \"etcd-operator-b45778765-7wm5b\" (UID: \"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850136 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a4489a46-ae0b-4ac9-82c2-24fed2c70f7d-etcd-client\") pod \"etcd-operator-b45778765-7wm5b\" (UID: \"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850161 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fjqr\" (UniqueName: 
\"kubernetes.io/projected/1da258ba-cd08-4cb9-90bb-18675d625fd1-kube-api-access-4fjqr\") pod \"route-controller-manager-6576b87f9c-7n8zj\" (UID: \"1da258ba-cd08-4cb9-90bb-18675d625fd1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850180 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850202 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68d7cbd5-5cc5-4648-b94f-f256e12ae7d3-serving-cert\") pod \"openshift-config-operator-7777fb866f-k6c87\" (UID: \"68d7cbd5-5cc5-4648-b94f-f256e12ae7d3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850226 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850250 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqbsb\" (UniqueName: \"kubernetes.io/projected/3f212ced-b4c9-45f4-9c40-2d03ce0bd4b2-kube-api-access-xqbsb\") pod 
\"cluster-samples-operator-665b6dd947-vv7v7\" (UID: \"3f212ced-b4c9-45f4-9c40-2d03ce0bd4b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv7v7" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850272 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94rd8\" (UniqueName: \"kubernetes.io/projected/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-kube-api-access-94rd8\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850297 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a4489a46-ae0b-4ac9-82c2-24fed2c70f7d-etcd-service-ca\") pod \"etcd-operator-b45778765-7wm5b\" (UID: \"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850320 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82377b06-486d-46d6-b28a-0df0d6c86531-serving-cert\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850341 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrl8n\" (UniqueName: \"kubernetes.io/projected/82377b06-486d-46d6-b28a-0df0d6c86531-kube-api-access-mrl8n\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850363 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1da258ba-cd08-4cb9-90bb-18675d625fd1-serving-cert\") pod \"route-controller-manager-6576b87f9c-7n8zj\" (UID: \"1da258ba-cd08-4cb9-90bb-18675d625fd1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850389 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af6b48ca-8e58-4194-aa93-59825a221fbc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-g7h64\" (UID: \"af6b48ca-8e58-4194-aa93-59825a221fbc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g7h64" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850413 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850442 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850484 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-audit\") pod 
\"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850514 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4489a46-ae0b-4ac9-82c2-24fed2c70f7d-serving-cert\") pod \"etcd-operator-b45778765-7wm5b\" (UID: \"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850629 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82377b06-486d-46d6-b28a-0df0d6c86531-audit-dir\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850741 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmfwp\" (UniqueName: \"kubernetes.io/projected/d23cfc00-6762-41fb-bf10-e8aa0eda250b-kube-api-access-hmfwp\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850769 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/68d7cbd5-5cc5-4648-b94f-f256e12ae7d3-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k6c87\" (UID: \"68d7cbd5-5cc5-4648-b94f-f256e12ae7d3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850797 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/061ecf07-8167-4652-8182-5779e5502bbf-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-g7fkz\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850820 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850824 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldc8t\" (UniqueName: \"kubernetes.io/projected/9a0529e7-aab2-41b1-95e1-1cf8154430ca-kube-api-access-ldc8t\") pod \"openshift-controller-manager-operator-756b6f6bc6-whd7l\" (UID: \"9a0529e7-aab2-41b1-95e1-1cf8154430ca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-whd7l" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850851 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82377b06-486d-46d6-b28a-0df0d6c86531-audit-policies\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850873 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850894 4843 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07a3ed5e-9b84-4a42-9f3f-272159105861-metrics-tls\") pod \"dns-operator-744455d44c-rs252\" (UID: \"07a3ed5e-9b84-4a42-9f3f-272159105861\") " pod="openshift-dns-operator/dns-operator-744455d44c-rs252" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.850914 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e348d26-340c-4888-9c04-5112f6d56b05-serving-cert\") pod \"console-operator-58897d9998-d9qdf\" (UID: \"4e348d26-340c-4888-9c04-5112f6d56b05\") " pod="openshift-console-operator/console-operator-58897d9998-d9qdf" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.851017 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-config\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.851113 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-pmcbk" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.851149 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-oauth-serving-cert\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.851334 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f212ced-b4c9-45f4-9c40-2d03ce0bd4b2-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vv7v7\" (UID: \"3f212ced-b4c9-45f4-9c40-2d03ce0bd4b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv7v7" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.851368 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a0529e7-aab2-41b1-95e1-1cf8154430ca-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-whd7l\" (UID: \"9a0529e7-aab2-41b1-95e1-1cf8154430ca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-whd7l" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.851388 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-encryption-config\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.851410 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spcss" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.851412 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1da258ba-cd08-4cb9-90bb-18675d625fd1-config\") pod \"route-controller-manager-6576b87f9c-7n8zj\" (UID: \"1da258ba-cd08-4cb9-90bb-18675d625fd1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.851668 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.851242 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4fqc5"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852031 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q7x5\" (UniqueName: \"kubernetes.io/projected/02fe1da6-6389-4153-9405-8a7a5f49fbde-kube-api-access-7q7x5\") pod \"authentication-operator-69f744f599-kv474\" (UID: \"02fe1da6-6389-4153-9405-8a7a5f49fbde\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852053 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e348d26-340c-4888-9c04-5112f6d56b05-trusted-ca\") pod \"console-operator-58897d9998-d9qdf\" (UID: 
\"4e348d26-340c-4888-9c04-5112f6d56b05\") " pod="openshift-console-operator/console-operator-58897d9998-d9qdf" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852071 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/93fe4e7a-e8dc-4b64-9ff1-dc50ce085621-machine-approver-tls\") pod \"machine-approver-56656f9798-zsx6v\" (UID: \"93fe4e7a-e8dc-4b64-9ff1-dc50ce085621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zsx6v" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852087 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-trusted-ca-bundle\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852102 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-image-import-ca\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852137 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7ab6a392-4347-4101-88e2-00ec7b9aecf5-images\") pod \"machine-api-operator-5694c8668f-hj6cb\" (UID: \"7ab6a392-4347-4101-88e2-00ec7b9aecf5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hj6cb" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852163 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852204 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1da258ba-cd08-4cb9-90bb-18675d625fd1-client-ca\") pod \"route-controller-manager-6576b87f9c-7n8zj\" (UID: \"1da258ba-cd08-4cb9-90bb-18675d625fd1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852225 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-console-config\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852247 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vfvl\" (UniqueName: \"kubernetes.io/projected/b999a5c0-f4e8-499b-8f81-283c3a2cf495-kube-api-access-9vfvl\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852268 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-node-pullsecrets\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 
12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852292 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02fe1da6-6389-4153-9405-8a7a5f49fbde-service-ca-bundle\") pod \"authentication-operator-69f744f599-kv474\" (UID: \"02fe1da6-6389-4153-9405-8a7a5f49fbde\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852314 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/061ecf07-8167-4652-8182-5779e5502bbf-config\") pod \"controller-manager-879f6c89f-g7fkz\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852333 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a4489a46-ae0b-4ac9-82c2-24fed2c70f7d-etcd-ca\") pod \"etcd-operator-b45778765-7wm5b\" (UID: \"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852350 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqq7l\" (UniqueName: \"kubernetes.io/projected/061ecf07-8167-4652-8182-5779e5502bbf-kube-api-access-vqq7l\") pod \"controller-manager-879f6c89f-g7fkz\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852367 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-etcd-client\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852381 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e348d26-340c-4888-9c04-5112f6d56b05-config\") pod \"console-operator-58897d9998-d9qdf\" (UID: \"4e348d26-340c-4888-9c04-5112f6d56b05\") " pod="openshift-console-operator/console-operator-58897d9998-d9qdf" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852399 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwjxm\" (UniqueName: \"kubernetes.io/projected/af6b48ca-8e58-4194-aa93-59825a221fbc-kube-api-access-vwjxm\") pod \"openshift-apiserver-operator-796bbdcf4f-g7h64\" (UID: \"af6b48ca-8e58-4194-aa93-59825a221fbc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g7h64" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852415 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9kcp\" (UniqueName: \"kubernetes.io/projected/dc56f3ce-caf3-4d53-9cf3-d909ec3edd16-kube-api-access-t9kcp\") pod \"downloads-7954f5f757-8lfcz\" (UID: \"dc56f3ce-caf3-4d53-9cf3-d909ec3edd16\") " pod="openshift-console/downloads-7954f5f757-8lfcz" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.852534 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.858734 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.859436 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563932-bzf8p"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.860771 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.863255 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563932-bzf8p" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.866354 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.867297 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.867367 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.868878 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.869745 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.870401 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-g7fkz"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.871805 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.873075 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-f6q6x"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.873334 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.875160 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.881375 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.885394 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-94pzm"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.886098 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k6c87"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.886148 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rs252"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.886162 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg"] Mar 18 12:13:23 crc 
kubenswrapper[4843]: I0318 12:13:23.886209 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-f6q6x" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.886268 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hqt5c"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.886290 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-94pzm" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.886312 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4xm45"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.890086 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zfh7x"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.890127 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hj6cb"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.890219 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4xm45" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.892515 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9g67x"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.894604 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bqfhf"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.896017 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-t28gc"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.898601 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.901511 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7wm5b"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.903567 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-whd7l"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.908130 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g7h64"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.908163 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-d9qdf"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.909446 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.912315 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-kv474"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.914020 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8lfcz"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.915703 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-sxdqp"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.917556 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.920986 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hllcc"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.925060 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.928247 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ppqpl"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.933279 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-m7wsv"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.936712 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-gscm9"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.938232 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m7wsv" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.940475 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.940928 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gf5bj"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.942797 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-gscm9" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.944946 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv7v7"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.946257 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-spcss"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.947792 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.949245 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.950855 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pmcbk"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.952179 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-f6q6x"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954050 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/061ecf07-8167-4652-8182-5779e5502bbf-serving-cert\") pod \"controller-manager-879f6c89f-g7fkz\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954086 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02fe1da6-6389-4153-9405-8a7a5f49fbde-config\") pod \"authentication-operator-69f744f599-kv474\" (UID: \"02fe1da6-6389-4153-9405-8a7a5f49fbde\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954113 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j94s4\" (UniqueName: \"kubernetes.io/projected/68d7cbd5-5cc5-4648-b94f-f256e12ae7d3-kube-api-access-j94s4\") pod \"openshift-config-operator-7777fb866f-k6c87\" (UID: \"68d7cbd5-5cc5-4648-b94f-f256e12ae7d3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954135 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82377b06-486d-46d6-b28a-0df0d6c86531-etcd-client\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954159 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 
12:13:23.954183 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ab6a392-4347-4101-88e2-00ec7b9aecf5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hj6cb\" (UID: \"7ab6a392-4347-4101-88e2-00ec7b9aecf5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hj6cb" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954208 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b999a5c0-f4e8-499b-8f81-283c3a2cf495-console-serving-cert\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954230 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kwkg\" (UniqueName: \"kubernetes.io/projected/93fe4e7a-e8dc-4b64-9ff1-dc50ce085621-kube-api-access-9kwkg\") pod \"machine-approver-56656f9798-zsx6v\" (UID: \"93fe4e7a-e8dc-4b64-9ff1-dc50ce085621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zsx6v" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954261 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954281 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02fe1da6-6389-4153-9405-8a7a5f49fbde-serving-cert\") pod \"authentication-operator-69f744f599-kv474\" (UID: 
\"02fe1da6-6389-4153-9405-8a7a5f49fbde\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954302 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4vbl\" (UniqueName: \"kubernetes.io/projected/4e348d26-340c-4888-9c04-5112f6d56b05-kube-api-access-q4vbl\") pod \"console-operator-58897d9998-d9qdf\" (UID: \"4e348d26-340c-4888-9c04-5112f6d56b05\") " pod="openshift-console-operator/console-operator-58897d9998-d9qdf" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954323 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d23cfc00-6762-41fb-bf10-e8aa0eda250b-audit-dir\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954344 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b999a5c0-f4e8-499b-8f81-283c3a2cf495-console-oauth-config\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954366 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-etcd-serving-ca\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954386 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954404 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/82377b06-486d-46d6-b28a-0df0d6c86531-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954428 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-service-ca\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954455 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2781cd8-ccc6-4c7e-8c88-11788a29888d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zfh7x\" (UID: \"c2781cd8-ccc6-4c7e-8c88-11788a29888d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zfh7x" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954483 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a4489a46-ae0b-4ac9-82c2-24fed2c70f7d-etcd-client\") pod \"etcd-operator-b45778765-7wm5b\" (UID: \"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954509 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954532 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4489a46-ae0b-4ac9-82c2-24fed2c70f7d-config\") pod \"etcd-operator-b45778765-7wm5b\" (UID: \"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954555 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954582 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68d7cbd5-5cc5-4648-b94f-f256e12ae7d3-serving-cert\") pod \"openshift-config-operator-7777fb866f-k6c87\" (UID: \"68d7cbd5-5cc5-4648-b94f-f256e12ae7d3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954606 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fjqr\" (UniqueName: \"kubernetes.io/projected/1da258ba-cd08-4cb9-90bb-18675d625fd1-kube-api-access-4fjqr\") pod \"route-controller-manager-6576b87f9c-7n8zj\" (UID: \"1da258ba-cd08-4cb9-90bb-18675d625fd1\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954629 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94rd8\" (UniqueName: \"kubernetes.io/projected/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-kube-api-access-94rd8\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954687 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a4489a46-ae0b-4ac9-82c2-24fed2c70f7d-etcd-service-ca\") pod \"etcd-operator-b45778765-7wm5b\" (UID: \"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954708 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82377b06-486d-46d6-b28a-0df0d6c86531-serving-cert\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954731 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrl8n\" (UniqueName: \"kubernetes.io/projected/82377b06-486d-46d6-b28a-0df0d6c86531-kube-api-access-mrl8n\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954796 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954821 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqbsb\" (UniqueName: \"kubernetes.io/projected/3f212ced-b4c9-45f4-9c40-2d03ce0bd4b2-kube-api-access-xqbsb\") pod \"cluster-samples-operator-665b6dd947-vv7v7\" (UID: \"3f212ced-b4c9-45f4-9c40-2d03ce0bd4b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv7v7" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954854 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-audit\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954876 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4489a46-ae0b-4ac9-82c2-24fed2c70f7d-serving-cert\") pod \"etcd-operator-b45778765-7wm5b\" (UID: \"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954900 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82377b06-486d-46d6-b28a-0df0d6c86531-audit-dir\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954921 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1da258ba-cd08-4cb9-90bb-18675d625fd1-serving-cert\") pod \"route-controller-manager-6576b87f9c-7n8zj\" (UID: \"1da258ba-cd08-4cb9-90bb-18675d625fd1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954943 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af6b48ca-8e58-4194-aa93-59825a221fbc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-g7h64\" (UID: \"af6b48ca-8e58-4194-aa93-59825a221fbc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g7h64" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954962 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954982 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.955028 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmfwp\" (UniqueName: \"kubernetes.io/projected/d23cfc00-6762-41fb-bf10-e8aa0eda250b-kube-api-access-hmfwp\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: 
\"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.955049 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/68d7cbd5-5cc5-4648-b94f-f256e12ae7d3-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k6c87\" (UID: \"68d7cbd5-5cc5-4648-b94f-f256e12ae7d3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.955089 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldc8t\" (UniqueName: \"kubernetes.io/projected/9a0529e7-aab2-41b1-95e1-1cf8154430ca-kube-api-access-ldc8t\") pod \"openshift-controller-manager-operator-756b6f6bc6-whd7l\" (UID: \"9a0529e7-aab2-41b1-95e1-1cf8154430ca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-whd7l" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.955109 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82377b06-486d-46d6-b28a-0df0d6c86531-audit-policies\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.955129 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/061ecf07-8167-4652-8182-5779e5502bbf-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-g7fkz\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.955820 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/02fe1da6-6389-4153-9405-8a7a5f49fbde-config\") pod \"authentication-operator-69f744f599-kv474\" (UID: \"02fe1da6-6389-4153-9405-8a7a5f49fbde\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.955869 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d23cfc00-6762-41fb-bf10-e8aa0eda250b-audit-dir\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956173 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e348d26-340c-4888-9c04-5112f6d56b05-serving-cert\") pod \"console-operator-58897d9998-d9qdf\" (UID: \"4e348d26-340c-4888-9c04-5112f6d56b05\") " pod="openshift-console-operator/console-operator-58897d9998-d9qdf" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956198 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956214 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07a3ed5e-9b84-4a42-9f3f-272159105861-metrics-tls\") pod \"dns-operator-744455d44c-rs252\" (UID: \"07a3ed5e-9b84-4a42-9f3f-272159105861\") " pod="openshift-dns-operator/dns-operator-744455d44c-rs252" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956255 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-config\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956274 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-oauth-serving-cert\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956289 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f212ced-b4c9-45f4-9c40-2d03ce0bd4b2-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vv7v7\" (UID: \"3f212ced-b4c9-45f4-9c40-2d03ce0bd4b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv7v7" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956317 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-encryption-config\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956338 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2781cd8-ccc6-4c7e-8c88-11788a29888d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zfh7x\" (UID: \"c2781cd8-ccc6-4c7e-8c88-11788a29888d\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zfh7x" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956360 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a0529e7-aab2-41b1-95e1-1cf8154430ca-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-whd7l\" (UID: \"9a0529e7-aab2-41b1-95e1-1cf8154430ca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-whd7l" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956382 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1da258ba-cd08-4cb9-90bb-18675d625fd1-config\") pod \"route-controller-manager-6576b87f9c-7n8zj\" (UID: \"1da258ba-cd08-4cb9-90bb-18675d625fd1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956405 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956426 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q7x5\" (UniqueName: \"kubernetes.io/projected/02fe1da6-6389-4153-9405-8a7a5f49fbde-kube-api-access-7q7x5\") pod \"authentication-operator-69f744f599-kv474\" (UID: \"02fe1da6-6389-4153-9405-8a7a5f49fbde\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956445 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e348d26-340c-4888-9c04-5112f6d56b05-trusted-ca\") pod \"console-operator-58897d9998-d9qdf\" (UID: \"4e348d26-340c-4888-9c04-5112f6d56b05\") " pod="openshift-console-operator/console-operator-58897d9998-d9qdf" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956467 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/93fe4e7a-e8dc-4b64-9ff1-dc50ce085621-machine-approver-tls\") pod \"machine-approver-56656f9798-zsx6v\" (UID: \"93fe4e7a-e8dc-4b64-9ff1-dc50ce085621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zsx6v" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956487 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-trusted-ca-bundle\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956506 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-image-import-ca\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956536 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7ab6a392-4347-4101-88e2-00ec7b9aecf5-images\") pod \"machine-api-operator-5694c8668f-hj6cb\" (UID: \"7ab6a392-4347-4101-88e2-00ec7b9aecf5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hj6cb" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 
12:13:23.956578 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956601 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-node-pullsecrets\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956621 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02fe1da6-6389-4153-9405-8a7a5f49fbde-service-ca-bundle\") pod \"authentication-operator-69f744f599-kv474\" (UID: \"02fe1da6-6389-4153-9405-8a7a5f49fbde\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956637 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1da258ba-cd08-4cb9-90bb-18675d625fd1-client-ca\") pod \"route-controller-manager-6576b87f9c-7n8zj\" (UID: \"1da258ba-cd08-4cb9-90bb-18675d625fd1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956673 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-console-config\") pod \"console-f9d7485db-ppqpl\" (UID: 
\"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956696 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vfvl\" (UniqueName: \"kubernetes.io/projected/b999a5c0-f4e8-499b-8f81-283c3a2cf495-kube-api-access-9vfvl\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956729 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/061ecf07-8167-4652-8182-5779e5502bbf-config\") pod \"controller-manager-879f6c89f-g7fkz\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956753 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a4489a46-ae0b-4ac9-82c2-24fed2c70f7d-etcd-ca\") pod \"etcd-operator-b45778765-7wm5b\" (UID: \"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956781 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-etcd-client\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956855 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e348d26-340c-4888-9c04-5112f6d56b05-config\") pod \"console-operator-58897d9998-d9qdf\" (UID: 
\"4e348d26-340c-4888-9c04-5112f6d56b05\") " pod="openshift-console-operator/console-operator-58897d9998-d9qdf" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956882 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqq7l\" (UniqueName: \"kubernetes.io/projected/061ecf07-8167-4652-8182-5779e5502bbf-kube-api-access-vqq7l\") pod \"controller-manager-879f6c89f-g7fkz\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956912 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6zp8\" (UniqueName: \"kubernetes.io/projected/c2781cd8-ccc6-4c7e-8c88-11788a29888d-kube-api-access-s6zp8\") pod \"kube-storage-version-migrator-operator-b67b599dd-zfh7x\" (UID: \"c2781cd8-ccc6-4c7e-8c88-11788a29888d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zfh7x" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956940 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwjxm\" (UniqueName: \"kubernetes.io/projected/af6b48ca-8e58-4194-aa93-59825a221fbc-kube-api-access-vwjxm\") pod \"openshift-apiserver-operator-796bbdcf4f-g7h64\" (UID: \"af6b48ca-8e58-4194-aa93-59825a221fbc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g7h64" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956963 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9kcp\" (UniqueName: \"kubernetes.io/projected/dc56f3ce-caf3-4d53-9cf3-d909ec3edd16-kube-api-access-t9kcp\") pod \"downloads-7954f5f757-8lfcz\" (UID: \"dc56f3ce-caf3-4d53-9cf3-d909ec3edd16\") " pod="openshift-console/downloads-7954f5f757-8lfcz" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.956988 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93fe4e7a-e8dc-4b64-9ff1-dc50ce085621-config\") pod \"machine-approver-56656f9798-zsx6v\" (UID: \"93fe4e7a-e8dc-4b64-9ff1-dc50ce085621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zsx6v" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.957010 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.957036 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-audit-dir\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.957059 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/93fe4e7a-e8dc-4b64-9ff1-dc50ce085621-auth-proxy-config\") pod \"machine-approver-56656f9798-zsx6v\" (UID: \"93fe4e7a-e8dc-4b64-9ff1-dc50ce085621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zsx6v" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.957082 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/82377b06-486d-46d6-b28a-0df0d6c86531-encryption-config\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.957111 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-audit-policies\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.957167 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af6b48ca-8e58-4194-aa93-59825a221fbc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-g7h64\" (UID: \"af6b48ca-8e58-4194-aa93-59825a221fbc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g7h64" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.957265 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgmjf\" (UniqueName: \"kubernetes.io/projected/7ab6a392-4347-4101-88e2-00ec7b9aecf5-kube-api-access-sgmjf\") pod \"machine-api-operator-5694c8668f-hj6cb\" (UID: \"7ab6a392-4347-4101-88e2-00ec7b9aecf5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hj6cb" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.957364 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9gz2\" (UniqueName: \"kubernetes.io/projected/07a3ed5e-9b84-4a42-9f3f-272159105861-kube-api-access-b9gz2\") pod \"dns-operator-744455d44c-rs252\" (UID: \"07a3ed5e-9b84-4a42-9f3f-272159105861\") " pod="openshift-dns-operator/dns-operator-744455d44c-rs252" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.957455 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/02fe1da6-6389-4153-9405-8a7a5f49fbde-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kv474\" (UID: \"02fe1da6-6389-4153-9405-8a7a5f49fbde\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.957514 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phv8d\" (UniqueName: \"kubernetes.io/projected/a4489a46-ae0b-4ac9-82c2-24fed2c70f7d-kube-api-access-phv8d\") pod \"etcd-operator-b45778765-7wm5b\" (UID: \"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.957576 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ab6a392-4347-4101-88e2-00ec7b9aecf5-config\") pod \"machine-api-operator-5694c8668f-hj6cb\" (UID: \"7ab6a392-4347-4101-88e2-00ec7b9aecf5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hj6cb" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.957604 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82377b06-486d-46d6-b28a-0df0d6c86531-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.957668 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/061ecf07-8167-4652-8182-5779e5502bbf-client-ca\") pod \"controller-manager-879f6c89f-g7fkz\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.957707 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a0529e7-aab2-41b1-95e1-1cf8154430ca-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-whd7l\" (UID: \"9a0529e7-aab2-41b1-95e1-1cf8154430ca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-whd7l" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.957760 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-serving-cert\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.958853 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.954183 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fd878"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.959291 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-94pzm"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.959310 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jm8sq"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.959322 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dzg6s"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.959331 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4fqc5"] Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.959681 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93fe4e7a-e8dc-4b64-9ff1-dc50ce085621-config\") pod \"machine-approver-56656f9798-zsx6v\" (UID: \"93fe4e7a-e8dc-4b64-9ff1-dc50ce085621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zsx6v" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.959814 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7ab6a392-4347-4101-88e2-00ec7b9aecf5-images\") pod \"machine-api-operator-5694c8668f-hj6cb\" (UID: \"7ab6a392-4347-4101-88e2-00ec7b9aecf5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hj6cb" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.960616 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82377b06-486d-46d6-b28a-0df0d6c86531-audit-policies\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.960604 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a4489a46-ae0b-4ac9-82c2-24fed2c70f7d-etcd-ca\") pod \"etcd-operator-b45778765-7wm5b\" (UID: \"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b" Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.960989 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-audit-dir\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.961525 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-node-pullsecrets\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.961615 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-console-config\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.961931 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/93fe4e7a-e8dc-4b64-9ff1-dc50ce085621-auth-proxy-config\") pod \"machine-approver-56656f9798-zsx6v\" (UID: \"93fe4e7a-e8dc-4b64-9ff1-dc50ce085621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zsx6v"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.962170 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02fe1da6-6389-4153-9405-8a7a5f49fbde-service-ca-bundle\") pod \"authentication-operator-69f744f599-kv474\" (UID: \"02fe1da6-6389-4153-9405-8a7a5f49fbde\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.962675 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-audit-policies\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.963056 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1da258ba-cd08-4cb9-90bb-18675d625fd1-client-ca\") pod \"route-controller-manager-6576b87f9c-7n8zj\" (UID: \"1da258ba-cd08-4cb9-90bb-18675d625fd1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.963249 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/061ecf07-8167-4652-8182-5779e5502bbf-config\") pod \"controller-manager-879f6c89f-g7fkz\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.963285 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a4489a46-ae0b-4ac9-82c2-24fed2c70f7d-etcd-service-ca\") pod \"etcd-operator-b45778765-7wm5b\" (UID: \"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.963992 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/061ecf07-8167-4652-8182-5779e5502bbf-serving-cert\") pod \"controller-manager-879f6c89f-g7fkz\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.964263 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.965193 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.965267 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-service-ca\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.965821 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-etcd-serving-ca\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.966350 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.966637 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e348d26-340c-4888-9c04-5112f6d56b05-config\") pod \"console-operator-58897d9998-d9qdf\" (UID: \"4e348d26-340c-4888-9c04-5112f6d56b05\") " pod="openshift-console-operator/console-operator-58897d9998-d9qdf"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.967265 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/82377b06-486d-46d6-b28a-0df0d6c86531-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.967805 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a4489a46-ae0b-4ac9-82c2-24fed2c70f7d-etcd-client\") pod \"etcd-operator-b45778765-7wm5b\" (UID: \"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.968252 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b999a5c0-f4e8-499b-8f81-283c3a2cf495-console-serving-cert\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.968321 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563932-bzf8p"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.968350 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.969129 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af6b48ca-8e58-4194-aa93-59825a221fbc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-g7h64\" (UID: \"af6b48ca-8e58-4194-aa93-59825a221fbc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g7h64"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.969197 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.969470 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a0529e7-aab2-41b1-95e1-1cf8154430ca-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-whd7l\" (UID: \"9a0529e7-aab2-41b1-95e1-1cf8154430ca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-whd7l"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.970059 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-audit\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.971617 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.971645 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.971674 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m7wsv"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.972497 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/82377b06-486d-46d6-b28a-0df0d6c86531-encryption-config\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.973087 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ab6a392-4347-4101-88e2-00ec7b9aecf5-config\") pod \"machine-api-operator-5694c8668f-hj6cb\" (UID: \"7ab6a392-4347-4101-88e2-00ec7b9aecf5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hj6cb"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.973417 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82377b06-486d-46d6-b28a-0df0d6c86531-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.973787 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.974043 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/061ecf07-8167-4652-8182-5779e5502bbf-client-ca\") pod \"controller-manager-879f6c89f-g7fkz\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.974600 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.974667 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/68d7cbd5-5cc5-4648-b94f-f256e12ae7d3-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k6c87\" (UID: \"68d7cbd5-5cc5-4648-b94f-f256e12ae7d3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.974746 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07a3ed5e-9b84-4a42-9f3f-272159105861-metrics-tls\") pod \"dns-operator-744455d44c-rs252\" (UID: \"07a3ed5e-9b84-4a42-9f3f-272159105861\") " pod="openshift-dns-operator/dns-operator-744455d44c-rs252"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.975099 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.975143 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82377b06-486d-46d6-b28a-0df0d6c86531-serving-cert\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.975197 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82377b06-486d-46d6-b28a-0df0d6c86531-audit-dir\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.975492 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-config\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.975510 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-etcd-client\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.975591 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ab6a392-4347-4101-88e2-00ec7b9aecf5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hj6cb\" (UID: \"7ab6a392-4347-4101-88e2-00ec7b9aecf5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hj6cb"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.976799 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-serving-cert\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.977203 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82377b06-486d-46d6-b28a-0df0d6c86531-etcd-client\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.977816 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.980843 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.981728 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1da258ba-cd08-4cb9-90bb-18675d625fd1-config\") pod \"route-controller-manager-6576b87f9c-7n8zj\" (UID: \"1da258ba-cd08-4cb9-90bb-18675d625fd1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.983995 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-image-import-ca\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.995467 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02fe1da6-6389-4153-9405-8a7a5f49fbde-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kv474\" (UID: \"02fe1da6-6389-4153-9405-8a7a5f49fbde\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.997439 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx"]
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.998463 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02fe1da6-6389-4153-9405-8a7a5f49fbde-serving-cert\") pod \"authentication-operator-69f744f599-kv474\" (UID: \"02fe1da6-6389-4153-9405-8a7a5f49fbde\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.998506 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a0529e7-aab2-41b1-95e1-1cf8154430ca-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-whd7l\" (UID: \"9a0529e7-aab2-41b1-95e1-1cf8154430ca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-whd7l"
Mar 18 12:13:23 crc kubenswrapper[4843]: I0318 12:13:23.999560 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1da258ba-cd08-4cb9-90bb-18675d625fd1-serving-cert\") pod \"route-controller-manager-6576b87f9c-7n8zj\" (UID: \"1da258ba-cd08-4cb9-90bb-18675d625fd1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.001384 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-encryption-config\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.001584 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af6b48ca-8e58-4194-aa93-59825a221fbc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-g7h64\" (UID: \"af6b48ca-8e58-4194-aa93-59825a221fbc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g7h64"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.005092 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e348d26-340c-4888-9c04-5112f6d56b05-serving-cert\") pod \"console-operator-58897d9998-d9qdf\" (UID: \"4e348d26-340c-4888-9c04-5112f6d56b05\") " pod="openshift-console-operator/console-operator-58897d9998-d9qdf"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.055783 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/93fe4e7a-e8dc-4b64-9ff1-dc50ce085621-machine-approver-tls\") pod \"machine-approver-56656f9798-zsx6v\" (UID: \"93fe4e7a-e8dc-4b64-9ff1-dc50ce085621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zsx6v"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.058280 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4489a46-ae0b-4ac9-82c2-24fed2c70f7d-serving-cert\") pod \"etcd-operator-b45778765-7wm5b\" (UID: \"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.058933 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6zp8\" (UniqueName: \"kubernetes.io/projected/c2781cd8-ccc6-4c7e-8c88-11788a29888d-kube-api-access-s6zp8\") pod \"kube-storage-version-migrator-operator-b67b599dd-zfh7x\" (UID: \"c2781cd8-ccc6-4c7e-8c88-11788a29888d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zfh7x"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.059099 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2781cd8-ccc6-4c7e-8c88-11788a29888d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zfh7x\" (UID: \"c2781cd8-ccc6-4c7e-8c88-11788a29888d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zfh7x"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.059218 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2781cd8-ccc6-4c7e-8c88-11788a29888d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zfh7x\" (UID: \"c2781cd8-ccc6-4c7e-8c88-11788a29888d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zfh7x"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.076428 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4489a46-ae0b-4ac9-82c2-24fed2c70f7d-config\") pod \"etcd-operator-b45778765-7wm5b\" (UID: \"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.077043 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.077231 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.077254 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.077550 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/061ecf07-8167-4652-8182-5779e5502bbf-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-g7fkz\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.080000 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-trusted-ca-bundle\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.080126 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4xm45"]
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.080445 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2781cd8-ccc6-4c7e-8c88-11788a29888d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zfh7x\" (UID: \"c2781cd8-ccc6-4c7e-8c88-11788a29888d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zfh7x"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.081581 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj"]
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.081829 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.081840 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.081978 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.082147 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b999a5c0-f4e8-499b-8f81-283c3a2cf495-console-oauth-config\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.082158 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f212ced-b4c9-45f4-9c40-2d03ce0bd4b2-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vv7v7\" (UID: \"3f212ced-b4c9-45f4-9c40-2d03ce0bd4b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv7v7"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.083057 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8hpvl"]
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.083727 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-oauth-serving-cert\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.085050 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2781cd8-ccc6-4c7e-8c88-11788a29888d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zfh7x\" (UID: \"c2781cd8-ccc6-4c7e-8c88-11788a29888d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zfh7x"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.085108 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8hpvl"]
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.085188 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8hpvl"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.085782 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e348d26-340c-4888-9c04-5112f6d56b05-trusted-ca\") pod \"console-operator-58897d9998-d9qdf\" (UID: \"4e348d26-340c-4888-9c04-5112f6d56b05\") " pod="openshift-console-operator/console-operator-58897d9998-d9qdf"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.086354 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68d7cbd5-5cc5-4648-b94f-f256e12ae7d3-serving-cert\") pod \"openshift-config-operator-7777fb866f-k6c87\" (UID: \"68d7cbd5-5cc5-4648-b94f-f256e12ae7d3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.098229 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.119257 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.138544 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.159464 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.185299 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.198104 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.218229 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.237790 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.259304 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.293128 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.300026 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.319604 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.339471 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.359639 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.378935 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.398880 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.419550 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.438890 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.458846 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.479149 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.497748 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.518920 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.539332 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.559351 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.578460 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.597907 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.619287 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.638607 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.658961 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.679606 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.699715 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.719757 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.738552 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.759907 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.778606 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.799122 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.816077 4843 request.go:700] Waited for 1.004988464s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dcontrol-plane-machine-set-operator-dockercfg-k9rxt&limit=500&resourceVersion=0
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.818156 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.859027 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.879462 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.899732 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.918579 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.939028 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.959510 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 18 12:13:24 crc kubenswrapper[4843]: I0318 12:13:24.980974 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:24.999681 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 18 12:13:25
crc kubenswrapper[4843]: I0318 12:13:25.018557 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.046442 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.058427 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.078479 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.098495 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.119403 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.139279 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.159933 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.179866 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.199620 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.219869 4843 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.238975 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.259383 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.279621 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.299025 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.320043 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.340210 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.359031 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.379230 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.399168 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.419041 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 12:13:25 crc 
kubenswrapper[4843]: I0318 12:13:25.438946 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.459578 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.479413 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.499821 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.518435 4843 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.539214 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.564224 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.579289 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.599991 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.619207 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.638208 4843 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.659015 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.681395 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.719248 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j94s4\" (UniqueName: \"kubernetes.io/projected/68d7cbd5-5cc5-4648-b94f-f256e12ae7d3-kube-api-access-j94s4\") pod \"openshift-config-operator-7777fb866f-k6c87\" (UID: \"68d7cbd5-5cc5-4648-b94f-f256e12ae7d3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.736098 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fjqr\" (UniqueName: \"kubernetes.io/projected/1da258ba-cd08-4cb9-90bb-18675d625fd1-kube-api-access-4fjqr\") pod \"route-controller-manager-6576b87f9c-7n8zj\" (UID: \"1da258ba-cd08-4cb9-90bb-18675d625fd1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.756016 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94rd8\" (UniqueName: \"kubernetes.io/projected/e8bd6721-4cfb-4143-94f5-ad1f9fb985fb-kube-api-access-94rd8\") pod \"apiserver-76f77b778f-hqt5c\" (UID: \"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb\") " pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.774974 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrl8n\" (UniqueName: 
\"kubernetes.io/projected/82377b06-486d-46d6-b28a-0df0d6c86531-kube-api-access-mrl8n\") pod \"apiserver-7bbb656c7d-thtkg\" (UID: \"82377b06-486d-46d6-b28a-0df0d6c86531\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.779965 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.792617 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.795615 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vfvl\" (UniqueName: \"kubernetes.io/projected/b999a5c0-f4e8-499b-8f81-283c3a2cf495-kube-api-access-9vfvl\") pod \"console-f9d7485db-ppqpl\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.804672 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.816555 4843 request.go:700] Waited for 1.852091519s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.821141 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kwkg\" (UniqueName: \"kubernetes.io/projected/93fe4e7a-e8dc-4b64-9ff1-dc50ce085621-kube-api-access-9kwkg\") pod \"machine-approver-56656f9798-zsx6v\" (UID: \"93fe4e7a-e8dc-4b64-9ff1-dc50ce085621\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zsx6v" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.835275 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqq7l\" (UniqueName: \"kubernetes.io/projected/061ecf07-8167-4652-8182-5779e5502bbf-kube-api-access-vqq7l\") pod \"controller-manager-879f6c89f-g7fkz\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.849563 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.866958 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwjxm\" (UniqueName: \"kubernetes.io/projected/af6b48ca-8e58-4194-aa93-59825a221fbc-kube-api-access-vwjxm\") pod \"openshift-apiserver-operator-796bbdcf4f-g7h64\" (UID: \"af6b48ca-8e58-4194-aa93-59825a221fbc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g7h64" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.875724 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.904508 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.924855 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4vbl\" (UniqueName: \"kubernetes.io/projected/4e348d26-340c-4888-9c04-5112f6d56b05-kube-api-access-q4vbl\") pod \"console-operator-58897d9998-d9qdf\" (UID: \"4e348d26-340c-4888-9c04-5112f6d56b05\") " pod="openshift-console-operator/console-operator-58897d9998-d9qdf" Mar 18 12:13:25 crc kubenswrapper[4843]: I0318 12:13:25.935056 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9kcp\" (UniqueName: \"kubernetes.io/projected/dc56f3ce-caf3-4d53-9cf3-d909ec3edd16-kube-api-access-t9kcp\") pod \"downloads-7954f5f757-8lfcz\" (UID: \"dc56f3ce-caf3-4d53-9cf3-d909ec3edd16\") " pod="openshift-console/downloads-7954f5f757-8lfcz" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.096726 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-8lfcz" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.096791 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zsx6v" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.097138 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g7h64" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.097920 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-d9qdf" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.122304 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.123577 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.123729 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.139891 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmfwp\" (UniqueName: \"kubernetes.io/projected/d23cfc00-6762-41fb-bf10-e8aa0eda250b-kube-api-access-hmfwp\") pod \"oauth-openshift-558db77b4-hllcc\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") " pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.143110 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqbsb\" (UniqueName: \"kubernetes.io/projected/3f212ced-b4c9-45f4-9c40-2d03ce0bd4b2-kube-api-access-xqbsb\") pod \"cluster-samples-operator-665b6dd947-vv7v7\" (UID: \"3f212ced-b4c9-45f4-9c40-2d03ce0bd4b2\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv7v7" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.153204 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q7x5\" (UniqueName: \"kubernetes.io/projected/02fe1da6-6389-4153-9405-8a7a5f49fbde-kube-api-access-7q7x5\") pod \"authentication-operator-69f744f599-kv474\" (UID: \"02fe1da6-6389-4153-9405-8a7a5f49fbde\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.154458 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgmjf\" (UniqueName: \"kubernetes.io/projected/7ab6a392-4347-4101-88e2-00ec7b9aecf5-kube-api-access-sgmjf\") pod \"machine-api-operator-5694c8668f-hj6cb\" (UID: \"7ab6a392-4347-4101-88e2-00ec7b9aecf5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hj6cb" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.157808 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9gz2\" (UniqueName: \"kubernetes.io/projected/07a3ed5e-9b84-4a42-9f3f-272159105861-kube-api-access-b9gz2\") pod \"dns-operator-744455d44c-rs252\" (UID: \"07a3ed5e-9b84-4a42-9f3f-272159105861\") " pod="openshift-dns-operator/dns-operator-744455d44c-rs252" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.159537 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldc8t\" (UniqueName: \"kubernetes.io/projected/9a0529e7-aab2-41b1-95e1-1cf8154430ca-kube-api-access-ldc8t\") pod \"openshift-controller-manager-operator-756b6f6bc6-whd7l\" (UID: \"9a0529e7-aab2-41b1-95e1-1cf8154430ca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-whd7l" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.160708 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-rs252" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.162853 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6zp8\" (UniqueName: \"kubernetes.io/projected/c2781cd8-ccc6-4c7e-8c88-11788a29888d-kube-api-access-s6zp8\") pod \"kube-storage-version-migrator-operator-b67b599dd-zfh7x\" (UID: \"c2781cd8-ccc6-4c7e-8c88-11788a29888d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zfh7x" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.187261 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zfh7x" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.192612 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phv8d\" (UniqueName: \"kubernetes.io/projected/a4489a46-ae0b-4ac9-82c2-24fed2c70f7d-kube-api-access-phv8d\") pod \"etcd-operator-b45778765-7wm5b\" (UID: \"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.201384 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bacfe411-3a90-4d97-80d5-c39a102cdd9b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fd878\" (UID: \"bacfe411-3a90-4d97-80d5-c39a102cdd9b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fd878" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.201442 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/22656451-217a-4227-becf-75f7ae30423b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gf5bj\" (UID: \"22656451-217a-4227-becf-75f7ae30423b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gf5bj" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.201478 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8dec33ef-2627-4878-b7b6-f980772125b8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-g5z5h\" (UID: \"8dec33ef-2627-4878-b7b6-f980772125b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.201525 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8dec33ef-2627-4878-b7b6-f980772125b8-trusted-ca\") pod \"ingress-operator-5b745b69d9-g5z5h\" (UID: \"8dec33ef-2627-4878-b7b6-f980772125b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.201547 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6093d2c4-d78c-4522-a95a-224064548148-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jm8sq\" (UID: \"6093d2c4-d78c-4522-a95a-224064548148\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jm8sq" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.201607 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76plq\" (UniqueName: \"kubernetes.io/projected/8dec33ef-2627-4878-b7b6-f980772125b8-kube-api-access-76plq\") pod \"ingress-operator-5b745b69d9-g5z5h\" (UID: \"8dec33ef-2627-4878-b7b6-f980772125b8\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.201684 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sgmt\" (UniqueName: \"kubernetes.io/projected/5ead3e48-f240-4392-951f-0faca5ec0a8c-kube-api-access-5sgmt\") pod \"multus-admission-controller-857f4d67dd-pmcbk\" (UID: \"5ead3e48-f240-4392-951f-0faca5ec0a8c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pmcbk" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.201712 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq94x\" (UniqueName: \"kubernetes.io/projected/22656451-217a-4227-becf-75f7ae30423b-kube-api-access-rq94x\") pod \"control-plane-machine-set-operator-78cbb6b69f-gf5bj\" (UID: \"22656451-217a-4227-becf-75f7ae30423b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gf5bj" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.201762 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3fa3986-70fd-4d58-a04f-ddeec535f493-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-sxdqp\" (UID: \"e3fa3986-70fd-4d58-a04f-ddeec535f493\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sxdqp" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.201796 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d8e49ec-9849-46fe-9a4b-6bbec11b2736-metrics-certs\") pod \"router-default-5444994796-csz7z\" (UID: \"9d8e49ec-9849-46fe-9a4b-6bbec11b2736\") " pod="openshift-ingress/router-default-5444994796-csz7z" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.201843 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.201889 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d8e49ec-9849-46fe-9a4b-6bbec11b2736-service-ca-bundle\") pod \"router-default-5444994796-csz7z\" (UID: \"9d8e49ec-9849-46fe-9a4b-6bbec11b2736\") " pod="openshift-ingress/router-default-5444994796-csz7z" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.201922 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/13295157-4a57-4e7f-9ff4-13c2a4381c27-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.201944 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff7eee41-1777-4084-ab10-412fe150b5c9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bqfhf\" (UID: \"ff7eee41-1777-4084-ab10-412fe150b5c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bqfhf" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.201978 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f14cbab0-e920-47ad-9a17-7d5699e98eee-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-5fzqw\" (UID: \"f14cbab0-e920-47ad-9a17-7d5699e98eee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.202004 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9vmk\" (UniqueName: \"kubernetes.io/projected/e3fa3986-70fd-4d58-a04f-ddeec535f493-kube-api-access-s9vmk\") pod \"machine-config-controller-84d6567774-sxdqp\" (UID: \"e3fa3986-70fd-4d58-a04f-ddeec535f493\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sxdqp" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.202043 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9d8e49ec-9849-46fe-9a4b-6bbec11b2736-stats-auth\") pod \"router-default-5444994796-csz7z\" (UID: \"9d8e49ec-9849-46fe-9a4b-6bbec11b2736\") " pod="openshift-ingress/router-default-5444994796-csz7z" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.202112 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ead3e48-f240-4392-951f-0faca5ec0a8c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pmcbk\" (UID: \"5ead3e48-f240-4392-951f-0faca5ec0a8c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pmcbk" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.202144 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f14cbab0-e920-47ad-9a17-7d5699e98eee-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5fzqw\" (UID: \"f14cbab0-e920-47ad-9a17-7d5699e98eee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw" Mar 18 12:13:26 crc 
kubenswrapper[4843]: I0318 12:13:26.202165 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9d8e49ec-9849-46fe-9a4b-6bbec11b2736-default-certificate\") pod \"router-default-5444994796-csz7z\" (UID: \"9d8e49ec-9849-46fe-9a4b-6bbec11b2736\") " pod="openshift-ingress/router-default-5444994796-csz7z" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.202215 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/13295157-4a57-4e7f-9ff4-13c2a4381c27-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.202259 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxbf7\" (UniqueName: \"kubernetes.io/projected/13295157-4a57-4e7f-9ff4-13c2a4381c27-kube-api-access-mxbf7\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.202286 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff7eee41-1777-4084-ab10-412fe150b5c9-config\") pod \"kube-controller-manager-operator-78b949d7b-bqfhf\" (UID: \"ff7eee41-1777-4084-ab10-412fe150b5c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bqfhf" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.202308 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/e3fa3986-70fd-4d58-a04f-ddeec535f493-proxy-tls\") pod \"machine-config-controller-84d6567774-sxdqp\" (UID: \"e3fa3986-70fd-4d58-a04f-ddeec535f493\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sxdqp"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.202332 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/13295157-4a57-4e7f-9ff4-13c2a4381c27-bound-sa-token\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.202387 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv9qp\" (UniqueName: \"kubernetes.io/projected/59c312b9-d2a3-4404-9ec7-af59c0faf02a-kube-api-access-pv9qp\") pod \"migrator-59844c95c7-t28gc\" (UID: \"59c312b9-d2a3-4404-9ec7-af59c0faf02a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t28gc"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.202675 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bacfe411-3a90-4d97-80d5-c39a102cdd9b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fd878\" (UID: \"bacfe411-3a90-4d97-80d5-c39a102cdd9b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fd878"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.202756 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff7eee41-1777-4084-ab10-412fe150b5c9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bqfhf\" (UID: \"ff7eee41-1777-4084-ab10-412fe150b5c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bqfhf"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.202849 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bacfe411-3a90-4d97-80d5-c39a102cdd9b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fd878\" (UID: \"bacfe411-3a90-4d97-80d5-c39a102cdd9b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fd878"
Mar 18 12:13:26 crc kubenswrapper[4843]: E0318 12:13:26.202985 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:26.702961936 +0000 UTC m=+240.418787660 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.203670 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f14cbab0-e920-47ad-9a17-7d5699e98eee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5fzqw\" (UID: \"f14cbab0-e920-47ad-9a17-7d5699e98eee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.203848 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8dec33ef-2627-4878-b7b6-f980772125b8-metrics-tls\") pod \"ingress-operator-5b745b69d9-g5z5h\" (UID: \"8dec33ef-2627-4878-b7b6-f980772125b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.204047 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69sm7\" (UniqueName: \"kubernetes.io/projected/9d8e49ec-9849-46fe-9a4b-6bbec11b2736-kube-api-access-69sm7\") pod \"router-default-5444994796-csz7z\" (UID: \"9d8e49ec-9849-46fe-9a4b-6bbec11b2736\") " pod="openshift-ingress/router-default-5444994796-csz7z"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.204095 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/13295157-4a57-4e7f-9ff4-13c2a4381c27-registry-certificates\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.204135 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13295157-4a57-4e7f-9ff4-13c2a4381c27-trusted-ca\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.204309 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwnqt\" (UniqueName: \"kubernetes.io/projected/f14cbab0-e920-47ad-9a17-7d5699e98eee-kube-api-access-mwnqt\") pod \"cluster-image-registry-operator-dc59b4c8b-5fzqw\" (UID: \"f14cbab0-e920-47ad-9a17-7d5699e98eee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.204361 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6093d2c4-d78c-4522-a95a-224064548148-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jm8sq\" (UID: \"6093d2c4-d78c-4522-a95a-224064548148\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jm8sq"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.204399 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6093d2c4-d78c-4522-a95a-224064548148-config\") pod \"kube-apiserver-operator-766d6c64bb-jm8sq\" (UID: \"6093d2c4-d78c-4522-a95a-224064548148\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jm8sq"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.204489 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/13295157-4a57-4e7f-9ff4-13c2a4381c27-registry-tls\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.246662 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-whd7l"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.306177 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv7v7"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.306620 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.306833 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/63edbbe4-08b8-4ea9-987c-83ebca62fc5e-images\") pod \"machine-config-operator-74547568cd-spcss\" (UID: \"63edbbe4-08b8-4ea9-987c-83ebca62fc5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spcss"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.306862 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b1442992-fe43-43be-a43b-48f80db66418-secret-volume\") pod \"collect-profiles-29563920-nnwnx\" (UID: \"b1442992-fe43-43be-a43b-48f80db66418\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.306893 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sgmt\" (UniqueName: \"kubernetes.io/projected/5ead3e48-f240-4392-951f-0faca5ec0a8c-kube-api-access-5sgmt\") pod \"multus-admission-controller-857f4d67dd-pmcbk\" (UID: \"5ead3e48-f240-4392-951f-0faca5ec0a8c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pmcbk"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.306921 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq94x\" (UniqueName: \"kubernetes.io/projected/22656451-217a-4227-becf-75f7ae30423b-kube-api-access-rq94x\") pod \"control-plane-machine-set-operator-78cbb6b69f-gf5bj\" (UID: \"22656451-217a-4227-becf-75f7ae30423b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gf5bj"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.306947 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3fa3986-70fd-4d58-a04f-ddeec535f493-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-sxdqp\" (UID: \"e3fa3986-70fd-4d58-a04f-ddeec535f493\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sxdqp"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307025 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d8e49ec-9849-46fe-9a4b-6bbec11b2736-metrics-certs\") pod \"router-default-5444994796-csz7z\" (UID: \"9d8e49ec-9849-46fe-9a4b-6bbec11b2736\") " pod="openshift-ingress/router-default-5444994796-csz7z"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307083 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k95tl\" (UniqueName: \"kubernetes.io/projected/772ec8a8-abe6-43b5-9239-2331a455b602-kube-api-access-k95tl\") pod \"ingress-canary-m7wsv\" (UID: \"772ec8a8-abe6-43b5-9239-2331a455b602\") " pod="openshift-ingress-canary/ingress-canary-m7wsv"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307136 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/63edbbe4-08b8-4ea9-987c-83ebca62fc5e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-spcss\" (UID: \"63edbbe4-08b8-4ea9-987c-83ebca62fc5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spcss"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307169 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d8e49ec-9849-46fe-9a4b-6bbec11b2736-service-ca-bundle\") pod \"router-default-5444994796-csz7z\" (UID: \"9d8e49ec-9849-46fe-9a4b-6bbec11b2736\") " pod="openshift-ingress/router-default-5444994796-csz7z"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307194 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4cf9261b-564c-45e7-a789-0d31a034ad44-srv-cert\") pod \"olm-operator-6b444d44fb-xp4kv\" (UID: \"4cf9261b-564c-45e7-a789-0d31a034ad44\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307232 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/016cbd62-23a6-413f-82b5-b806746e2b01-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4fqc5\" (UID: \"016cbd62-23a6-413f-82b5-b806746e2b01\") " pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307256 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/92c985ef-3317-4220-819d-f482f8b50d60-tmpfs\") pod \"packageserver-d55dfcdfc-kg7xw\" (UID: \"92c985ef-3317-4220-819d-f482f8b50d60\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307284 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcvrk\" (UniqueName: \"kubernetes.io/projected/cd0cd970-7aa7-4c43-a967-513cd197dc38-kube-api-access-rcvrk\") pod \"service-ca-9c57cc56f-f6q6x\" (UID: \"cd0cd970-7aa7-4c43-a967-513cd197dc38\") " pod="openshift-service-ca/service-ca-9c57cc56f-f6q6x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307325 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/13295157-4a57-4e7f-9ff4-13c2a4381c27-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307358 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff7eee41-1777-4084-ab10-412fe150b5c9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bqfhf\" (UID: \"ff7eee41-1777-4084-ab10-412fe150b5c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bqfhf"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307403 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f73981c7-0a7b-4a4d-87f8-9a73e4d794d6-plugins-dir\") pod \"csi-hostpathplugin-4xm45\" (UID: \"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6\") " pod="hostpath-provisioner/csi-hostpathplugin-4xm45"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307424 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8-certs\") pod \"machine-config-server-gscm9\" (UID: \"b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8\") " pod="openshift-machine-config-operator/machine-config-server-gscm9"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307452 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f14cbab0-e920-47ad-9a17-7d5699e98eee-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5fzqw\" (UID: \"f14cbab0-e920-47ad-9a17-7d5699e98eee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307475 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9vmk\" (UniqueName: \"kubernetes.io/projected/e3fa3986-70fd-4d58-a04f-ddeec535f493-kube-api-access-s9vmk\") pod \"machine-config-controller-84d6567774-sxdqp\" (UID: \"e3fa3986-70fd-4d58-a04f-ddeec535f493\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sxdqp"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307499 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-898g8\" (UniqueName: \"kubernetes.io/projected/b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8-kube-api-access-898g8\") pod \"machine-config-server-gscm9\" (UID: \"b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8\") " pod="openshift-machine-config-operator/machine-config-server-gscm9"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307535 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4955\" (UniqueName: \"kubernetes.io/projected/9a1ef083-bcc3-48fc-9d9d-41008d4673b0-kube-api-access-c4955\") pod \"catalog-operator-68c6474976-brsfz\" (UID: \"9a1ef083-bcc3-48fc-9d9d-41008d4673b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307559 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9d8e49ec-9849-46fe-9a4b-6bbec11b2736-stats-auth\") pod \"router-default-5444994796-csz7z\" (UID: \"9d8e49ec-9849-46fe-9a4b-6bbec11b2736\") " pod="openshift-ingress/router-default-5444994796-csz7z"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307576 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcqm7\" (UniqueName: \"kubernetes.io/projected/803876de-64f6-4347-8ea5-6d2d8f87e828-kube-api-access-tcqm7\") pod \"auto-csr-approver-29563932-bzf8p\" (UID: \"803876de-64f6-4347-8ea5-6d2d8f87e828\") " pod="openshift-infra/auto-csr-approver-29563932-bzf8p"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307597 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c4c78e8-738c-4105-8b56-9f8f900b496e-config-volume\") pod \"dns-default-8hpvl\" (UID: \"5c4c78e8-738c-4105-8b56-9f8f900b496e\") " pod="openshift-dns/dns-default-8hpvl"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307621 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1442992-fe43-43be-a43b-48f80db66418-config-volume\") pod \"collect-profiles-29563920-nnwnx\" (UID: \"b1442992-fe43-43be-a43b-48f80db66418\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307640 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ead3e48-f240-4392-951f-0faca5ec0a8c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pmcbk\" (UID: \"5ead3e48-f240-4392-951f-0faca5ec0a8c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pmcbk"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307677 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f14cbab0-e920-47ad-9a17-7d5699e98eee-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5fzqw\" (UID: \"f14cbab0-e920-47ad-9a17-7d5699e98eee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307727 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9d8e49ec-9849-46fe-9a4b-6bbec11b2736-default-certificate\") pod \"router-default-5444994796-csz7z\" (UID: \"9d8e49ec-9849-46fe-9a4b-6bbec11b2736\") " pod="openshift-ingress/router-default-5444994796-csz7z"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307756 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe3bf0b1-4566-4a39-8629-d4d268bd5977-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dzg6s\" (UID: \"fe3bf0b1-4566-4a39-8629-d4d268bd5977\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dzg6s"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307773 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f73981c7-0a7b-4a4d-87f8-9a73e4d794d6-socket-dir\") pod \"csi-hostpathplugin-4xm45\" (UID: \"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6\") " pod="hostpath-provisioner/csi-hostpathplugin-4xm45"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307794 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/13295157-4a57-4e7f-9ff4-13c2a4381c27-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307815 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/016cbd62-23a6-413f-82b5-b806746e2b01-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4fqc5\" (UID: \"016cbd62-23a6-413f-82b5-b806746e2b01\") " pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307857 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjsml\" (UniqueName: \"kubernetes.io/projected/5c4c78e8-738c-4105-8b56-9f8f900b496e-kube-api-access-bjsml\") pod \"dns-default-8hpvl\" (UID: \"5c4c78e8-738c-4105-8b56-9f8f900b496e\") " pod="openshift-dns/dns-default-8hpvl"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307876 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8-node-bootstrap-token\") pod \"machine-config-server-gscm9\" (UID: \"b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8\") " pod="openshift-machine-config-operator/machine-config-server-gscm9"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307904 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mxhd\" (UniqueName: \"kubernetes.io/projected/fe3bf0b1-4566-4a39-8629-d4d268bd5977-kube-api-access-6mxhd\") pod \"package-server-manager-789f6589d5-dzg6s\" (UID: \"fe3bf0b1-4566-4a39-8629-d4d268bd5977\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dzg6s"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307936 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxbf7\" (UniqueName: \"kubernetes.io/projected/13295157-4a57-4e7f-9ff4-13c2a4381c27-kube-api-access-mxbf7\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307956 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9a1ef083-bcc3-48fc-9d9d-41008d4673b0-srv-cert\") pod \"catalog-operator-68c6474976-brsfz\" (UID: \"9a1ef083-bcc3-48fc-9d9d-41008d4673b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307974 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpbw9\" (UniqueName: \"kubernetes.io/projected/f73981c7-0a7b-4a4d-87f8-9a73e4d794d6-kube-api-access-wpbw9\") pod \"csi-hostpathplugin-4xm45\" (UID: \"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6\") " pod="hostpath-provisioner/csi-hostpathplugin-4xm45"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308040 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff7eee41-1777-4084-ab10-412fe150b5c9-config\") pod \"kube-controller-manager-operator-78b949d7b-bqfhf\" (UID: \"ff7eee41-1777-4084-ab10-412fe150b5c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bqfhf"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308066 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3fa3986-70fd-4d58-a04f-ddeec535f493-proxy-tls\") pod \"machine-config-controller-84d6567774-sxdqp\" (UID: \"e3fa3986-70fd-4d58-a04f-ddeec535f493\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sxdqp"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308105 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/92c985ef-3317-4220-819d-f482f8b50d60-webhook-cert\") pod \"packageserver-d55dfcdfc-kg7xw\" (UID: \"92c985ef-3317-4220-819d-f482f8b50d60\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308142 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/13295157-4a57-4e7f-9ff4-13c2a4381c27-bound-sa-token\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308170 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv9qp\" (UniqueName: \"kubernetes.io/projected/59c312b9-d2a3-4404-9ec7-af59c0faf02a-kube-api-access-pv9qp\") pod \"migrator-59844c95c7-t28gc\" (UID: \"59c312b9-d2a3-4404-9ec7-af59c0faf02a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t28gc"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308193 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a181e9f-1fda-43f1-a42e-c64602afbcd2-config\") pod \"service-ca-operator-777779d784-94pzm\" (UID: \"2a181e9f-1fda-43f1-a42e-c64602afbcd2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-94pzm"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308211 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f73981c7-0a7b-4a4d-87f8-9a73e4d794d6-csi-data-dir\") pod \"csi-hostpathplugin-4xm45\" (UID: \"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6\") " pod="hostpath-provisioner/csi-hostpathplugin-4xm45"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308247 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c4c78e8-738c-4105-8b56-9f8f900b496e-metrics-tls\") pod \"dns-default-8hpvl\" (UID: \"5c4c78e8-738c-4105-8b56-9f8f900b496e\") " pod="openshift-dns/dns-default-8hpvl"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308274 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bacfe411-3a90-4d97-80d5-c39a102cdd9b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fd878\" (UID: \"bacfe411-3a90-4d97-80d5-c39a102cdd9b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fd878"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308298 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cd0cd970-7aa7-4c43-a967-513cd197dc38-signing-cabundle\") pod \"service-ca-9c57cc56f-f6q6x\" (UID: \"cd0cd970-7aa7-4c43-a967-513cd197dc38\") " pod="openshift-service-ca/service-ca-9c57cc56f-f6q6x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308317 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63edbbe4-08b8-4ea9-987c-83ebca62fc5e-proxy-tls\") pod \"machine-config-operator-74547568cd-spcss\" (UID: \"63edbbe4-08b8-4ea9-987c-83ebca62fc5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spcss"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308337 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cd0cd970-7aa7-4c43-a967-513cd197dc38-signing-key\") pod \"service-ca-9c57cc56f-f6q6x\" (UID: \"cd0cd970-7aa7-4c43-a967-513cd197dc38\") " pod="openshift-service-ca/service-ca-9c57cc56f-f6q6x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308369 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7fpd\" (UniqueName: \"kubernetes.io/projected/2a181e9f-1fda-43f1-a42e-c64602afbcd2-kube-api-access-v7fpd\") pod \"service-ca-operator-777779d784-94pzm\" (UID: \"2a181e9f-1fda-43f1-a42e-c64602afbcd2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-94pzm"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308406 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff7eee41-1777-4084-ab10-412fe150b5c9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bqfhf\" (UID: \"ff7eee41-1777-4084-ab10-412fe150b5c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bqfhf"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308454 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bacfe411-3a90-4d97-80d5-c39a102cdd9b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fd878\" (UID: \"bacfe411-3a90-4d97-80d5-c39a102cdd9b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fd878"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308486 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f14cbab0-e920-47ad-9a17-7d5699e98eee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5fzqw\" (UID: \"f14cbab0-e920-47ad-9a17-7d5699e98eee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308515 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xnkn\" (UniqueName: \"kubernetes.io/projected/b1442992-fe43-43be-a43b-48f80db66418-kube-api-access-5xnkn\") pod \"collect-profiles-29563920-nnwnx\" (UID: \"b1442992-fe43-43be-a43b-48f80db66418\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308540 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8dec33ef-2627-4878-b7b6-f980772125b8-metrics-tls\") pod \"ingress-operator-5b745b69d9-g5z5h\" (UID: \"8dec33ef-2627-4878-b7b6-f980772125b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308563 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f73981c7-0a7b-4a4d-87f8-9a73e4d794d6-registration-dir\") pod \"csi-hostpathplugin-4xm45\" (UID: \"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6\") " pod="hostpath-provisioner/csi-hostpathplugin-4xm45"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308606 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69sm7\" (UniqueName: \"kubernetes.io/projected/9d8e49ec-9849-46fe-9a4b-6bbec11b2736-kube-api-access-69sm7\") pod \"router-default-5444994796-csz7z\" (UID: \"9d8e49ec-9849-46fe-9a4b-6bbec11b2736\") " pod="openshift-ingress/router-default-5444994796-csz7z"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308629 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnfbt\" (UniqueName: \"kubernetes.io/projected/63edbbe4-08b8-4ea9-987c-83ebca62fc5e-kube-api-access-vnfbt\") pod \"machine-config-operator-74547568cd-spcss\" (UID: \"63edbbe4-08b8-4ea9-987c-83ebca62fc5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spcss"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308671 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/13295157-4a57-4e7f-9ff4-13c2a4381c27-registry-certificates\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308694 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13295157-4a57-4e7f-9ff4-13c2a4381c27-trusted-ca\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308715 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdhqh\" (UniqueName: \"kubernetes.io/projected/92c985ef-3317-4220-819d-f482f8b50d60-kube-api-access-cdhqh\") pod \"packageserver-d55dfcdfc-kg7xw\" (UID: \"92c985ef-3317-4220-819d-f482f8b50d60\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308737 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/772ec8a8-abe6-43b5-9239-2331a455b602-cert\") pod \"ingress-canary-m7wsv\" (UID: \"772ec8a8-abe6-43b5-9239-2331a455b602\") " pod="openshift-ingress-canary/ingress-canary-m7wsv"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308756 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9a1ef083-bcc3-48fc-9d9d-41008d4673b0-profile-collector-cert\") pod \"catalog-operator-68c6474976-brsfz\" (UID: \"9a1ef083-bcc3-48fc-9d9d-41008d4673b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308778 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkzfk\" (UniqueName: \"kubernetes.io/projected/4cf9261b-564c-45e7-a789-0d31a034ad44-kube-api-access-jkzfk\") pod \"olm-operator-6b444d44fb-xp4kv\" (UID: \"4cf9261b-564c-45e7-a789-0d31a034ad44\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308818 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv422\" (UniqueName: \"kubernetes.io/projected/016cbd62-23a6-413f-82b5-b806746e2b01-kube-api-access-zv422\") pod \"marketplace-operator-79b997595-4fqc5\" (UID: \"016cbd62-23a6-413f-82b5-b806746e2b01\") " pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308845 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6093d2c4-d78c-4522-a95a-224064548148-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jm8sq\" (UID: \"6093d2c4-d78c-4522-a95a-224064548148\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jm8sq"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308867 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/92c985ef-3317-4220-819d-f482f8b50d60-apiservice-cert\") pod \"packageserver-d55dfcdfc-kg7xw\" (UID: \"92c985ef-3317-4220-819d-f482f8b50d60\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308930 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwnqt\" (UniqueName: \"kubernetes.io/projected/f14cbab0-e920-47ad-9a17-7d5699e98eee-kube-api-access-mwnqt\") pod \"cluster-image-registry-operator-dc59b4c8b-5fzqw\" (UID: \"f14cbab0-e920-47ad-9a17-7d5699e98eee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.308968 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6093d2c4-d78c-4522-a95a-224064548148-config\") pod \"kube-apiserver-operator-766d6c64bb-jm8sq\" (UID: \"6093d2c4-d78c-4522-a95a-224064548148\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jm8sq"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.309010 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/13295157-4a57-4e7f-9ff4-13c2a4381c27-registry-tls\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.309069 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/22656451-217a-4227-becf-75f7ae30423b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gf5bj\" (UID: \"22656451-217a-4227-becf-75f7ae30423b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gf5bj"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.309102 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4cf9261b-564c-45e7-a789-0d31a034ad44-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xp4kv\" (UID: \"4cf9261b-564c-45e7-a789-0d31a034ad44\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.309130 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bacfe411-3a90-4d97-80d5-c39a102cdd9b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fd878\" (UID: \"bacfe411-3a90-4d97-80d5-c39a102cdd9b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fd878"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.309173 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8dec33ef-2627-4878-b7b6-f980772125b8-trusted-ca\") pod \"ingress-operator-5b745b69d9-g5z5h\" (UID: \"8dec33ef-2627-4878-b7b6-f980772125b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.309206 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8dec33ef-2627-4878-b7b6-f980772125b8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-g5z5h\" (UID: \"8dec33ef-2627-4878-b7b6-f980772125b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.309236 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName:
\"kubernetes.io/secret/2a181e9f-1fda-43f1-a42e-c64602afbcd2-serving-cert\") pod \"service-ca-operator-777779d784-94pzm\" (UID: \"2a181e9f-1fda-43f1-a42e-c64602afbcd2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-94pzm" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.309278 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6093d2c4-d78c-4522-a95a-224064548148-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jm8sq\" (UID: \"6093d2c4-d78c-4522-a95a-224064548148\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jm8sq" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.309310 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76plq\" (UniqueName: \"kubernetes.io/projected/8dec33ef-2627-4878-b7b6-f980772125b8-kube-api-access-76plq\") pod \"ingress-operator-5b745b69d9-g5z5h\" (UID: \"8dec33ef-2627-4878-b7b6-f980772125b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.309353 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f73981c7-0a7b-4a4d-87f8-9a73e4d794d6-mountpoint-dir\") pod \"csi-hostpathplugin-4xm45\" (UID: \"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6\") " pod="hostpath-provisioner/csi-hostpathplugin-4xm45" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.309712 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6093d2c4-d78c-4522-a95a-224064548148-config\") pod \"kube-apiserver-operator-766d6c64bb-jm8sq\" (UID: \"6093d2c4-d78c-4522-a95a-224064548148\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jm8sq" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.314765 4843 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9d8e49ec-9849-46fe-9a4b-6bbec11b2736-stats-auth\") pod \"router-default-5444994796-csz7z\" (UID: \"9d8e49ec-9849-46fe-9a4b-6bbec11b2736\") " pod="openshift-ingress/router-default-5444994796-csz7z" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.314949 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.315963 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/13295157-4a57-4e7f-9ff4-13c2a4381c27-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.316161 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9d8e49ec-9849-46fe-9a4b-6bbec11b2736-default-certificate\") pod \"router-default-5444994796-csz7z\" (UID: \"9d8e49ec-9849-46fe-9a4b-6bbec11b2736\") " pod="openshift-ingress/router-default-5444994796-csz7z" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.307829 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3fa3986-70fd-4d58-a04f-ddeec535f493-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-sxdqp\" (UID: \"e3fa3986-70fd-4d58-a04f-ddeec535f493\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sxdqp" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.317207 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ff7eee41-1777-4084-ab10-412fe150b5c9-config\") pod \"kube-controller-manager-operator-78b949d7b-bqfhf\" (UID: \"ff7eee41-1777-4084-ab10-412fe150b5c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bqfhf" Mar 18 12:13:26 crc kubenswrapper[4843]: E0318 12:13:26.318478 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:26.818456534 +0000 UTC m=+240.534282058 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.319931 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8dec33ef-2627-4878-b7b6-f980772125b8-trusted-ca\") pod \"ingress-operator-5b745b69d9-g5z5h\" (UID: \"8dec33ef-2627-4878-b7b6-f980772125b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.320413 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/13295157-4a57-4e7f-9ff4-13c2a4381c27-registry-certificates\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 
12:13:26.320552 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d8e49ec-9849-46fe-9a4b-6bbec11b2736-service-ca-bundle\") pod \"router-default-5444994796-csz7z\" (UID: \"9d8e49ec-9849-46fe-9a4b-6bbec11b2736\") " pod="openshift-ingress/router-default-5444994796-csz7z" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.322357 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8dec33ef-2627-4878-b7b6-f980772125b8-metrics-tls\") pod \"ingress-operator-5b745b69d9-g5z5h\" (UID: \"8dec33ef-2627-4878-b7b6-f980772125b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.323218 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bacfe411-3a90-4d97-80d5-c39a102cdd9b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fd878\" (UID: \"bacfe411-3a90-4d97-80d5-c39a102cdd9b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fd878" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.329958 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6093d2c4-d78c-4522-a95a-224064548148-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jm8sq\" (UID: \"6093d2c4-d78c-4522-a95a-224064548148\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jm8sq" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.335263 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f14cbab0-e920-47ad-9a17-7d5699e98eee-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5fzqw\" (UID: \"f14cbab0-e920-47ad-9a17-7d5699e98eee\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.335855 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-hj6cb" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.336540 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sgmt\" (UniqueName: \"kubernetes.io/projected/5ead3e48-f240-4392-951f-0faca5ec0a8c-kube-api-access-5sgmt\") pod \"multus-admission-controller-857f4d67dd-pmcbk\" (UID: \"5ead3e48-f240-4392-951f-0faca5ec0a8c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pmcbk" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.336756 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff7eee41-1777-4084-ab10-412fe150b5c9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bqfhf\" (UID: \"ff7eee41-1777-4084-ab10-412fe150b5c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bqfhf" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.337005 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/13295157-4a57-4e7f-9ff4-13c2a4381c27-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.337040 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/13295157-4a57-4e7f-9ff4-13c2a4381c27-registry-tls\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:26 
crc kubenswrapper[4843]: I0318 12:13:26.337081 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d8e49ec-9849-46fe-9a4b-6bbec11b2736-metrics-certs\") pod \"router-default-5444994796-csz7z\" (UID: \"9d8e49ec-9849-46fe-9a4b-6bbec11b2736\") " pod="openshift-ingress/router-default-5444994796-csz7z" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.337083 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3fa3986-70fd-4d58-a04f-ddeec535f493-proxy-tls\") pod \"machine-config-controller-84d6567774-sxdqp\" (UID: \"e3fa3986-70fd-4d58-a04f-ddeec535f493\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sxdqp" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.337637 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13295157-4a57-4e7f-9ff4-13c2a4381c27-trusted-ca\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.339905 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bacfe411-3a90-4d97-80d5-c39a102cdd9b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fd878\" (UID: \"bacfe411-3a90-4d97-80d5-c39a102cdd9b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fd878" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.339978 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq94x\" (UniqueName: \"kubernetes.io/projected/22656451-217a-4227-becf-75f7ae30423b-kube-api-access-rq94x\") pod \"control-plane-machine-set-operator-78cbb6b69f-gf5bj\" (UID: 
\"22656451-217a-4227-becf-75f7ae30423b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gf5bj" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.340108 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.356190 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.358365 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f14cbab0-e920-47ad-9a17-7d5699e98eee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5fzqw\" (UID: \"f14cbab0-e920-47ad-9a17-7d5699e98eee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.358408 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ead3e48-f240-4392-951f-0faca5ec0a8c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pmcbk\" (UID: \"5ead3e48-f240-4392-951f-0faca5ec0a8c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pmcbk" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.365846 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxbf7\" (UniqueName: \"kubernetes.io/projected/13295157-4a57-4e7f-9ff4-13c2a4381c27-kube-api-access-mxbf7\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.369632 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/22656451-217a-4227-becf-75f7ae30423b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gf5bj\" (UID: \"22656451-217a-4227-becf-75f7ae30423b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gf5bj" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.386207 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76plq\" (UniqueName: \"kubernetes.io/projected/8dec33ef-2627-4878-b7b6-f980772125b8-kube-api-access-76plq\") pod \"ingress-operator-5b745b69d9-g5z5h\" (UID: \"8dec33ef-2627-4878-b7b6-f980772125b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.417546 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff7eee41-1777-4084-ab10-412fe150b5c9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bqfhf\" (UID: \"ff7eee41-1777-4084-ab10-412fe150b5c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bqfhf" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418386 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xnkn\" (UniqueName: \"kubernetes.io/projected/b1442992-fe43-43be-a43b-48f80db66418-kube-api-access-5xnkn\") pod \"collect-profiles-29563920-nnwnx\" (UID: \"b1442992-fe43-43be-a43b-48f80db66418\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418409 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f73981c7-0a7b-4a4d-87f8-9a73e4d794d6-registration-dir\") pod \"csi-hostpathplugin-4xm45\" (UID: \"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6\") " 
pod="hostpath-provisioner/csi-hostpathplugin-4xm45" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418431 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnfbt\" (UniqueName: \"kubernetes.io/projected/63edbbe4-08b8-4ea9-987c-83ebca62fc5e-kube-api-access-vnfbt\") pod \"machine-config-operator-74547568cd-spcss\" (UID: \"63edbbe4-08b8-4ea9-987c-83ebca62fc5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spcss" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418448 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdhqh\" (UniqueName: \"kubernetes.io/projected/92c985ef-3317-4220-819d-f482f8b50d60-kube-api-access-cdhqh\") pod \"packageserver-d55dfcdfc-kg7xw\" (UID: \"92c985ef-3317-4220-819d-f482f8b50d60\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418464 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv422\" (UniqueName: \"kubernetes.io/projected/016cbd62-23a6-413f-82b5-b806746e2b01-kube-api-access-zv422\") pod \"marketplace-operator-79b997595-4fqc5\" (UID: \"016cbd62-23a6-413f-82b5-b806746e2b01\") " pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418479 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/772ec8a8-abe6-43b5-9239-2331a455b602-cert\") pod \"ingress-canary-m7wsv\" (UID: \"772ec8a8-abe6-43b5-9239-2331a455b602\") " pod="openshift-ingress-canary/ingress-canary-m7wsv" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418495 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/9a1ef083-bcc3-48fc-9d9d-41008d4673b0-profile-collector-cert\") pod \"catalog-operator-68c6474976-brsfz\" (UID: \"9a1ef083-bcc3-48fc-9d9d-41008d4673b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418510 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkzfk\" (UniqueName: \"kubernetes.io/projected/4cf9261b-564c-45e7-a789-0d31a034ad44-kube-api-access-jkzfk\") pod \"olm-operator-6b444d44fb-xp4kv\" (UID: \"4cf9261b-564c-45e7-a789-0d31a034ad44\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418535 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/92c985ef-3317-4220-819d-f482f8b50d60-apiservice-cert\") pod \"packageserver-d55dfcdfc-kg7xw\" (UID: \"92c985ef-3317-4220-819d-f482f8b50d60\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418571 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4cf9261b-564c-45e7-a789-0d31a034ad44-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xp4kv\" (UID: \"4cf9261b-564c-45e7-a789-0d31a034ad44\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418597 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a181e9f-1fda-43f1-a42e-c64602afbcd2-serving-cert\") pod \"service-ca-operator-777779d784-94pzm\" (UID: \"2a181e9f-1fda-43f1-a42e-c64602afbcd2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-94pzm" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 
12:13:26.418619 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f73981c7-0a7b-4a4d-87f8-9a73e4d794d6-mountpoint-dir\") pod \"csi-hostpathplugin-4xm45\" (UID: \"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6\") " pod="hostpath-provisioner/csi-hostpathplugin-4xm45" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418642 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/63edbbe4-08b8-4ea9-987c-83ebca62fc5e-images\") pod \"machine-config-operator-74547568cd-spcss\" (UID: \"63edbbe4-08b8-4ea9-987c-83ebca62fc5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spcss" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418679 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b1442992-fe43-43be-a43b-48f80db66418-secret-volume\") pod \"collect-profiles-29563920-nnwnx\" (UID: \"b1442992-fe43-43be-a43b-48f80db66418\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418729 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418760 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/63edbbe4-08b8-4ea9-987c-83ebca62fc5e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-spcss\" (UID: \"63edbbe4-08b8-4ea9-987c-83ebca62fc5e\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spcss" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418786 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k95tl\" (UniqueName: \"kubernetes.io/projected/772ec8a8-abe6-43b5-9239-2331a455b602-kube-api-access-k95tl\") pod \"ingress-canary-m7wsv\" (UID: \"772ec8a8-abe6-43b5-9239-2331a455b602\") " pod="openshift-ingress-canary/ingress-canary-m7wsv" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418814 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4cf9261b-564c-45e7-a789-0d31a034ad44-srv-cert\") pod \"olm-operator-6b444d44fb-xp4kv\" (UID: \"4cf9261b-564c-45e7-a789-0d31a034ad44\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418845 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/016cbd62-23a6-413f-82b5-b806746e2b01-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4fqc5\" (UID: \"016cbd62-23a6-413f-82b5-b806746e2b01\") " pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418863 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/92c985ef-3317-4220-819d-f482f8b50d60-tmpfs\") pod \"packageserver-d55dfcdfc-kg7xw\" (UID: \"92c985ef-3317-4220-819d-f482f8b50d60\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418882 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcvrk\" (UniqueName: \"kubernetes.io/projected/cd0cd970-7aa7-4c43-a967-513cd197dc38-kube-api-access-rcvrk\") 
pod \"service-ca-9c57cc56f-f6q6x\" (UID: \"cd0cd970-7aa7-4c43-a967-513cd197dc38\") " pod="openshift-service-ca/service-ca-9c57cc56f-f6q6x" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418901 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f73981c7-0a7b-4a4d-87f8-9a73e4d794d6-plugins-dir\") pod \"csi-hostpathplugin-4xm45\" (UID: \"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6\") " pod="hostpath-provisioner/csi-hostpathplugin-4xm45" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418932 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8-certs\") pod \"machine-config-server-gscm9\" (UID: \"b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8\") " pod="openshift-machine-config-operator/machine-config-server-gscm9" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418953 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4955\" (UniqueName: \"kubernetes.io/projected/9a1ef083-bcc3-48fc-9d9d-41008d4673b0-kube-api-access-c4955\") pod \"catalog-operator-68c6474976-brsfz\" (UID: \"9a1ef083-bcc3-48fc-9d9d-41008d4673b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.418986 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-898g8\" (UniqueName: \"kubernetes.io/projected/b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8-kube-api-access-898g8\") pod \"machine-config-server-gscm9\" (UID: \"b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8\") " pod="openshift-machine-config-operator/machine-config-server-gscm9" Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419021 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcqm7\" (UniqueName: 
\"kubernetes.io/projected/803876de-64f6-4347-8ea5-6d2d8f87e828-kube-api-access-tcqm7\") pod \"auto-csr-approver-29563932-bzf8p\" (UID: \"803876de-64f6-4347-8ea5-6d2d8f87e828\") " pod="openshift-infra/auto-csr-approver-29563932-bzf8p"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419051 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c4c78e8-738c-4105-8b56-9f8f900b496e-config-volume\") pod \"dns-default-8hpvl\" (UID: \"5c4c78e8-738c-4105-8b56-9f8f900b496e\") " pod="openshift-dns/dns-default-8hpvl"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419072 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1442992-fe43-43be-a43b-48f80db66418-config-volume\") pod \"collect-profiles-29563920-nnwnx\" (UID: \"b1442992-fe43-43be-a43b-48f80db66418\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419099 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/016cbd62-23a6-413f-82b5-b806746e2b01-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4fqc5\" (UID: \"016cbd62-23a6-413f-82b5-b806746e2b01\") " pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419116 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe3bf0b1-4566-4a39-8629-d4d268bd5977-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dzg6s\" (UID: \"fe3bf0b1-4566-4a39-8629-d4d268bd5977\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dzg6s"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419135 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f73981c7-0a7b-4a4d-87f8-9a73e4d794d6-socket-dir\") pod \"csi-hostpathplugin-4xm45\" (UID: \"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6\") " pod="hostpath-provisioner/csi-hostpathplugin-4xm45"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419154 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8-node-bootstrap-token\") pod \"machine-config-server-gscm9\" (UID: \"b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8\") " pod="openshift-machine-config-operator/machine-config-server-gscm9"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419171 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjsml\" (UniqueName: \"kubernetes.io/projected/5c4c78e8-738c-4105-8b56-9f8f900b496e-kube-api-access-bjsml\") pod \"dns-default-8hpvl\" (UID: \"5c4c78e8-738c-4105-8b56-9f8f900b496e\") " pod="openshift-dns/dns-default-8hpvl"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419188 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mxhd\" (UniqueName: \"kubernetes.io/projected/fe3bf0b1-4566-4a39-8629-d4d268bd5977-kube-api-access-6mxhd\") pod \"package-server-manager-789f6589d5-dzg6s\" (UID: \"fe3bf0b1-4566-4a39-8629-d4d268bd5977\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dzg6s"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419206 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9a1ef083-bcc3-48fc-9d9d-41008d4673b0-srv-cert\") pod \"catalog-operator-68c6474976-brsfz\" (UID: \"9a1ef083-bcc3-48fc-9d9d-41008d4673b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419221 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpbw9\" (UniqueName: \"kubernetes.io/projected/f73981c7-0a7b-4a4d-87f8-9a73e4d794d6-kube-api-access-wpbw9\") pod \"csi-hostpathplugin-4xm45\" (UID: \"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6\") " pod="hostpath-provisioner/csi-hostpathplugin-4xm45"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419240 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/92c985ef-3317-4220-819d-f482f8b50d60-webhook-cert\") pod \"packageserver-d55dfcdfc-kg7xw\" (UID: \"92c985ef-3317-4220-819d-f482f8b50d60\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419269 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a181e9f-1fda-43f1-a42e-c64602afbcd2-config\") pod \"service-ca-operator-777779d784-94pzm\" (UID: \"2a181e9f-1fda-43f1-a42e-c64602afbcd2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-94pzm"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419282 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f73981c7-0a7b-4a4d-87f8-9a73e4d794d6-csi-data-dir\") pod \"csi-hostpathplugin-4xm45\" (UID: \"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6\") " pod="hostpath-provisioner/csi-hostpathplugin-4xm45"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419302 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cd0cd970-7aa7-4c43-a967-513cd197dc38-signing-cabundle\") pod \"service-ca-9c57cc56f-f6q6x\" (UID: \"cd0cd970-7aa7-4c43-a967-513cd197dc38\") " pod="openshift-service-ca/service-ca-9c57cc56f-f6q6x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419319 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c4c78e8-738c-4105-8b56-9f8f900b496e-metrics-tls\") pod \"dns-default-8hpvl\" (UID: \"5c4c78e8-738c-4105-8b56-9f8f900b496e\") " pod="openshift-dns/dns-default-8hpvl"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419334 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63edbbe4-08b8-4ea9-987c-83ebca62fc5e-proxy-tls\") pod \"machine-config-operator-74547568cd-spcss\" (UID: \"63edbbe4-08b8-4ea9-987c-83ebca62fc5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spcss"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419347 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cd0cd970-7aa7-4c43-a967-513cd197dc38-signing-key\") pod \"service-ca-9c57cc56f-f6q6x\" (UID: \"cd0cd970-7aa7-4c43-a967-513cd197dc38\") " pod="openshift-service-ca/service-ca-9c57cc56f-f6q6x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419364 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7fpd\" (UniqueName: \"kubernetes.io/projected/2a181e9f-1fda-43f1-a42e-c64602afbcd2-kube-api-access-v7fpd\") pod \"service-ca-operator-777779d784-94pzm\" (UID: \"2a181e9f-1fda-43f1-a42e-c64602afbcd2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-94pzm"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.419849 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f73981c7-0a7b-4a4d-87f8-9a73e4d794d6-registration-dir\") pod \"csi-hostpathplugin-4xm45\" (UID: \"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6\") " pod="hostpath-provisioner/csi-hostpathplugin-4xm45"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.423361 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/772ec8a8-abe6-43b5-9239-2331a455b602-cert\") pod \"ingress-canary-m7wsv\" (UID: \"772ec8a8-abe6-43b5-9239-2331a455b602\") " pod="openshift-ingress-canary/ingress-canary-m7wsv"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.424087 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c4c78e8-738c-4105-8b56-9f8f900b496e-config-volume\") pod \"dns-default-8hpvl\" (UID: \"5c4c78e8-738c-4105-8b56-9f8f900b496e\") " pod="openshift-dns/dns-default-8hpvl"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.424762 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1442992-fe43-43be-a43b-48f80db66418-config-volume\") pod \"collect-profiles-29563920-nnwnx\" (UID: \"b1442992-fe43-43be-a43b-48f80db66418\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.426029 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/016cbd62-23a6-413f-82b5-b806746e2b01-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4fqc5\" (UID: \"016cbd62-23a6-413f-82b5-b806746e2b01\") " pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.428852 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe3bf0b1-4566-4a39-8629-d4d268bd5977-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dzg6s\" (UID: \"fe3bf0b1-4566-4a39-8629-d4d268bd5977\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dzg6s"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.428939 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f73981c7-0a7b-4a4d-87f8-9a73e4d794d6-socket-dir\") pod \"csi-hostpathplugin-4xm45\" (UID: \"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6\") " pod="hostpath-provisioner/csi-hostpathplugin-4xm45"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.429343 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9a1ef083-bcc3-48fc-9d9d-41008d4673b0-profile-collector-cert\") pod \"catalog-operator-68c6474976-brsfz\" (UID: \"9a1ef083-bcc3-48fc-9d9d-41008d4673b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.429693 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8dec33ef-2627-4878-b7b6-f980772125b8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-g5z5h\" (UID: \"8dec33ef-2627-4878-b7b6-f980772125b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.430317 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f73981c7-0a7b-4a4d-87f8-9a73e4d794d6-mountpoint-dir\") pod \"csi-hostpathplugin-4xm45\" (UID: \"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6\") " pod="hostpath-provisioner/csi-hostpathplugin-4xm45"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.431124 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/63edbbe4-08b8-4ea9-987c-83ebca62fc5e-images\") pod \"machine-config-operator-74547568cd-spcss\" (UID: \"63edbbe4-08b8-4ea9-987c-83ebca62fc5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spcss"
Mar 18 12:13:26 crc kubenswrapper[4843]: E0318 12:13:26.432006 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:26.931991767 +0000 UTC m=+240.647817291 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.432250 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/92c985ef-3317-4220-819d-f482f8b50d60-apiservice-cert\") pod \"packageserver-d55dfcdfc-kg7xw\" (UID: \"92c985ef-3317-4220-819d-f482f8b50d60\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.433983 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a181e9f-1fda-43f1-a42e-c64602afbcd2-config\") pod \"service-ca-operator-777779d784-94pzm\" (UID: \"2a181e9f-1fda-43f1-a42e-c64602afbcd2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-94pzm"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.434456 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f73981c7-0a7b-4a4d-87f8-9a73e4d794d6-csi-data-dir\") pod \"csi-hostpathplugin-4xm45\" (UID: \"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6\") " pod="hostpath-provisioner/csi-hostpathplugin-4xm45"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.437841 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/63edbbe4-08b8-4ea9-987c-83ebca62fc5e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-spcss\" (UID: \"63edbbe4-08b8-4ea9-987c-83ebca62fc5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spcss"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.438088 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/016cbd62-23a6-413f-82b5-b806746e2b01-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4fqc5\" (UID: \"016cbd62-23a6-413f-82b5-b806746e2b01\") " pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.438101 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4cf9261b-564c-45e7-a789-0d31a034ad44-srv-cert\") pod \"olm-operator-6b444d44fb-xp4kv\" (UID: \"4cf9261b-564c-45e7-a789-0d31a034ad44\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.438212 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f73981c7-0a7b-4a4d-87f8-9a73e4d794d6-plugins-dir\") pod \"csi-hostpathplugin-4xm45\" (UID: \"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6\") " pod="hostpath-provisioner/csi-hostpathplugin-4xm45"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.438361 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/92c985ef-3317-4220-819d-f482f8b50d60-tmpfs\") pod \"packageserver-d55dfcdfc-kg7xw\" (UID: \"92c985ef-3317-4220-819d-f482f8b50d60\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.439121 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a181e9f-1fda-43f1-a42e-c64602afbcd2-serving-cert\") pod \"service-ca-operator-777779d784-94pzm\" (UID: \"2a181e9f-1fda-43f1-a42e-c64602afbcd2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-94pzm"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.439447 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9a1ef083-bcc3-48fc-9d9d-41008d4673b0-srv-cert\") pod \"catalog-operator-68c6474976-brsfz\" (UID: \"9a1ef083-bcc3-48fc-9d9d-41008d4673b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.440880 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63edbbe4-08b8-4ea9-987c-83ebca62fc5e-proxy-tls\") pod \"machine-config-operator-74547568cd-spcss\" (UID: \"63edbbe4-08b8-4ea9-987c-83ebca62fc5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spcss"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.441046 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8-certs\") pod \"machine-config-server-gscm9\" (UID: \"b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8\") " pod="openshift-machine-config-operator/machine-config-server-gscm9"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.441254 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c4c78e8-738c-4105-8b56-9f8f900b496e-metrics-tls\") pod \"dns-default-8hpvl\" (UID: \"5c4c78e8-738c-4105-8b56-9f8f900b496e\") " pod="openshift-dns/dns-default-8hpvl"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.441848 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cd0cd970-7aa7-4c43-a967-513cd197dc38-signing-cabundle\") pod \"service-ca-9c57cc56f-f6q6x\" (UID: \"cd0cd970-7aa7-4c43-a967-513cd197dc38\") " pod="openshift-service-ca/service-ca-9c57cc56f-f6q6x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.443888 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b1442992-fe43-43be-a43b-48f80db66418-secret-volume\") pod \"collect-profiles-29563920-nnwnx\" (UID: \"b1442992-fe43-43be-a43b-48f80db66418\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.444578 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8-node-bootstrap-token\") pod \"machine-config-server-gscm9\" (UID: \"b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8\") " pod="openshift-machine-config-operator/machine-config-server-gscm9"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.447549 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/92c985ef-3317-4220-819d-f482f8b50d60-webhook-cert\") pod \"packageserver-d55dfcdfc-kg7xw\" (UID: \"92c985ef-3317-4220-819d-f482f8b50d60\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.450362 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4cf9261b-564c-45e7-a789-0d31a034ad44-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xp4kv\" (UID: \"4cf9261b-564c-45e7-a789-0d31a034ad44\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.460990 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cd0cd970-7aa7-4c43-a967-513cd197dc38-signing-key\") pod \"service-ca-9c57cc56f-f6q6x\" (UID: \"cd0cd970-7aa7-4c43-a967-513cd197dc38\") " pod="openshift-service-ca/service-ca-9c57cc56f-f6q6x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.496217 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv9qp\" (UniqueName: \"kubernetes.io/projected/59c312b9-d2a3-4404-9ec7-af59c0faf02a-kube-api-access-pv9qp\") pod \"migrator-59844c95c7-t28gc\" (UID: \"59c312b9-d2a3-4404-9ec7-af59c0faf02a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t28gc"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.498546 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bqfhf"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.552310 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bacfe411-3a90-4d97-80d5-c39a102cdd9b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fd878\" (UID: \"bacfe411-3a90-4d97-80d5-c39a102cdd9b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fd878"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.555796 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwnqt\" (UniqueName: \"kubernetes.io/projected/f14cbab0-e920-47ad-9a17-7d5699e98eee-kube-api-access-mwnqt\") pod \"cluster-image-registry-operator-dc59b4c8b-5fzqw\" (UID: \"f14cbab0-e920-47ad-9a17-7d5699e98eee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.560339 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6093d2c4-d78c-4522-a95a-224064548148-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jm8sq\" (UID: \"6093d2c4-d78c-4522-a95a-224064548148\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jm8sq"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.581133 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-pmcbk"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.759506 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/13295157-4a57-4e7f-9ff4-13c2a4381c27-bound-sa-token\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.777934 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69sm7\" (UniqueName: \"kubernetes.io/projected/9d8e49ec-9849-46fe-9a4b-6bbec11b2736-kube-api-access-69sm7\") pod \"router-default-5444994796-csz7z\" (UID: \"9d8e49ec-9849-46fe-9a4b-6bbec11b2736\") " pod="openshift-ingress/router-default-5444994796-csz7z"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.783175 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:26 crc kubenswrapper[4843]: E0318 12:13:26.783598 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:27.28358312 +0000 UTC m=+240.999408644 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.783848 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t28gc"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.785015 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fd878"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.785322 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gf5bj"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.785637 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.786105 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-csz7z"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.795376 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jm8sq"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.812979 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcvrk\" (UniqueName: \"kubernetes.io/projected/cd0cd970-7aa7-4c43-a967-513cd197dc38-kube-api-access-rcvrk\") pod \"service-ca-9c57cc56f-f6q6x\" (UID: \"cd0cd970-7aa7-4c43-a967-513cd197dc38\") " pod="openshift-service-ca/service-ca-9c57cc56f-f6q6x"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.818471 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xnkn\" (UniqueName: \"kubernetes.io/projected/b1442992-fe43-43be-a43b-48f80db66418-kube-api-access-5xnkn\") pod \"collect-profiles-29563920-nnwnx\" (UID: \"b1442992-fe43-43be-a43b-48f80db66418\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.826780 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9vmk\" (UniqueName: \"kubernetes.io/projected/e3fa3986-70fd-4d58-a04f-ddeec535f493-kube-api-access-s9vmk\") pod \"machine-config-controller-84d6567774-sxdqp\" (UID: \"e3fa3986-70fd-4d58-a04f-ddeec535f493\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sxdqp"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.827684 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjsml\" (UniqueName: \"kubernetes.io/projected/5c4c78e8-738c-4105-8b56-9f8f900b496e-kube-api-access-bjsml\") pod \"dns-default-8hpvl\" (UID: \"5c4c78e8-738c-4105-8b56-9f8f900b496e\") " pod="openshift-dns/dns-default-8hpvl"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.828297 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv422\" (UniqueName: \"kubernetes.io/projected/016cbd62-23a6-413f-82b5-b806746e2b01-kube-api-access-zv422\") pod \"marketplace-operator-79b997595-4fqc5\" (UID: \"016cbd62-23a6-413f-82b5-b806746e2b01\") " pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.829548 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-898g8\" (UniqueName: \"kubernetes.io/projected/b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8-kube-api-access-898g8\") pod \"machine-config-server-gscm9\" (UID: \"b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8\") " pod="openshift-machine-config-operator/machine-config-server-gscm9"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.831000 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkzfk\" (UniqueName: \"kubernetes.io/projected/4cf9261b-564c-45e7-a789-0d31a034ad44-kube-api-access-jkzfk\") pod \"olm-operator-6b444d44fb-xp4kv\" (UID: \"4cf9261b-564c-45e7-a789-0d31a034ad44\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.856263 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sxdqp"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.869749 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k95tl\" (UniqueName: \"kubernetes.io/projected/772ec8a8-abe6-43b5-9239-2331a455b602-kube-api-access-k95tl\") pod \"ingress-canary-m7wsv\" (UID: \"772ec8a8-abe6-43b5-9239-2331a455b602\") " pod="openshift-ingress-canary/ingress-canary-m7wsv"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.870445 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mxhd\" (UniqueName: \"kubernetes.io/projected/fe3bf0b1-4566-4a39-8629-d4d268bd5977-kube-api-access-6mxhd\") pod \"package-server-manager-789f6589d5-dzg6s\" (UID: \"fe3bf0b1-4566-4a39-8629-d4d268bd5977\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dzg6s"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.871891 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7fpd\" (UniqueName: \"kubernetes.io/projected/2a181e9f-1fda-43f1-a42e-c64602afbcd2-kube-api-access-v7fpd\") pod \"service-ca-operator-777779d784-94pzm\" (UID: \"2a181e9f-1fda-43f1-a42e-c64602afbcd2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-94pzm"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.872539 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdhqh\" (UniqueName: \"kubernetes.io/projected/92c985ef-3317-4220-819d-f482f8b50d60-kube-api-access-cdhqh\") pod \"packageserver-d55dfcdfc-kg7xw\" (UID: \"92c985ef-3317-4220-819d-f482f8b50d60\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.873041 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcqm7\" (UniqueName: \"kubernetes.io/projected/803876de-64f6-4347-8ea5-6d2d8f87e828-kube-api-access-tcqm7\") pod \"auto-csr-approver-29563932-bzf8p\" (UID: \"803876de-64f6-4347-8ea5-6d2d8f87e828\") " pod="openshift-infra/auto-csr-approver-29563932-bzf8p"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.879094 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f14cbab0-e920-47ad-9a17-7d5699e98eee-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5fzqw\" (UID: \"f14cbab0-e920-47ad-9a17-7d5699e98eee\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.881980 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnfbt\" (UniqueName: \"kubernetes.io/projected/63edbbe4-08b8-4ea9-987c-83ebca62fc5e-kube-api-access-vnfbt\") pod \"machine-config-operator-74547568cd-spcss\" (UID: \"63edbbe4-08b8-4ea9-987c-83ebca62fc5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spcss"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.885861 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:26 crc kubenswrapper[4843]: E0318 12:13:26.889304 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:27.389271606 +0000 UTC m=+241.105097150 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.890916 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpbw9\" (UniqueName: \"kubernetes.io/projected/f73981c7-0a7b-4a4d-87f8-9a73e4d794d6-kube-api-access-wpbw9\") pod \"csi-hostpathplugin-4xm45\" (UID: \"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6\") " pod="hostpath-provisioner/csi-hostpathplugin-4xm45"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.891901 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spcss"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.893320 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.895371 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4955\" (UniqueName: \"kubernetes.io/projected/9a1ef083-bcc3-48fc-9d9d-41008d4673b0-kube-api-access-c4955\") pod \"catalog-operator-68c6474976-brsfz\" (UID: \"9a1ef083-bcc3-48fc-9d9d-41008d4673b0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.918756 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.920044 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.928821 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dzg6s"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.947444 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563932-bzf8p"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.948213 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw"
Mar 18 12:13:26 crc kubenswrapper[4843]: I0318 12:13:26.948637 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz"
Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.009676 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:27 crc kubenswrapper[4843]: E0318 12:13:27.010173 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:27.51015667 +0000 UTC m=+241.225982194 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.019935 4843 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.021632 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.021791 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.023019 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.037168 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.037363 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m7wsv"
Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.037947 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-f6q6x"
Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.050301 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx"
Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.050712 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-94pzm"
Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.051336 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4xm45"
Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.057056 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zsx6v" event={"ID":"93fe4e7a-e8dc-4b64-9ff1-dc50ce085621","Type":"ContainerStarted","Data":"47c2e124d64883ad64aea14bf4c3af13ccbb23d8e7404f29a948d7e42969ba68"}
Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.082492 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.082518 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.090492 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8hpvl"
Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.092151 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-gscm9"
Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.113606 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:27 crc kubenswrapper[4843]: E0318 12:13:27.113994 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:27.613981682 +0000 UTC m=+241.329807206 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.164904 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.170341 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw" Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.222920 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:27 crc kubenswrapper[4843]: E0318 12:13:27.227823 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:27.727776712 +0000 UTC m=+241.443602236 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.346458 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:27 crc kubenswrapper[4843]: E0318 12:13:27.347001 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:27.846980368 +0000 UTC m=+241.562805892 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.453538 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:27 crc kubenswrapper[4843]: E0318 12:13:27.454079 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:27.954061274 +0000 UTC m=+241.669886798 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.480210 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj"] Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.613116 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:27 crc kubenswrapper[4843]: E0318 12:13:27.613710 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:28.113688645 +0000 UTC m=+241.829514169 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.729171 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:27 crc kubenswrapper[4843]: E0318 12:13:27.729424 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:28.22939576 +0000 UTC m=+241.945221284 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.729672 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:27 crc kubenswrapper[4843]: E0318 12:13:27.730201 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:28.230190323 +0000 UTC m=+241.946015847 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.830271 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:27 crc kubenswrapper[4843]: E0318 12:13:27.830858 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:28.330832433 +0000 UTC m=+242.046657957 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:27 crc kubenswrapper[4843]: I0318 12:13:27.931977 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:27 crc kubenswrapper[4843]: E0318 12:13:27.932446 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:28.432428671 +0000 UTC m=+242.148254195 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:28 crc kubenswrapper[4843]: I0318 12:13:28.033502 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:28 crc kubenswrapper[4843]: E0318 12:13:28.033900 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:28.533881024 +0000 UTC m=+242.249706548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:28 crc kubenswrapper[4843]: I0318 12:13:28.071124 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-gscm9" event={"ID":"b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8","Type":"ContainerStarted","Data":"3b8812c2f1901d5ce77a5acff3b307eb4711f9d16fa66d4c1b06dabc9f1da63f"} Mar 18 12:13:28 crc kubenswrapper[4843]: I0318 12:13:28.076230 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" event={"ID":"1da258ba-cd08-4cb9-90bb-18675d625fd1","Type":"ContainerStarted","Data":"d44e7664994e9cbe0c971f8b7e75a01aaa5eb1e52661a150ee2d459bb0c3547d"} Mar 18 12:13:28 crc kubenswrapper[4843]: I0318 12:13:28.084764 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zsx6v" event={"ID":"93fe4e7a-e8dc-4b64-9ff1-dc50ce085621","Type":"ContainerStarted","Data":"688a6cd88caae5776b79c64d74969451b4d2455e1c40884ab772f9b863f34aa8"} Mar 18 12:13:28 crc kubenswrapper[4843]: I0318 12:13:28.086904 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-csz7z" event={"ID":"9d8e49ec-9849-46fe-9a4b-6bbec11b2736","Type":"ContainerStarted","Data":"7eacedf67709d02b410b4b7eb2b5c7f5dd192d4dd6e94335f26dc68db734367c"} Mar 18 12:13:28 crc kubenswrapper[4843]: I0318 12:13:28.157258 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:28 crc kubenswrapper[4843]: E0318 12:13:28.158191 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:28.658172817 +0000 UTC m=+242.373998341 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:28 crc kubenswrapper[4843]: I0318 12:13:28.261918 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:28 crc kubenswrapper[4843]: E0318 12:13:28.262256 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:28.762226976 +0000 UTC m=+242.478052620 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:28 crc kubenswrapper[4843]: I0318 12:13:28.262441 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:28 crc kubenswrapper[4843]: E0318 12:13:28.262869 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:28.762848524 +0000 UTC m=+242.478674228 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:28 crc kubenswrapper[4843]: I0318 12:13:28.364141 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:28 crc kubenswrapper[4843]: E0318 12:13:28.364633 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:28.864612257 +0000 UTC m=+242.580437781 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:28 crc kubenswrapper[4843]: I0318 12:13:28.466770 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:28 crc kubenswrapper[4843]: E0318 12:13:28.467124 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:28.967110731 +0000 UTC m=+242.682936255 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:28 crc kubenswrapper[4843]: I0318 12:13:28.570101 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:28 crc kubenswrapper[4843]: E0318 12:13:28.570855 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:29.07083838 +0000 UTC m=+242.786663904 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:28 crc kubenswrapper[4843]: I0318 12:13:28.733016 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:28 crc kubenswrapper[4843]: E0318 12:13:28.733446 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:29.233428786 +0000 UTC m=+242.949254310 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:28 crc kubenswrapper[4843]: I0318 12:13:28.850684 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:28 crc kubenswrapper[4843]: E0318 12:13:28.851156 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:29.351137419 +0000 UTC m=+243.066962943 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:28 crc kubenswrapper[4843]: I0318 12:13:28.979024 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:28 crc kubenswrapper[4843]: E0318 12:13:28.979468 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:29.479434956 +0000 UTC m=+243.195260480 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:29 crc kubenswrapper[4843]: I0318 12:13:29.113061 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:29 crc kubenswrapper[4843]: E0318 12:13:29.113672 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:29.613631144 +0000 UTC m=+243.329456668 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:29 crc kubenswrapper[4843]: I0318 12:13:29.179162 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-gscm9" event={"ID":"b6ef0d8d-7d44-43ee-9cf6-338a9b1cf6a8","Type":"ContainerStarted","Data":"5aa72eda9b7a1716a42f415deb2ae08a50ab8b8dc1276f7e7b2e1fb444c45b0b"}
Mar 18 12:13:29 crc kubenswrapper[4843]: I0318 12:13:29.191094 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-csz7z" event={"ID":"9d8e49ec-9849-46fe-9a4b-6bbec11b2736","Type":"ContainerStarted","Data":"23a1bf1f777a146e02e60ea91b5d413c22ad87c0e98e3f341702e1277019dff4"}
Mar 18 12:13:29 crc kubenswrapper[4843]: I0318 12:13:29.215275 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:29 crc kubenswrapper[4843]: E0318 12:13:29.216680 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:29.716624582 +0000 UTC m=+243.432453387 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:29 crc kubenswrapper[4843]: I0318 12:13:29.316935 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:29 crc kubenswrapper[4843]: E0318 12:13:29.318336 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:29.818316353 +0000 UTC m=+243.534141877 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:29 crc kubenswrapper[4843]: I0318 12:13:29.443192 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:29 crc kubenswrapper[4843]: E0318 12:13:29.443514 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:29.943500991 +0000 UTC m=+243.659326505 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:29 crc kubenswrapper[4843]: I0318 12:13:29.496772 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-gscm9" podStartSLOduration=6.496743616 podStartE2EDuration="6.496743616s" podCreationTimestamp="2026-03-18 12:13:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:29.488284992 +0000 UTC m=+243.204110516" watchObservedRunningTime="2026-03-18 12:13:29.496743616 +0000 UTC m=+243.212569140"
Mar 18 12:13:29 crc kubenswrapper[4843]: I0318 12:13:29.545470 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-csz7z" podStartSLOduration=179.545444099 podStartE2EDuration="2m59.545444099s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:29.537342196 +0000 UTC m=+243.253167720" watchObservedRunningTime="2026-03-18 12:13:29.545444099 +0000 UTC m=+243.261269623"
Mar 18 12:13:29 crc kubenswrapper[4843]: I0318 12:13:29.717523 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:29 crc kubenswrapper[4843]: E0318 12:13:29.717975 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:30.217947421 +0000 UTC m=+243.933772945 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:29 crc kubenswrapper[4843]: I0318 12:13:29.718172 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:29 crc kubenswrapper[4843]: E0318 12:13:29.718924 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:30.218915509 +0000 UTC m=+243.934741033 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:29 crc kubenswrapper[4843]: I0318 12:13:29.926945 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hqt5c"]
Mar 18 12:13:29 crc kubenswrapper[4843]: I0318 12:13:29.928315 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-csz7z"
Mar 18 12:13:29 crc kubenswrapper[4843]: I0318 12:13:29.930491 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:29 crc kubenswrapper[4843]: E0318 12:13:29.931540 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:30.431497746 +0000 UTC m=+244.147323270 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:29 crc kubenswrapper[4843]: I0318 12:13:29.934442 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:29 crc kubenswrapper[4843]: E0318 12:13:29.934956 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:30.434932195 +0000 UTC m=+244.150757719 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:29 crc kubenswrapper[4843]: I0318 12:13:29.939723 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg"]
Mar 18 12:13:29 crc kubenswrapper[4843]: I0318 12:13:29.940155 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ppqpl"]
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.038926 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:30 crc kubenswrapper[4843]: E0318 12:13:30.039428 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:30.539405196 +0000 UTC m=+244.255230720 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.159367 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:30 crc kubenswrapper[4843]: E0318 12:13:30.159797 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:30.659780396 +0000 UTC m=+244.375605920 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.260440 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:30 crc kubenswrapper[4843]: E0318 12:13:30.260921 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:30.76087226 +0000 UTC m=+244.476697784 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.307804 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-g7fkz"]
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.329504 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zsx6v" event={"ID":"93fe4e7a-e8dc-4b64-9ff1-dc50ce085621","Type":"ContainerStarted","Data":"d9556b8adf9837f6f69cbdf6aca35dad8a360f0e94339f9db985fbc299a0e540"}
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.329689 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.329805 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.343621 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" event={"ID":"1da258ba-cd08-4cb9-90bb-18675d625fd1","Type":"ContainerStarted","Data":"c03c17b0a5cc07693a1d6178fc57c492bd0c9d00136f81b2cd4bc07238b6e1e3"}
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.344260 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj"
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.348133 4843 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-7n8zj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/healthz\": dial tcp 10.217.0.44:8443: connect: connection refused" start-of-body=
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.348355 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" podUID="1da258ba-cd08-4cb9-90bb-18675d625fd1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": dial tcp 10.217.0.44:8443: connect: connection refused"
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.457472 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:30 crc kubenswrapper[4843]: E0318 12:13:30.468102 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:30.968071252 +0000 UTC m=+244.683896776 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.540753 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" podStartSLOduration=180.540735786 podStartE2EDuration="3m0.540735786s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:30.539955664 +0000 UTC m=+244.255781198" watchObservedRunningTime="2026-03-18 12:13:30.540735786 +0000 UTC m=+244.256561310"
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.540843 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zsx6v" podStartSLOduration=181.540839569 podStartE2EDuration="3m1.540839569s" podCreationTimestamp="2026-03-18 12:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:30.449235109 +0000 UTC m=+244.165060633" watchObservedRunningTime="2026-03-18 12:13:30.540839569 +0000 UTC m=+244.256665083"
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.558702 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:30 crc kubenswrapper[4843]: E0318 12:13:30.559155 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:31.059138686 +0000 UTC m=+244.774964210 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.660545 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:30 crc kubenswrapper[4843]: E0318 12:13:30.661119 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:31.161103885 +0000 UTC m=+244.876929399 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.775893 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:30 crc kubenswrapper[4843]: E0318 12:13:30.776298 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:31.276280555 +0000 UTC m=+244.992106079 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.787854 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.787916 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Mar 18 12:13:30 crc kubenswrapper[4843]: I0318 12:13:30.878129 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:30 crc kubenswrapper[4843]: E0318 12:13:30.878497 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:31.37848234 +0000 UTC m=+245.094307864 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:30.980109 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:31 crc kubenswrapper[4843]: E0318 12:13:30.980527 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:31.480506971 +0000 UTC m=+245.196332495 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.039073 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k6c87"]
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.073936 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-d9qdf"]
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.074020 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv7v7"]
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.103121 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:31 crc kubenswrapper[4843]: E0318 12:13:31.103608 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:31.603592428 +0000 UTC m=+245.319417952 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.205080 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:31 crc kubenswrapper[4843]: E0318 12:13:31.205274 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:31.705242148 +0000 UTC m=+245.421067672 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.205695 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:31 crc kubenswrapper[4843]: E0318 12:13:31.206204 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:31.706187705 +0000 UTC m=+245.422013409 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.307304 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:31 crc kubenswrapper[4843]: E0318 12:13:31.307764 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:31.807748012 +0000 UTC m=+245.523573526 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.357637 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" event={"ID":"061ecf07-8167-4652-8182-5779e5502bbf","Type":"ContainerStarted","Data":"9c74208936b3d5da5bbf785fd6df4ad4876d32e81b24a1586af5360b2f7e2bc4"}
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.357695 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" event={"ID":"061ecf07-8167-4652-8182-5779e5502bbf","Type":"ContainerStarted","Data":"acea307aa487831f35c2e60a269b10c3dbb17a5ab334de85876da1d32083fc33"}
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.358362 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz"
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.361234 4843 generic.go:334] "Generic (PLEG): container finished" podID="e8bd6721-4cfb-4143-94f5-ad1f9fb985fb" containerID="572fe1ce450610121947d0b0209fda2d109dde4d6c9ebf392778c70585e7aa4a" exitCode=0
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.361296 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" event={"ID":"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb","Type":"ContainerDied","Data":"572fe1ce450610121947d0b0209fda2d109dde4d6c9ebf392778c70585e7aa4a"}
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.361315 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" event={"ID":"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb","Type":"ContainerStarted","Data":"39b79e0e61e9315e06781d68fca5a63514907e3155410552238a8f203a317ad9"}
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.363093 4843 generic.go:334] "Generic (PLEG): container finished" podID="82377b06-486d-46d6-b28a-0df0d6c86531" containerID="ddde4f1d821bafa4a3eb951d19e2711079a9c668f33c53383051e0be1570ac42" exitCode=0
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.363792 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" event={"ID":"82377b06-486d-46d6-b28a-0df0d6c86531","Type":"ContainerDied","Data":"ddde4f1d821bafa4a3eb951d19e2711079a9c668f33c53383051e0be1570ac42"}
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.363823 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" event={"ID":"82377b06-486d-46d6-b28a-0df0d6c86531","Type":"ContainerStarted","Data":"2e1e92a5ac08e245d34d57f87b6bf151408fd35e097524cedb8e59cc018f31c7"}
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.364461 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87" event={"ID":"68d7cbd5-5cc5-4648-b94f-f256e12ae7d3","Type":"ContainerStarted","Data":"bfe0849e071e37ce00c6eaee6dd78063f61f4b36847bed6a5de3ee2922c2a768"}
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.367005 4843 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-g7fkz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.367286 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ppqpl" event={"ID":"b999a5c0-f4e8-499b-8f81-283c3a2cf495","Type":"ContainerStarted","Data":"6e1888400dfedd6fd9fb5e94372b9dd93191f93afb383a975f3992e3b221c86a"}
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.367358 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ppqpl" event={"ID":"b999a5c0-f4e8-499b-8f81-283c3a2cf495","Type":"ContainerStarted","Data":"97d1c79c4916e9ac1e3fde35595152116b62cecdf61234a4b1b4d938f7b3cae4"}
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.368144 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" podUID="061ecf07-8167-4652-8182-5779e5502bbf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.369928 4843 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-7n8zj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/healthz\": dial tcp 10.217.0.44:8443: connect: connection refused" start-of-body=
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.369989 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" podUID="1da258ba-cd08-4cb9-90bb-18675d625fd1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": dial tcp 10.217.0.44:8443: connect: connection refused"
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.370132 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-d9qdf" event={"ID":"4e348d26-340c-4888-9c04-5112f6d56b05","Type":"ContainerStarted","Data":"ac1907b94e2e76ee174afaa313080da606ade50b1e9e4c81db3af0c9699ee519"}
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.409258 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:31 crc kubenswrapper[4843]: E0318 12:13:31.412546 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:31.912530042 +0000 UTC m=+245.628355566 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.424177 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" podStartSLOduration=181.424158027 podStartE2EDuration="3m1.424158027s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:31.42390389 +0000 UTC m=+245.139729424" watchObservedRunningTime="2026-03-18 12:13:31.424158027 +0000 UTC
m=+245.139983551" Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.514129 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:31 crc kubenswrapper[4843]: E0318 12:13:31.516795 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:32.016771316 +0000 UTC m=+245.732596990 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.617690 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:31 crc kubenswrapper[4843]: E0318 12:13:31.626278 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 12:13:32.12625465 +0000 UTC m=+245.842080174 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.718903 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:31 crc kubenswrapper[4843]: E0318 12:13:31.719343 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:32.219326653 +0000 UTC m=+245.935152167 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.738164 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-ppqpl" podStartSLOduration=182.738127295 podStartE2EDuration="3m2.738127295s" podCreationTimestamp="2026-03-18 12:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:31.513165662 +0000 UTC m=+245.228991186" watchObservedRunningTime="2026-03-18 12:13:31.738127295 +0000 UTC m=+245.453952819" Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.742815 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-t28gc"] Mar 18 12:13:31 crc kubenswrapper[4843]: W0318 12:13:31.754787 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59c312b9_d2a3_4404_9ec7_af59c0faf02a.slice/crio-5a90577967ac6fd25a037a932841ca463e5ae76667644aade11c3eeb27408f93 WatchSource:0}: Error finding container 5a90577967ac6fd25a037a932841ca463e5ae76667644aade11c3eeb27408f93: Status 404 returned error can't find the container with id 5a90577967ac6fd25a037a932841ca463e5ae76667644aade11c3eeb27408f93 Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.793995 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-f6q6x"] Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.796896 4843 
patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:13:31 crc kubenswrapper[4843]: [-]has-synced failed: reason withheld Mar 18 12:13:31 crc kubenswrapper[4843]: [+]process-running ok Mar 18 12:13:31 crc kubenswrapper[4843]: healthz check failed Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.796968 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.821847 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:31 crc kubenswrapper[4843]: E0318 12:13:31.822604 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:32.322580129 +0000 UTC m=+246.038405653 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.847799 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pmcbk"] Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.889961 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gf5bj"] Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.892988 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hj6cb"] Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.896539 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zfh7x"] Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.909526 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7wm5b"] Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.909607 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rs252"] Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.924665 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:31 crc kubenswrapper[4843]: E0318 12:13:31.927731 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:32.427704729 +0000 UTC m=+246.143530253 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.942081 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv"] Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.948542 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-whd7l"] Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.953206 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8lfcz"] Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.954122 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw"] Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.956551 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fd878"] Mar 18 12:13:31 crc kubenswrapper[4843]: I0318 12:13:31.958808 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jm8sq"] Mar 18 12:13:31 crc kubenswrapper[4843]: W0318 12:13:31.964060 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cf9261b_564c_45e7_a789_0d31a034ad44.slice/crio-871d7f30019fec957ae3d5e69eb1c47729e4cd60af92cadea152b105c1a08869 WatchSource:0}: Error finding container 871d7f30019fec957ae3d5e69eb1c47729e4cd60af92cadea152b105c1a08869: Status 404 returned error can't find the container with id 871d7f30019fec957ae3d5e69eb1c47729e4cd60af92cadea152b105c1a08869 Mar 18 12:13:31 crc kubenswrapper[4843]: W0318 12:13:31.982565 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf14cbab0_e920_47ad_9a17_7d5699e98eee.slice/crio-5c61d579d30cfdf241252edbba238d3dc3077bda8f0e2734a0c4af9278c4e623 WatchSource:0}: Error finding container 5c61d579d30cfdf241252edbba238d3dc3077bda8f0e2734a0c4af9278c4e623: Status 404 returned error can't find the container with id 5c61d579d30cfdf241252edbba238d3dc3077bda8f0e2734a0c4af9278c4e623 Mar 18 12:13:31 crc kubenswrapper[4843]: W0318 12:13:31.983759 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a0529e7_aab2_41b1_95e1_1cf8154430ca.slice/crio-6e739c778c632ae025dc1786752ee5c91c7d301a560e86a62d33b4f9d90dc005 WatchSource:0}: Error finding container 6e739c778c632ae025dc1786752ee5c91c7d301a560e86a62d33b4f9d90dc005: Status 404 returned error can't find the container with id 6e739c778c632ae025dc1786752ee5c91c7d301a560e86a62d33b4f9d90dc005 Mar 18 12:13:31 crc kubenswrapper[4843]: W0318 12:13:31.987756 4843 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07a3ed5e_9b84_4a42_9f3f_272159105861.slice/crio-5b531fe8c17fdbe231f9e08d42d336033430f073e406d82047803bc5c3852c3a WatchSource:0}: Error finding container 5b531fe8c17fdbe231f9e08d42d336033430f073e406d82047803bc5c3852c3a: Status 404 returned error can't find the container with id 5b531fe8c17fdbe231f9e08d42d336033430f073e406d82047803bc5c3852c3a Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.005755 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8hpvl"] Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.010942 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m7wsv"] Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.013103 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bqfhf"] Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.014678 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kv474"] Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.028762 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:32 crc kubenswrapper[4843]: E0318 12:13:32.029248 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:32.529227395 +0000 UTC m=+246.245053069 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:32 crc kubenswrapper[4843]: W0318 12:13:32.030011 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4489a46_ae0b_4ac9_82c2_24fed2c70f7d.slice/crio-a5d290338382a4805c983e990035eebbcc5d586572eb9679b5713eee89c63d8e WatchSource:0}: Error finding container a5d290338382a4805c983e990035eebbcc5d586572eb9679b5713eee89c63d8e: Status 404 returned error can't find the container with id a5d290338382a4805c983e990035eebbcc5d586572eb9679b5713eee89c63d8e Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.053148 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g7h64"] Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.057912 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hllcc"] Mar 18 12:13:32 crc kubenswrapper[4843]: W0318 12:13:32.063688 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff7eee41_1777_4084_ab10_412fe150b5c9.slice/crio-a8e6da005d858af453e4025efdf7ed6d47ef681af2c540517b061fda5a71bcae WatchSource:0}: Error finding container a8e6da005d858af453e4025efdf7ed6d47ef681af2c540517b061fda5a71bcae: Status 404 returned error can't find the container with id a8e6da005d858af453e4025efdf7ed6d47ef681af2c540517b061fda5a71bcae Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.065453 4843 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx"] Mar 18 12:13:32 crc kubenswrapper[4843]: W0318 12:13:32.075696 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02fe1da6_6389_4153_9405_8a7a5f49fbde.slice/crio-5c7c3846b0ec7777972b0d611b223137a0c2003b67f7e5704dc2510d35b0a1bf WatchSource:0}: Error finding container 5c7c3846b0ec7777972b0d611b223137a0c2003b67f7e5704dc2510d35b0a1bf: Status 404 returned error can't find the container with id 5c7c3846b0ec7777972b0d611b223137a0c2003b67f7e5704dc2510d35b0a1bf Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.098013 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz"] Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.124918 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h"] Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.129172 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:32 crc kubenswrapper[4843]: E0318 12:13:32.129395 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:32.629369561 +0000 UTC m=+246.345195085 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.136715 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4fqc5"] Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.174441 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-sxdqp"] Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.192107 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4xm45"] Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.197270 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563932-bzf8p"] Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.218868 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw"] Mar 18 12:13:32 crc kubenswrapper[4843]: W0318 12:13:32.229085 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod016cbd62_23a6_413f_82b5_b806746e2b01.slice/crio-f0517b9df50f797f9f3a825136f056837b06342aff2b135302e168d4190d6a72 WatchSource:0}: Error finding container f0517b9df50f797f9f3a825136f056837b06342aff2b135302e168d4190d6a72: Status 404 returned error can't find the container with id f0517b9df50f797f9f3a825136f056837b06342aff2b135302e168d4190d6a72 Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.231238 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:32 crc kubenswrapper[4843]: E0318 12:13:32.231619 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:32.731604228 +0000 UTC m=+246.447429752 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.240275 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-spcss"] Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.264962 4843 ???:1] "http: TLS handshake error from 192.168.126.11:40440: no serving certificate available for the kubelet" Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.277311 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-94pzm"] Mar 18 12:13:32 crc kubenswrapper[4843]: W0318 12:13:32.303237 4843 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3fa3986_70fd_4d58_a04f_ddeec535f493.slice/crio-5b3ed7c3e2ba43b3721d83e9a2d9594a0d3fadecd1ca8b8bd6157f8c41d26643 WatchSource:0}: Error finding container 5b3ed7c3e2ba43b3721d83e9a2d9594a0d3fadecd1ca8b8bd6157f8c41d26643: Status 404 returned error can't find the container with id 5b3ed7c3e2ba43b3721d83e9a2d9594a0d3fadecd1ca8b8bd6157f8c41d26643 Mar 18 12:13:32 crc kubenswrapper[4843]: W0318 12:13:32.307928 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf73981c7_0a7b_4a4d_87f8_9a73e4d794d6.slice/crio-4e485f2fe7e7500d4577822931ddb32e5c6d1ce91d7fdc45575123511a089a51 WatchSource:0}: Error finding container 4e485f2fe7e7500d4577822931ddb32e5c6d1ce91d7fdc45575123511a089a51: Status 404 returned error can't find the container with id 4e485f2fe7e7500d4577822931ddb32e5c6d1ce91d7fdc45575123511a089a51 Mar 18 12:13:32 crc kubenswrapper[4843]: W0318 12:13:32.309596 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod803876de_64f6_4347_8ea5_6d2d8f87e828.slice/crio-e3c45275eede20f1b5d8794b8ced35da56466a5763fcd86a7fb97b375ee0e26d WatchSource:0}: Error finding container e3c45275eede20f1b5d8794b8ced35da56466a5763fcd86a7fb97b375ee0e26d: Status 404 returned error can't find the container with id e3c45275eede20f1b5d8794b8ced35da56466a5763fcd86a7fb97b375ee0e26d Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.321312 4843 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.332156 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:32 crc kubenswrapper[4843]: E0318 12:13:32.332516 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:32.832501776 +0000 UTC m=+246.548327300 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.338207 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dzg6s"] Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.366126 4843 ???:1] "http: TLS handshake error from 192.168.126.11:40448: no serving certificate available for the kubelet" Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.385831 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" event={"ID":"d23cfc00-6762-41fb-bf10-e8aa0eda250b","Type":"ContainerStarted","Data":"ccbe2ab738871631d1d9e6345d31623a613815921ac57258333ae6dff7fd4812"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.395025 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx" event={"ID":"b1442992-fe43-43be-a43b-48f80db66418","Type":"ContainerStarted","Data":"a866923e25a96bc5766c265e0ad47ed03548ba10818e5a220304049ff7f0b5d3"} Mar 
18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.398348 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jm8sq" event={"ID":"6093d2c4-d78c-4522-a95a-224064548148","Type":"ContainerStarted","Data":"a641a2a42230e76ac100b7923a11c498ebcfedc611b664d808fddd3895ec8052"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.404561 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563932-bzf8p" event={"ID":"803876de-64f6-4347-8ea5-6d2d8f87e828","Type":"ContainerStarted","Data":"e3c45275eede20f1b5d8794b8ced35da56466a5763fcd86a7fb97b375ee0e26d"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.406452 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t28gc" event={"ID":"59c312b9-d2a3-4404-9ec7-af59c0faf02a","Type":"ContainerStarted","Data":"7e0c9e979124ba8fe0e7eaf618694929d0ee398991682c2bee7f32f780d11e60"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.406476 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t28gc" event={"ID":"59c312b9-d2a3-4404-9ec7-af59c0faf02a","Type":"ContainerStarted","Data":"5a90577967ac6fd25a037a932841ca463e5ae76667644aade11c3eeb27408f93"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.412086 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rs252" event={"ID":"07a3ed5e-9b84-4a42-9f3f-272159105861","Type":"ContainerStarted","Data":"5b531fe8c17fdbe231f9e08d42d336033430f073e406d82047803bc5c3852c3a"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.413873 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv" 
event={"ID":"4cf9261b-564c-45e7-a789-0d31a034ad44","Type":"ContainerStarted","Data":"871d7f30019fec957ae3d5e69eb1c47729e4cd60af92cadea152b105c1a08869"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.417692 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pmcbk" event={"ID":"5ead3e48-f240-4392-951f-0faca5ec0a8c","Type":"ContainerStarted","Data":"5d2d40b6402c39a3ac29585b24837f0b4939064872dd07384641daba6188856d"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.425142 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-f6q6x" event={"ID":"cd0cd970-7aa7-4c43-a967-513cd197dc38","Type":"ContainerStarted","Data":"87a09b010a64f47592c3b0b2401fabed8cebb5961837eab18ecadabda6a940e9"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.428528 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8hpvl" event={"ID":"5c4c78e8-738c-4105-8b56-9f8f900b496e","Type":"ContainerStarted","Data":"dc3c1093f84cca22d35aeca314ee41886dfa6a8f2a31528fbecb41e39c2b35c7"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.429596 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-whd7l" event={"ID":"9a0529e7-aab2-41b1-95e1-1cf8154430ca","Type":"ContainerStarted","Data":"6e739c778c632ae025dc1786752ee5c91c7d301a560e86a62d33b4f9d90dc005"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.430391 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8lfcz" event={"ID":"dc56f3ce-caf3-4d53-9cf3-d909ec3edd16","Type":"ContainerStarted","Data":"9bef7cdb2855c2b3ac28034b6b4b2c051937952240ab5230ba58829c4eada544"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.430915 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spcss" event={"ID":"63edbbe4-08b8-4ea9-987c-83ebca62fc5e","Type":"ContainerStarted","Data":"0951797304f4b5e3aba0f97941bb2463771fba30286e5a692c5ceef4089ca9ce"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.433236 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-d9qdf" event={"ID":"4e348d26-340c-4888-9c04-5112f6d56b05","Type":"ContainerStarted","Data":"8412d4af1b322a9fd4d46299fab67ddde9e86bfd604001bae1d8cfabf358c668"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.436895 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-d9qdf" Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.438463 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:32 crc kubenswrapper[4843]: E0318 12:13:32.438788 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:32.938773439 +0000 UTC m=+246.654598963 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.438923 4843 patch_prober.go:28] interesting pod/console-operator-58897d9998-d9qdf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.438995 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-d9qdf" podUID="4e348d26-340c-4888-9c04-5112f6d56b05" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.457360 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-d9qdf" podStartSLOduration=183.457339744 podStartE2EDuration="3m3.457339744s" podCreationTimestamp="2026-03-18 12:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:32.457122027 +0000 UTC m=+246.172947561" watchObservedRunningTime="2026-03-18 12:13:32.457339744 +0000 UTC m=+246.173165268" Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.461845 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz" 
event={"ID":"9a1ef083-bcc3-48fc-9d9d-41008d4673b0","Type":"ContainerStarted","Data":"840ea074879d2239f9914a7c93c6e93e7f876a38aea2acba9fd430dbb8a729c9"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.473628 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bqfhf" event={"ID":"ff7eee41-1777-4084-ab10-412fe150b5c9","Type":"ContainerStarted","Data":"a8e6da005d858af453e4025efdf7ed6d47ef681af2c540517b061fda5a71bcae"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.476016 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw" event={"ID":"92c985ef-3317-4220-819d-f482f8b50d60","Type":"ContainerStarted","Data":"42d3a3aeb0301559050da2b5c1a42a33256cdda2053b3c944962c000066523c5"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.477272 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4xm45" event={"ID":"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6","Type":"ContainerStarted","Data":"4e485f2fe7e7500d4577822931ddb32e5c6d1ce91d7fdc45575123511a089a51"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.477932 4843 ???:1] "http: TLS handshake error from 192.168.126.11:40454: no serving certificate available for the kubelet" Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.478640 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw" event={"ID":"f14cbab0-e920-47ad-9a17-7d5699e98eee","Type":"ContainerStarted","Data":"5c61d579d30cfdf241252edbba238d3dc3077bda8f0e2734a0c4af9278c4e623"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.497160 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" 
event={"ID":"82377b06-486d-46d6-b28a-0df0d6c86531","Type":"ContainerStarted","Data":"1ff10dcfb694c02e05d80b6094610879b4252fd2d2097233766272f91ca3514a"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.503950 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" event={"ID":"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb","Type":"ContainerStarted","Data":"30f85f70ee90eceec51db2be8ac298cb0c14942cde075d8e678c24b5149ff351"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.530471 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" podStartSLOduration=182.530449301 podStartE2EDuration="3m2.530449301s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:32.528721931 +0000 UTC m=+246.244547455" watchObservedRunningTime="2026-03-18 12:13:32.530449301 +0000 UTC m=+246.246274835" Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.537550 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b" event={"ID":"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d","Type":"ContainerStarted","Data":"a5d290338382a4805c983e990035eebbcc5d586572eb9679b5713eee89c63d8e"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.543768 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:32 crc kubenswrapper[4843]: E0318 12:13:32.543914 4843 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:33.043880438 +0000 UTC m=+246.759705962 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.546093 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:32 crc kubenswrapper[4843]: E0318 12:13:32.546527 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:33.046510864 +0000 UTC m=+246.762336388 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.551683 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5" event={"ID":"016cbd62-23a6-413f-82b5-b806746e2b01","Type":"ContainerStarted","Data":"f0517b9df50f797f9f3a825136f056837b06342aff2b135302e168d4190d6a72"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.564591 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hj6cb" event={"ID":"7ab6a392-4347-4101-88e2-00ec7b9aecf5","Type":"ContainerStarted","Data":"d9e5885d145431afa6403c2e8dbd2a3cbb8612e59c2479caaccc5638c517ee90"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.570143 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fd878" event={"ID":"bacfe411-3a90-4d97-80d5-c39a102cdd9b","Type":"ContainerStarted","Data":"c7d1c777f7324231c24cccbd805cfff7ddf129f56ede768919f20db1fa8ee62c"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.571103 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" podStartSLOduration=183.571078802 podStartE2EDuration="3m3.571078802s" podCreationTimestamp="2026-03-18 12:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:32.561429274 +0000 UTC m=+246.277254788" 
watchObservedRunningTime="2026-03-18 12:13:32.571078802 +0000 UTC m=+246.286904326" Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.572165 4843 ???:1] "http: TLS handshake error from 192.168.126.11:40458: no serving certificate available for the kubelet" Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.572864 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g7h64" event={"ID":"af6b48ca-8e58-4194-aa93-59825a221fbc","Type":"ContainerStarted","Data":"07acc20864c34eeec0d3e19079ae5e97b38676758818218039824a95ceb37a10"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.574811 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gf5bj" event={"ID":"22656451-217a-4227-becf-75f7ae30423b","Type":"ContainerStarted","Data":"9b854a56018a083b79637dc48fde38b6de0ddc2cbeafc72c32407c79647660d2"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.578915 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m7wsv" event={"ID":"772ec8a8-abe6-43b5-9239-2331a455b602","Type":"ContainerStarted","Data":"2374e1561eca12e0736c048871944d9d394b0cbfdc0b504c90e3361e73bf6e1e"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.581258 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sxdqp" event={"ID":"e3fa3986-70fd-4d58-a04f-ddeec535f493","Type":"ContainerStarted","Data":"5b3ed7c3e2ba43b3721d83e9a2d9594a0d3fadecd1ca8b8bd6157f8c41d26643"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.582433 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h" event={"ID":"8dec33ef-2627-4878-b7b6-f980772125b8","Type":"ContainerStarted","Data":"89dee36c1ce077a96567581e061d3f60b71b5cb372fc5ee0be4cb32019fc2dca"} Mar 18 12:13:32 crc 
kubenswrapper[4843]: I0318 12:13:32.588895 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474" event={"ID":"02fe1da6-6389-4153-9405-8a7a5f49fbde","Type":"ContainerStarted","Data":"5c7c3846b0ec7777972b0d611b223137a0c2003b67f7e5704dc2510d35b0a1bf"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.600707 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv7v7" event={"ID":"3f212ced-b4c9-45f4-9c40-2d03ce0bd4b2","Type":"ContainerStarted","Data":"6d3f26efc1bf65861fcbd3cb639e0674471c525e79c4d8bf1b20a543c09400e2"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.600754 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv7v7" event={"ID":"3f212ced-b4c9-45f4-9c40-2d03ce0bd4b2","Type":"ContainerStarted","Data":"2de610718c291ce0f3cfd050dafece6c06114170054b1f3a95d4d7e906547e60"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.600765 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv7v7" event={"ID":"3f212ced-b4c9-45f4-9c40-2d03ce0bd4b2","Type":"ContainerStarted","Data":"e0132386ef8da896d1009d8f0eec183501946eb02318d95e8af463c7141355ab"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.612738 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zfh7x" event={"ID":"c2781cd8-ccc6-4c7e-8c88-11788a29888d","Type":"ContainerStarted","Data":"835fa99052d3bb3ef2fe9f64bea84d9299c47bdb47490df7f984ebc6f610b1f2"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.617235 4843 generic.go:334] "Generic (PLEG): container finished" podID="68d7cbd5-5cc5-4648-b94f-f256e12ae7d3" 
containerID="cc94e720b40af8b582d110f33bffcbafbbf9f511922d62a7e73309a6bba46961" exitCode=0 Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.619541 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87" event={"ID":"68d7cbd5-5cc5-4648-b94f-f256e12ae7d3","Type":"ContainerDied","Data":"cc94e720b40af8b582d110f33bffcbafbbf9f511922d62a7e73309a6bba46961"} Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.625444 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.639786 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vv7v7" podStartSLOduration=183.639756241 podStartE2EDuration="3m3.639756241s" podCreationTimestamp="2026-03-18 12:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:32.62998368 +0000 UTC m=+246.345809224" watchObservedRunningTime="2026-03-18 12:13:32.639756241 +0000 UTC m=+246.355581765" Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.651734 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:32 crc kubenswrapper[4843]: E0318 12:13:32.653625 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 12:13:33.15358332 +0000 UTC m=+246.869408844 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.683538 4843 ???:1] "http: TLS handshake error from 192.168.126.11:40466: no serving certificate available for the kubelet" Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.756091 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:32 crc kubenswrapper[4843]: E0318 12:13:32.756852 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:33.256840386 +0000 UTC m=+246.972665900 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.773429 4843 ???:1] "http: TLS handshake error from 192.168.126.11:40482: no serving certificate available for the kubelet" Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.791829 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:13:32 crc kubenswrapper[4843]: [-]has-synced failed: reason withheld Mar 18 12:13:32 crc kubenswrapper[4843]: [+]process-running ok Mar 18 12:13:32 crc kubenswrapper[4843]: healthz check failed Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.791886 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.796447 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.861280 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:32 crc kubenswrapper[4843]: E0318 12:13:32.861874 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:33.361850402 +0000 UTC m=+247.077675926 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.880174 4843 ???:1] "http: TLS handshake error from 192.168.126.11:40484: no serving certificate available for the kubelet" Mar 18 12:13:32 crc kubenswrapper[4843]: I0318 12:13:32.966903 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:32 crc kubenswrapper[4843]: E0318 12:13:32.967880 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:33.467862658 +0000 UTC m=+247.183688182 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:33 crc kubenswrapper[4843]: I0318 12:13:33.024360 4843 ???:1] "http: TLS handshake error from 192.168.126.11:40496: no serving certificate available for the kubelet" Mar 18 12:13:33 crc kubenswrapper[4843]: I0318 12:13:33.068309 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:33 crc kubenswrapper[4843]: E0318 12:13:33.068771 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:33.568756306 +0000 UTC m=+247.284581830 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:33 crc kubenswrapper[4843]: I0318 12:13:33.170625 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:33 crc kubenswrapper[4843]: E0318 12:13:33.171055 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:33.671021223 +0000 UTC m=+247.386846747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:33 crc kubenswrapper[4843]: I0318 12:13:33.281191 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:33 crc kubenswrapper[4843]: E0318 12:13:33.281506 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:33.781489027 +0000 UTC m=+247.497314551 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:33 crc kubenswrapper[4843]: I0318 12:13:33.390771 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:33 crc kubenswrapper[4843]: E0318 12:13:33.391237 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:33.89122151 +0000 UTC m=+247.607047034 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:33 crc kubenswrapper[4843]: I0318 12:13:33.501155 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:33 crc kubenswrapper[4843]: E0318 12:13:33.501464 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:34.001448876 +0000 UTC m=+247.717274400 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:33 crc kubenswrapper[4843]: I0318 12:13:33.696433 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:33 crc kubenswrapper[4843]: E0318 12:13:33.697003 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:34.196751035 +0000 UTC m=+247.912576559 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:33 crc kubenswrapper[4843]: I0318 12:13:33.878094 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:33 crc kubenswrapper[4843]: E0318 12:13:33.879083 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:34.379053009 +0000 UTC m=+248.094878623 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:33 crc kubenswrapper[4843]: I0318 12:13:33.880364 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 12:13:33 crc kubenswrapper[4843]: [-]has-synced failed: reason withheld
Mar 18 12:13:33 crc kubenswrapper[4843]: [+]process-running ok
Mar 18 12:13:33 crc kubenswrapper[4843]: healthz check failed
Mar 18 12:13:33 crc kubenswrapper[4843]: I0318 12:13:33.880402 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 12:13:33 crc kubenswrapper[4843]: I0318 12:13:33.912637 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-f6q6x" event={"ID":"cd0cd970-7aa7-4c43-a967-513cd197dc38","Type":"ContainerStarted","Data":"cea0b58e24e560a0db0add9ae348d8da6935908424a46a73e173031cd5d8091d"}
Mar 18 12:13:33 crc kubenswrapper[4843]: I0318 12:13:33.980064 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:33 crc kubenswrapper[4843]: E0318 12:13:33.980436 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:34.480424161 +0000 UTC m=+248.196249685 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.081437 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:34 crc kubenswrapper[4843]: E0318 12:13:34.081899 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:34.581882675 +0000 UTC m=+248.297708199 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.111521 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv" event={"ID":"4cf9261b-564c-45e7-a789-0d31a034ad44","Type":"ContainerStarted","Data":"8c4211955d909718157f7cdefec4e505bd1d0a09e3a6bfdcaa7ae9be4015cacc"}
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.112797 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv"
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.121810 4843 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xp4kv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body=
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.121864 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv" podUID="4cf9261b-564c-45e7-a789-0d31a034ad44" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused"
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.188135 4843 ???:1] "http: TLS handshake error from 192.168.126.11:40508: no serving certificate available for the kubelet"
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.188840 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:34 crc kubenswrapper[4843]: E0318 12:13:34.189813 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:34.689798526 +0000 UTC m=+248.405624050 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.192042 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hj6cb" event={"ID":"7ab6a392-4347-4101-88e2-00ec7b9aecf5","Type":"ContainerStarted","Data":"6bc53eecaf599b4122ccb9b8f641a5d67a26032d73202c944793cf2e8ae4a046"}
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.316177 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-f6q6x" podStartSLOduration=184.316159168 podStartE2EDuration="3m4.316159168s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:34.111849349 +0000 UTC m=+247.827674873" watchObservedRunningTime="2026-03-18 12:13:34.316159168 +0000 UTC m=+248.031984692"
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.318605 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:34 crc kubenswrapper[4843]: E0318 12:13:34.319478 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:34.819463803 +0000 UTC m=+248.535289327 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.341836 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-whd7l" event={"ID":"9a0529e7-aab2-41b1-95e1-1cf8154430ca","Type":"ContainerStarted","Data":"92d98d4eb270d438504669480b29d7efeaf0f65d05d4b5a6cd30912847e30f8a"}
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.350078 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-94pzm" event={"ID":"2a181e9f-1fda-43f1-a42e-c64602afbcd2","Type":"ContainerStarted","Data":"7df6b302428e31635a904bc137a2ae5cd2eb827f0f16f151408bd267b2da9510"}
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.351303 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pmcbk" event={"ID":"5ead3e48-f240-4392-951f-0faca5ec0a8c","Type":"ContainerStarted","Data":"fbcfc2d8961e5790263cf8f562522988ac4153b19e6268eef845d6b86e9fcf14"}
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.352095 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dzg6s" event={"ID":"fe3bf0b1-4566-4a39-8629-d4d268bd5977","Type":"ContainerStarted","Data":"5e2382978b7c6e97610293135ef634486ac0698206e6ad8dc3ed17a9edc9496b"}
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.353225 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rs252" event={"ID":"07a3ed5e-9b84-4a42-9f3f-272159105861","Type":"ContainerStarted","Data":"cf5516d62152c3624f8d94f11e8f91a19c7cf6e631fc95d0130df759e199fc8b"}
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.354476 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fd878" event={"ID":"bacfe411-3a90-4d97-80d5-c39a102cdd9b","Type":"ContainerStarted","Data":"04f5b75ec328e04058635c892c9399b650ee8a94aaf128a497b986ca7b08040c"}
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.358186 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t28gc" event={"ID":"59c312b9-d2a3-4404-9ec7-af59c0faf02a","Type":"ContainerStarted","Data":"f6d74f35870035e343f28ca2b4408232dbb739603ae84f9a680963227146a664"}
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.360357 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw" event={"ID":"f14cbab0-e920-47ad-9a17-7d5699e98eee","Type":"ContainerStarted","Data":"83964f38bcdddb7990bbac77a3ffb143ef8213dff2cead4de923e791d2233860"}
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.362250 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8lfcz" event={"ID":"dc56f3ce-caf3-4d53-9cf3-d909ec3edd16","Type":"ContainerStarted","Data":"747400a427647f29da7773f37c6dae5b34a9e4b2b0e7d45da2591f2ec708955a"}
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.362989 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8lfcz"
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.364010 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body=
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.364054 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused"
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.393276 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" event={"ID":"e8bd6721-4cfb-4143-94f5-ad1f9fb985fb","Type":"ContainerStarted","Data":"b78ef0f77f7b7a3fff14d0ae42af84080774bd6cdaa2b5773d9591d75012fdb8"}
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.395020 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv" podStartSLOduration=184.39500157 podStartE2EDuration="3m4.39500157s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:34.317400123 +0000 UTC m=+248.033225647" watchObservedRunningTime="2026-03-18 12:13:34.39500157 +0000 UTC m=+248.110827084"
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.416051 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zfh7x" event={"ID":"c2781cd8-ccc6-4c7e-8c88-11788a29888d","Type":"ContainerStarted","Data":"c85ce342792fe7753305133539802e262e57422d4acd25b8c7727bbadf010ef8"}
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.444345 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5fzqw" podStartSLOduration=184.444322671 podStartE2EDuration="3m4.444322671s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:34.443080286 +0000 UTC m=+248.158905830" watchObservedRunningTime="2026-03-18 12:13:34.444322671 +0000 UTC m=+248.160148195"
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.444760 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.446166 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-whd7l" podStartSLOduration=184.446153904 podStartE2EDuration="3m4.446153904s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:34.394965309 +0000 UTC m=+248.110790843" watchObservedRunningTime="2026-03-18 12:13:34.446153904 +0000 UTC m=+248.161979428"
Mar 18 12:13:34 crc kubenswrapper[4843]: E0318 12:13:34.451465 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:34.951450326 +0000 UTC m=+248.667275850 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.465725 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gf5bj" event={"ID":"22656451-217a-4227-becf-75f7ae30423b","Type":"ContainerStarted","Data":"332745d36507f83beb0eacdeb1c31de0a00f97a16971c4e32e5db4c09da60c48"}
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.481136 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m7wsv" event={"ID":"772ec8a8-abe6-43b5-9239-2331a455b602","Type":"ContainerStarted","Data":"51aa7740b298e1e83a9dd2a2b0adef6bc61eb5a41581af585392ce834ef35515"}
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.527671 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t28gc" podStartSLOduration=184.527642093 podStartE2EDuration="3m4.527642093s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:34.526082148 +0000 UTC m=+248.241907672" watchObservedRunningTime="2026-03-18 12:13:34.527642093 +0000 UTC m=+248.243467617"
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.547794 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:34 crc kubenswrapper[4843]: E0318 12:13:34.549044 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:35.049011739 +0000 UTC m=+248.764837263 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.600522 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fd878" podStartSLOduration=184.600498503 podStartE2EDuration="3m4.600498503s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:34.591074531 +0000 UTC m=+248.306900055" watchObservedRunningTime="2026-03-18 12:13:34.600498503 +0000 UTC m=+248.316324037"
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.602092 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-8lfcz" podStartSLOduration=185.602084818 podStartE2EDuration="3m5.602084818s" podCreationTimestamp="2026-03-18 12:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:34.567211143 +0000 UTC m=+248.283036667" watchObservedRunningTime="2026-03-18 12:13:34.602084818 +0000 UTC m=+248.317910342"
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.651436 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:34 crc kubenswrapper[4843]: E0318 12:13:34.661146 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:35.16112792 +0000 UTC m=+248.876953454 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.695213 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gf5bj" podStartSLOduration=184.695196812 podStartE2EDuration="3m4.695196812s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:34.634665347 +0000 UTC m=+248.350490871" watchObservedRunningTime="2026-03-18 12:13:34.695196812 +0000 UTC m=+248.411022326"
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.763815 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:34 crc kubenswrapper[4843]: E0318 12:13:34.764095 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:35.264064977 +0000 UTC m=+248.979890501 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.764314 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:34 crc kubenswrapper[4843]: E0318 12:13:34.764788 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:35.264780467 +0000 UTC m=+248.980605981 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.807479 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 12:13:34 crc kubenswrapper[4843]: [-]has-synced failed: reason withheld
Mar 18 12:13:34 crc kubenswrapper[4843]: [+]process-running ok
Mar 18 12:13:34 crc kubenswrapper[4843]: healthz check failed
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.807532 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.884219 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:34 crc kubenswrapper[4843]: E0318 12:13:34.884885 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:35.384856878 +0000 UTC m=+249.100682402 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:34 crc kubenswrapper[4843]: I0318 12:13:34.990644 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:34 crc kubenswrapper[4843]: E0318 12:13:34.991021 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:35.491008048 +0000 UTC m=+249.206833572 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.013306 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-m7wsv" podStartSLOduration=12.01328793 podStartE2EDuration="12.01328793s" podCreationTimestamp="2026-03-18 12:13:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:35.011428696 +0000 UTC m=+248.727254220" watchObservedRunningTime="2026-03-18 12:13:35.01328793 +0000 UTC m=+248.729113454"
Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.014228 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zfh7x" podStartSLOduration=185.014222257 podStartE2EDuration="3m5.014222257s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:34.749928979 +0000 UTC m=+248.465754523" watchObservedRunningTime="2026-03-18 12:13:35.014222257 +0000 UTC m=+248.730047781"
Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.093449 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:35 crc kubenswrapper[4843]: E0318 12:13:35.093918 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:35.593895093 +0000 UTC m=+249.309720617 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.195362 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:35 crc kubenswrapper[4843]: E0318 12:13:35.196065 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:35.696042496 +0000 UTC m=+249.411868020 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.297239 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:35 crc kubenswrapper[4843]: E0318 12:13:35.297626 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:35.797609353 +0000 UTC m=+249.513434877 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.399515 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:13:35 crc kubenswrapper[4843]: E0318 12:13:35.399863 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:35.89985214 +0000 UTC m=+249.615677664 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.482748 4843 patch_prober.go:28] interesting pod/console-operator-58897d9998-d9qdf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.482814 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-d9qdf" podUID="4e348d26-340c-4888-9c04-5112f6d56b05" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.492605 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8hpvl" event={"ID":"5c4c78e8-738c-4105-8b56-9f8f900b496e","Type":"ContainerStarted","Data":"d8f39a9bd4457761538cf8585cf23efefac2fab3ad1df5a9833d1cbb459fd0b5"}
Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.493671 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bqfhf" event={"ID":"ff7eee41-1777-4084-ab10-412fe150b5c9","Type":"ContainerStarted","Data":"455cb9f92fb3e18dc0402207e3ff889f8c1764b3c1dddf9d87b0b989a4461e9b"}
Mar 18 12:13:35 crc
kubenswrapper[4843]: I0318 12:13:35.500116 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:35 crc kubenswrapper[4843]: E0318 12:13:35.500493 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:36.00047601 +0000 UTC m=+249.716301534 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.507169 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sxdqp" event={"ID":"e3fa3986-70fd-4d58-a04f-ddeec535f493","Type":"ContainerStarted","Data":"0bfc6fb15498405760642e76bc3db575fbb2edd2d203f1b29a3b3760dfe63f98"} Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.521413 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" event={"ID":"d23cfc00-6762-41fb-bf10-e8aa0eda250b","Type":"ContainerStarted","Data":"247bc2a7727d0ad2ea7fc84497909b28d97d02d823d34e3a7070a26ad920e270"} Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.522433 4843 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bqfhf" podStartSLOduration=185.522413752 podStartE2EDuration="3m5.522413752s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:35.521041353 +0000 UTC m=+249.236866887" watchObservedRunningTime="2026-03-18 12:13:35.522413752 +0000 UTC m=+249.238239276" Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.522985 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.524225 4843 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-hllcc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" start-of-body= Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.524262 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" podUID="d23cfc00-6762-41fb-bf10-e8aa0eda250b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.530191 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx" event={"ID":"b1442992-fe43-43be-a43b-48f80db66418","Type":"ContainerStarted","Data":"bcdf1634e918c197bd848d59680a4a1bd7ebec66d9798689e2470cd359e4a791"} Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.554489 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dzg6s" event={"ID":"fe3bf0b1-4566-4a39-8629-d4d268bd5977","Type":"ContainerStarted","Data":"165ba529081c57e78221255787a9d9b6a56212c2e3cfe751ce4f6e79c89491cb"} Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.556462 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jm8sq" event={"ID":"6093d2c4-d78c-4522-a95a-224064548148","Type":"ContainerStarted","Data":"7d8ecaf1ef9f452caf308f4f2452a53123889664455419ee9bfe9f0e2081f43e"} Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.558999 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5" event={"ID":"016cbd62-23a6-413f-82b5-b806746e2b01","Type":"ContainerStarted","Data":"dcb1aacff14f14eda524bbbce7c735928a7013c6a883c2246818cf8b8b346e99"} Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.560549 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5" Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.560622 4843 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4fqc5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.560666 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5" podUID="016cbd62-23a6-413f-82b5-b806746e2b01" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.561898 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g7h64" event={"ID":"af6b48ca-8e58-4194-aa93-59825a221fbc","Type":"ContainerStarted","Data":"4e9e3ed8acb7e0d3bb4560caf00729640b70b7e79282dd0d29f982fc1e679dbb"} Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.563518 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474" event={"ID":"02fe1da6-6389-4153-9405-8a7a5f49fbde","Type":"ContainerStarted","Data":"7ce5124902e9c0bd2695d073598155fb06ba9e8ceffe934b1fb2d312378fa56f"} Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.565845 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b" event={"ID":"a4489a46-ae0b-4ac9-82c2-24fed2c70f7d","Type":"ContainerStarted","Data":"567e02b122d014a7425229ecae013eac3adf0bf773cfdc95f014904c1ad4c3b1"} Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.571411 4843 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xp4kv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.571446 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv" podUID="4cf9261b-564c-45e7-a789-0d31a034ad44" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.571508 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= 
Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.571561 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.603456 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:35 crc kubenswrapper[4843]: E0318 12:13:35.618189 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:36.118161972 +0000 UTC m=+249.833987506 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.621075 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-d9qdf" Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.669165 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-g7h64" podStartSLOduration=186.66914272100001 podStartE2EDuration="3m6.669142721s" podCreationTimestamp="2026-03-18 12:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:35.63475741 +0000 UTC m=+249.350582934" watchObservedRunningTime="2026-03-18 12:13:35.669142721 +0000 UTC m=+249.384968235" Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.669356 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" podStartSLOduration=186.669348027 podStartE2EDuration="3m6.669348027s" podCreationTimestamp="2026-03-18 12:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:35.598791503 +0000 UTC m=+249.314617027" watchObservedRunningTime="2026-03-18 12:13:35.669348027 +0000 UTC m=+249.385173551" Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.705227 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:35 crc kubenswrapper[4843]: E0318 12:13:35.707755 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:36.207739453 +0000 UTC m=+249.923564977 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.824340 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.824390 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:13:35 crc kubenswrapper[4843]: I0318 12:13:35.827092 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:13:35 crc kubenswrapper[4843]: [-]has-synced failed: reason withheld Mar 18 12:13:35 crc kubenswrapper[4843]: [+]process-running ok Mar 18 12:13:35 crc kubenswrapper[4843]: healthz check failed Mar 18 
12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.047589 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:36 crc kubenswrapper[4843]: E0318 12:13:36.048302 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:36.548284688 +0000 UTC m=+250.264110212 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.048332 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.049251 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.049367 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:36 crc 
kubenswrapper[4843]: I0318 12:13:36.050913 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-7wm5b" podStartSLOduration=186.050880983 podStartE2EDuration="3m6.050880983s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:35.823979414 +0000 UTC m=+249.539804948" watchObservedRunningTime="2026-03-18 12:13:36.050880983 +0000 UTC m=+249.766706507" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.052263 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.052697 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.055773 4843 patch_prober.go:28] interesting pod/apiserver-76f77b778f-hqt5c container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.055812 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" podUID="e8bd6721-4cfb-4143-94f5-ad1f9fb985fb" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.056583 4843 patch_prober.go:28] interesting pod/console-f9d7485db-ppqpl container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 18 12:13:36 
crc kubenswrapper[4843]: I0318 12:13:36.056606 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ppqpl" podUID="b999a5c0-f4e8-499b-8f81-283c3a2cf495" containerName="console" probeResult="failure" output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.057730 4843 ???:1] "http: TLS handshake error from 192.168.126.11:40514: no serving certificate available for the kubelet" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.105513 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.105521 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.105560 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.105582 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.129219 4843 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.148617 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx" podStartSLOduration=187.14860075 podStartE2EDuration="3m7.14860075s" podCreationTimestamp="2026-03-18 12:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:36.147870429 +0000 UTC m=+249.863695953" watchObservedRunningTime="2026-03-18 12:13:36.14860075 +0000 UTC m=+249.864426274" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.154725 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:36 crc kubenswrapper[4843]: E0318 12:13:36.156912 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:36.656893728 +0000 UTC m=+250.372719252 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.262174 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:36 crc kubenswrapper[4843]: E0318 12:13:36.262560 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:36.762546684 +0000 UTC m=+250.478372208 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.269338 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5" podStartSLOduration=186.269321329 podStartE2EDuration="3m6.269321329s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:36.265991753 +0000 UTC m=+249.981817277" watchObservedRunningTime="2026-03-18 12:13:36.269321329 +0000 UTC m=+249.985146853" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.451744 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:36 crc kubenswrapper[4843]: E0318 12:13:36.452407 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:36.952390545 +0000 UTC m=+250.668216069 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.464853 4843 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-hllcc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" start-of-body= Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.464916 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" podUID="d23cfc00-6762-41fb-bf10-e8aa0eda250b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.466411 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-kv474" podStartSLOduration=187.466392679 podStartE2EDuration="3m7.466392679s" podCreationTimestamp="2026-03-18 12:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:36.465337258 +0000 UTC m=+250.181164872" watchObservedRunningTime="2026-03-18 12:13:36.466392679 +0000 UTC m=+250.182218223" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.570665 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:36 crc kubenswrapper[4843]: E0318 12:13:36.571055 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:37.071043385 +0000 UTC m=+250.786868909 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.594748 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jm8sq" podStartSLOduration=186.594726897 podStartE2EDuration="3m6.594726897s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:36.513921209 +0000 UTC m=+250.229746733" watchObservedRunningTime="2026-03-18 12:13:36.594726897 +0000 UTC m=+250.310552421" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.642065 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spcss" 
event={"ID":"63edbbe4-08b8-4ea9-987c-83ebca62fc5e","Type":"ContainerStarted","Data":"6ea07869ae0854f6c285c29c568b6d42029872990385ceef10b71d5796a27f5b"} Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.646561 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rs252" event={"ID":"07a3ed5e-9b84-4a42-9f3f-272159105861","Type":"ContainerStarted","Data":"96ed5fa6ae765350942cbcc00aae4abdd394cb60f331b18807ca1f9c09a17e74"} Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.676564 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:36 crc kubenswrapper[4843]: E0318 12:13:36.678211 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:37.178168412 +0000 UTC m=+250.893994106 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.692995 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4xm45" event={"ID":"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6","Type":"ContainerStarted","Data":"10e34b59ea8ee090b2f4e36c1884ccc11cdb9008621eda16e88d50baf9be8247"} Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.703493 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-rs252" podStartSLOduration=186.703459961 podStartE2EDuration="3m6.703459961s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:36.692832385 +0000 UTC m=+250.408657909" watchObservedRunningTime="2026-03-18 12:13:36.703459961 +0000 UTC m=+250.419285485" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.703927 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw" event={"ID":"92c985ef-3317-4220-819d-f482f8b50d60","Type":"ContainerStarted","Data":"a14ef3380ca71e4473475b0d9ef2e8616c006581b82f3d6ff69a765344b17185"} Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.705987 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.720989 4843 patch_prober.go:28] 
interesting pod/packageserver-d55dfcdfc-kg7xw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" start-of-body= Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.721084 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-94pzm" event={"ID":"2a181e9f-1fda-43f1-a42e-c64602afbcd2","Type":"ContainerStarted","Data":"1769b580ce9c00ecc9600c6414b8bb61bf0015ab3130727fa3cd6b82dddaef0a"} Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.721109 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw" podUID="92c985ef-3317-4220-819d-f482f8b50d60" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.770934 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hj6cb" event={"ID":"7ab6a392-4347-4101-88e2-00ec7b9aecf5","Type":"ContainerStarted","Data":"4468f6e56c66a96edcacc38ee49e6864466567ac04c7738fa58851b8f2cab1f6"} Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.777060 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pmcbk" event={"ID":"5ead3e48-f240-4392-951f-0faca5ec0a8c","Type":"ContainerStarted","Data":"1edeefd79a445a9d16a999540ebe4bef85d0de9cd015c21418f700ae849a232e"} Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.778001 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:36 crc kubenswrapper[4843]: E0318 12:13:36.778459 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:37.278437592 +0000 UTC m=+250.994263286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.801088 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-csz7z" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.812932 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:13:36 crc kubenswrapper[4843]: [-]has-synced failed: reason withheld Mar 18 12:13:36 crc kubenswrapper[4843]: [+]process-running ok Mar 18 12:13:36 crc kubenswrapper[4843]: healthz check failed Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.812993 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:13:36 
crc kubenswrapper[4843]: I0318 12:13:36.857363 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz" event={"ID":"9a1ef083-bcc3-48fc-9d9d-41008d4673b0","Type":"ContainerStarted","Data":"c9ce1f4bd53708b79c5202e592a2c7fad967de42110ea7afe649da7b53ba8e4d"} Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.858378 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.863397 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8hpvl" event={"ID":"5c4c78e8-738c-4105-8b56-9f8f900b496e","Type":"ContainerStarted","Data":"24a1680f130d80ae7e874bec4cd3588bf5d405f44091059e1cdb7ef0e85d556e"} Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.864059 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-8hpvl" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.865917 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h" event={"ID":"8dec33ef-2627-4878-b7b6-f980772125b8","Type":"ContainerStarted","Data":"38056d93da3cc4c7689f59e81403e5b4646523f2fb3ac945ae3c3b25266300e9"} Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.873130 4843 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xp4kv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.873168 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv" podUID="4cf9261b-564c-45e7-a789-0d31a034ad44" containerName="olm-operator" 
probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.873257 4843 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-hllcc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" start-of-body= Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.873271 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" podUID="d23cfc00-6762-41fb-bf10-e8aa0eda250b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.873463 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.873478 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.873531 4843 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4fqc5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.873542 4843 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5" podUID="016cbd62-23a6-413f-82b5-b806746e2b01" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.879545 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.885312 4843 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-brsfz container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.885386 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz" podUID="9a1ef083-bcc3-48fc-9d9d-41008d4673b0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4843]: E0318 12:13:36.885557 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:37.385533669 +0000 UTC m=+251.101359193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.905862 4843 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4fqc5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.905912 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5" podUID="016cbd62-23a6-413f-82b5-b806746e2b01" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.905964 4843 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4fqc5 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.905976 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5" podUID="016cbd62-23a6-413f-82b5-b806746e2b01" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 
12:13:36.922250 4843 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xp4kv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.922311 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv" podUID="4cf9261b-564c-45e7-a789-0d31a034ad44" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.922447 4843 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xp4kv container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.922485 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv" podUID="4cf9261b-564c-45e7-a789-0d31a034ad44" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.924885 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:36 crc kubenswrapper[4843]: E0318 12:13:36.928860 4843 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:37.428842857 +0000 UTC m=+251.144668381 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.968739 4843 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-brsfz container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.968805 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz" podUID="9a1ef083-bcc3-48fc-9d9d-41008d4673b0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.968808 4843 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-kg7xw container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" start-of-body= Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.968870 4843 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw" podUID="92c985ef-3317-4220-819d-f482f8b50d60" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.968885 4843 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-kg7xw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" start-of-body= Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.968939 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw" podUID="92c985ef-3317-4220-819d-f482f8b50d60" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.968988 4843 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-brsfz container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.969057 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz" podUID="9a1ef083-bcc3-48fc-9d9d-41008d4673b0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.972618 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw" podStartSLOduration=186.972600198 podStartE2EDuration="3m6.972600198s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:36.778415852 +0000 UTC m=+250.494241396" watchObservedRunningTime="2026-03-18 12:13:36.972600198 +0000 UTC m=+250.688425722" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.973406 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-pmcbk" podStartSLOduration=186.973398191 podStartE2EDuration="3m6.973398191s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:36.969460298 +0000 UTC m=+250.685285812" watchObservedRunningTime="2026-03-18 12:13:36.973398191 +0000 UTC m=+250.689223715" Mar 18 12:13:36 crc kubenswrapper[4843]: I0318 12:13:36.975607 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-thtkg" Mar 18 12:13:37 crc kubenswrapper[4843]: I0318 12:13:37.027814 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:37 crc kubenswrapper[4843]: E0318 12:13:37.029047 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 12:13:37.529026134 +0000 UTC m=+251.244851808 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:37 crc kubenswrapper[4843]: I0318 12:13:37.134352 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:37 crc kubenswrapper[4843]: E0318 12:13:37.134688 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:37.634676689 +0000 UTC m=+251.350502203 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:37 crc kubenswrapper[4843]: I0318 12:13:37.377279 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:37 crc kubenswrapper[4843]: E0318 12:13:37.377593 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:37.87757626 +0000 UTC m=+251.593401784 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:37 crc kubenswrapper[4843]: I0318 12:13:37.484167 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:37 crc kubenswrapper[4843]: E0318 12:13:37.484773 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:37.984758659 +0000 UTC m=+251.700584183 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:37 crc kubenswrapper[4843]: I0318 12:13:37.530125 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-94pzm" podStartSLOduration=187.530103116 podStartE2EDuration="3m7.530103116s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:37.205881902 +0000 UTC m=+250.921707426" watchObservedRunningTime="2026-03-18 12:13:37.530103116 +0000 UTC m=+251.245928640" Mar 18 12:13:37 crc kubenswrapper[4843]: I0318 12:13:37.594825 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:37 crc kubenswrapper[4843]: E0318 12:13:37.595202 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:38.095186252 +0000 UTC m=+251.811011776 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:37 crc kubenswrapper[4843]: I0318 12:13:37.693822 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz" podStartSLOduration=187.693807934 podStartE2EDuration="3m7.693807934s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:37.691337503 +0000 UTC m=+251.407163027" watchObservedRunningTime="2026-03-18 12:13:37.693807934 +0000 UTC m=+251.409633458" Mar 18 12:13:37 crc kubenswrapper[4843]: I0318 12:13:37.694322 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-hj6cb" podStartSLOduration=187.694314029 podStartE2EDuration="3m7.694314029s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:37.567962578 +0000 UTC m=+251.283788122" watchObservedRunningTime="2026-03-18 12:13:37.694314029 +0000 UTC m=+251.410139553" Mar 18 12:13:37 crc kubenswrapper[4843]: I0318 12:13:37.699663 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: 
\"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:37 crc kubenswrapper[4843]: E0318 12:13:37.699971 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:38.199954632 +0000 UTC m=+251.915780156 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:37 crc kubenswrapper[4843]: I0318 12:13:37.796475 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:13:37 crc kubenswrapper[4843]: [-]has-synced failed: reason withheld Mar 18 12:13:37 crc kubenswrapper[4843]: [+]process-running ok Mar 18 12:13:37 crc kubenswrapper[4843]: healthz check failed Mar 18 12:13:37 crc kubenswrapper[4843]: I0318 12:13:37.796820 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:13:37 crc kubenswrapper[4843]: I0318 12:13:37.812186 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:37 crc kubenswrapper[4843]: E0318 12:13:37.812732 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:38.312715652 +0000 UTC m=+252.028541176 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:37 crc kubenswrapper[4843]: I0318 12:13:37.907457 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87" event={"ID":"68d7cbd5-5cc5-4648-b94f-f256e12ae7d3","Type":"ContainerStarted","Data":"4e5858572d2dd272062ddec0fb17b3e1dbc094f77546d43f9c07796c833a23eb"} Mar 18 12:13:37 crc kubenswrapper[4843]: I0318 12:13:37.907736 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87" Mar 18 12:13:37 crc kubenswrapper[4843]: I0318 12:13:37.909696 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sxdqp" event={"ID":"e3fa3986-70fd-4d58-a04f-ddeec535f493","Type":"ContainerStarted","Data":"495a95dea275c2c874eff1badcf1c7087137a08e47c801b0a55ce8f1d4c4afc5"} Mar 18 12:13:37 crc kubenswrapper[4843]: I0318 
12:13:37.913223 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:37 crc kubenswrapper[4843]: E0318 12:13:37.913556 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:38.413541448 +0000 UTC m=+252.129366972 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:37 crc kubenswrapper[4843]: I0318 12:13:37.915570 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h" event={"ID":"8dec33ef-2627-4878-b7b6-f980772125b8","Type":"ContainerStarted","Data":"ec29aed98a1f5d9b31a2a684116176c7b5dcabf40bc9806dd5494a975b2b0c05"} Mar 18 12:13:37 crc kubenswrapper[4843]: I0318 12:13:37.924528 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spcss" event={"ID":"63edbbe4-08b8-4ea9-987c-83ebca62fc5e","Type":"ContainerStarted","Data":"b8f21b32563e2e13307f1fa962a7d0c31d656d2c4c6ff8c08c364704d2f2a4fe"} Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.017048 4843 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:38 crc kubenswrapper[4843]: E0318 12:13:38.017669 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:38.517634268 +0000 UTC m=+252.233459792 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.023957 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dzg6s" event={"ID":"fe3bf0b1-4566-4a39-8629-d4d268bd5977","Type":"ContainerStarted","Data":"8252e4bfc4166ba69787145b3821778db93b348a2244c62410d2a2762f257a1e"} Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.024115 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dzg6s" Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.024184 4843 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-kg7xw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" start-of-body= Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.024361 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw" podUID="92c985ef-3317-4220-819d-f482f8b50d60" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.025485 4843 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-hllcc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" start-of-body= Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.025540 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" podUID="d23cfc00-6762-41fb-bf10-e8aa0eda250b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.025836 4843 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-brsfz container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.025920 4843 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4fqc5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 18 12:13:38 crc 
kubenswrapper[4843]: I0318 12:13:38.025975 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5" podUID="016cbd62-23a6-413f-82b5-b806746e2b01" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.025944 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz" podUID="9a1ef083-bcc3-48fc-9d9d-41008d4673b0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.120539 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:38 crc kubenswrapper[4843]: E0318 12:13:38.131177 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:38.63115903 +0000 UTC m=+252.346984724 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.174551 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8hpvl" podStartSLOduration=15.17453486 podStartE2EDuration="15.17453486s" podCreationTimestamp="2026-03-18 12:13:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:38.11764136 +0000 UTC m=+251.833466874" watchObservedRunningTime="2026-03-18 12:13:38.17453486 +0000 UTC m=+251.890360384" Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.222528 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:38 crc kubenswrapper[4843]: E0318 12:13:38.224869 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:38.72484705 +0000 UTC m=+252.440672574 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.329617 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:38 crc kubenswrapper[4843]: E0318 12:13:38.330110 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:38.830093133 +0000 UTC m=+252.545918657 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.403097 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-spcss" podStartSLOduration=188.403073767 podStartE2EDuration="3m8.403073767s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:38.400663087 +0000 UTC m=+252.116488611" watchObservedRunningTime="2026-03-18 12:13:38.403073767 +0000 UTC m=+252.118899291" Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.430587 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87" podStartSLOduration=189.430568589 podStartE2EDuration="3m9.430568589s" podCreationTimestamp="2026-03-18 12:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:38.428277613 +0000 UTC m=+252.144103137" watchObservedRunningTime="2026-03-18 12:13:38.430568589 +0000 UTC m=+252.146394113" Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.431392 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:38 crc kubenswrapper[4843]: E0318 12:13:38.431474 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:38.931457575 +0000 UTC m=+252.647283099 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.434571 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:38 crc kubenswrapper[4843]: E0318 12:13:38.434962 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:38.934950356 +0000 UTC m=+252.650775880 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.481414 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dzg6s" podStartSLOduration=188.481396764 podStartE2EDuration="3m8.481396764s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:38.479120949 +0000 UTC m=+252.194946483" watchObservedRunningTime="2026-03-18 12:13:38.481396764 +0000 UTC m=+252.197222288" Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.535370 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:38 crc kubenswrapper[4843]: E0318 12:13:38.535804 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:39.035764761 +0000 UTC m=+252.751590285 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.576314 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g5z5h" podStartSLOduration=188.576294729 podStartE2EDuration="3m8.576294729s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:38.521882421 +0000 UTC m=+252.237707945" watchObservedRunningTime="2026-03-18 12:13:38.576294729 +0000 UTC m=+252.292120253" Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.637348 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:38 crc kubenswrapper[4843]: E0318 12:13:38.637762 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:39.13774752 +0000 UTC m=+252.853573034 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.726970 4843 ???:1] "http: TLS handshake error from 192.168.126.11:40516: no serving certificate available for the kubelet" Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.738094 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:38 crc kubenswrapper[4843]: E0318 12:13:38.738433 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:39.238419141 +0000 UTC m=+252.954244665 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.792376 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:13:38 crc kubenswrapper[4843]: [-]has-synced failed: reason withheld Mar 18 12:13:38 crc kubenswrapper[4843]: [+]process-running ok Mar 18 12:13:38 crc kubenswrapper[4843]: healthz check failed Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.792436 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.839764 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:38 crc kubenswrapper[4843]: E0318 12:13:38.840226 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 12:13:39.340202975 +0000 UTC m=+253.056028499 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.941281 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:38 crc kubenswrapper[4843]: E0318 12:13:38.941498 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:39.441465383 +0000 UTC m=+253.157290907 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:38 crc kubenswrapper[4843]: I0318 12:13:38.941812 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:38 crc kubenswrapper[4843]: E0318 12:13:38.942219 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:39.442207675 +0000 UTC m=+253.158033209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:39 crc kubenswrapper[4843]: I0318 12:13:39.042958 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:39 crc kubenswrapper[4843]: E0318 12:13:39.043326 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:39.543310299 +0000 UTC m=+253.259135823 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:39 crc kubenswrapper[4843]: I0318 12:13:39.162819 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:39 crc kubenswrapper[4843]: E0318 12:13:39.163189 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:39.663173913 +0000 UTC m=+253.378999447 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:39 crc kubenswrapper[4843]: I0318 12:13:39.164111 4843 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-brsfz container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 18 12:13:39 crc kubenswrapper[4843]: I0318 12:13:39.164186 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz" podUID="9a1ef083-bcc3-48fc-9d9d-41008d4673b0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 18 12:13:39 crc kubenswrapper[4843]: I0318 12:13:39.164353 4843 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-kg7xw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" start-of-body= Mar 18 12:13:39 crc kubenswrapper[4843]: I0318 12:13:39.164402 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw" podUID="92c985ef-3317-4220-819d-f482f8b50d60" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" Mar 18 12:13:39 crc 
kubenswrapper[4843]: I0318 12:13:39.165441 4843 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-k6c87 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 18 12:13:39 crc kubenswrapper[4843]: I0318 12:13:39.165500 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87" podUID="68d7cbd5-5cc5-4648-b94f-f256e12ae7d3" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 18 12:13:39 crc kubenswrapper[4843]: I0318 12:13:39.264402 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:39 crc kubenswrapper[4843]: E0318 12:13:39.265203 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:39.765182313 +0000 UTC m=+253.481007837 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:39 crc kubenswrapper[4843]: I0318 12:13:39.366945 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:39 crc kubenswrapper[4843]: E0318 12:13:39.367424 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:39.86740418 +0000 UTC m=+253.583229704 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:39 crc kubenswrapper[4843]: I0318 12:13:39.468402 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:39 crc kubenswrapper[4843]: E0318 12:13:39.468842 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:39.968817722 +0000 UTC m=+253.684643246 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:39 crc kubenswrapper[4843]: I0318 12:13:39.570686 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:39 crc kubenswrapper[4843]: E0318 12:13:39.571171 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:40.071149832 +0000 UTC m=+253.786975516 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:39 crc kubenswrapper[4843]: I0318 12:13:39.671776 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:39 crc kubenswrapper[4843]: E0318 12:13:39.671983 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:40.171950006 +0000 UTC m=+253.887775530 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:39 crc kubenswrapper[4843]: I0318 12:13:39.672227 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:39 crc kubenswrapper[4843]: E0318 12:13:39.672641 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:40.172620476 +0000 UTC m=+253.888446170 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:39 crc kubenswrapper[4843]: I0318 12:13:39.773807 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:39 crc kubenswrapper[4843]: E0318 12:13:39.773958 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:40.273930006 +0000 UTC m=+253.989755530 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:39 crc kubenswrapper[4843]: I0318 12:13:39.774130 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:39 crc kubenswrapper[4843]: E0318 12:13:39.774457 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:40.274449441 +0000 UTC m=+253.990274965 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:39 crc kubenswrapper[4843]: I0318 12:13:39.796892 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:13:39 crc kubenswrapper[4843]: [-]has-synced failed: reason withheld Mar 18 12:13:39 crc kubenswrapper[4843]: [+]process-running ok Mar 18 12:13:39 crc kubenswrapper[4843]: healthz check failed Mar 18 12:13:39 crc kubenswrapper[4843]: I0318 12:13:39.796973 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:13:39 crc kubenswrapper[4843]: I0318 12:13:39.875045 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:39 crc kubenswrapper[4843]: E0318 12:13:39.875495 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 12:13:40.375479833 +0000 UTC m=+254.091305357 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:39 crc kubenswrapper[4843]: I0318 12:13:39.976366 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:39 crc kubenswrapper[4843]: E0318 12:13:39.976828 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:40.476811563 +0000 UTC m=+254.192637087 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:40 crc kubenswrapper[4843]: I0318 12:13:40.082108 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:40 crc kubenswrapper[4843]: E0318 12:13:40.082405 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:40.582388076 +0000 UTC m=+254.298213600 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:40 crc kubenswrapper[4843]: I0318 12:13:40.185896 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs\") pod \"network-metrics-daemon-sn986\" (UID: \"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\") " pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:13:40 crc kubenswrapper[4843]: I0318 12:13:40.186222 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:40 crc kubenswrapper[4843]: E0318 12:13:40.186530 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:40.686517487 +0000 UTC m=+254.402343011 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:40 crc kubenswrapper[4843]: I0318 12:13:40.238195 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4xm45" event={"ID":"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6","Type":"ContainerStarted","Data":"44639da12eb3d8d3ca568c0a69da687e4a37001a9707a7b79388c991d9f926c2"} Mar 18 12:13:40 crc kubenswrapper[4843]: I0318 12:13:40.238239 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4xm45" event={"ID":"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6","Type":"ContainerStarted","Data":"de5be19abb8d77d5f68bf166f3e2a3775a3edd83f69fe302c481640487a17ac4"} Mar 18 12:13:40 crc kubenswrapper[4843]: I0318 12:13:40.287082 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:40 crc kubenswrapper[4843]: E0318 12:13:40.287432 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:40.787376994 +0000 UTC m=+254.503202518 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:40 crc kubenswrapper[4843]: I0318 12:13:40.312036 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 12:13:40 crc kubenswrapper[4843]: I0318 12:13:40.324540 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c-metrics-certs\") pod \"network-metrics-daemon-sn986\" (UID: \"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c\") " pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:13:40 crc kubenswrapper[4843]: I0318 12:13:40.401141 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:40 crc kubenswrapper[4843]: E0318 12:13:40.401543 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:40.901526674 +0000 UTC m=+254.617352198 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:40 crc kubenswrapper[4843]: I0318 12:13:40.758735 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:40 crc kubenswrapper[4843]: E0318 12:13:40.759292 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:41.259271905 +0000 UTC m=+254.975097429 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:40 crc kubenswrapper[4843]: I0318 12:13:40.793231 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 12:13:40 crc kubenswrapper[4843]: I0318 12:13:40.793417 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sn986" Mar 18 12:13:40 crc kubenswrapper[4843]: I0318 12:13:40.816878 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:13:40 crc kubenswrapper[4843]: [-]has-synced failed: reason withheld Mar 18 12:13:40 crc kubenswrapper[4843]: [+]process-running ok Mar 18 12:13:40 crc kubenswrapper[4843]: healthz check failed Mar 18 12:13:40 crc kubenswrapper[4843]: I0318 12:13:40.816943 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:13:40 crc kubenswrapper[4843]: I0318 12:13:40.988481 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:40 crc kubenswrapper[4843]: I0318 12:13:40.989081 4843 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-k6c87 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 18 12:13:40 crc kubenswrapper[4843]: I0318 12:13:40.989128 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87" podUID="68d7cbd5-5cc5-4648-b94f-f256e12ae7d3" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 18 12:13:40 crc kubenswrapper[4843]: I0318 12:13:40.989639 4843 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-k6c87 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 18 12:13:40 crc kubenswrapper[4843]: I0318 12:13:40.989701 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87" podUID="68d7cbd5-5cc5-4648-b94f-f256e12ae7d3" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 18 12:13:40 crc kubenswrapper[4843]: E0318 12:13:40.989887 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:41.489877011 +0000 UTC m=+255.205702535 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.027369 4843 patch_prober.go:28] interesting pod/apiserver-76f77b778f-hqt5c container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 18 12:13:41 crc kubenswrapper[4843]: [+]log ok Mar 18 12:13:41 crc kubenswrapper[4843]: [+]etcd ok Mar 18 12:13:41 crc kubenswrapper[4843]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 18 12:13:41 crc kubenswrapper[4843]: [+]poststarthook/generic-apiserver-start-informers ok Mar 18 12:13:41 crc kubenswrapper[4843]: [+]poststarthook/max-in-flight-filter ok Mar 18 12:13:41 crc kubenswrapper[4843]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 18 12:13:41 crc kubenswrapper[4843]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 18 12:13:41 crc kubenswrapper[4843]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 18 12:13:41 crc kubenswrapper[4843]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 18 12:13:41 crc kubenswrapper[4843]: [+]poststarthook/project.openshift.io-projectcache ok Mar 18 12:13:41 crc kubenswrapper[4843]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 18 12:13:41 crc kubenswrapper[4843]: [+]poststarthook/openshift.io-startinformers ok Mar 18 12:13:41 crc kubenswrapper[4843]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 18 12:13:41 crc 
kubenswrapper[4843]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 18 12:13:41 crc kubenswrapper[4843]: livez check failed Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.027457 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" podUID="e8bd6721-4cfb-4143-94f5-ad1f9fb985fb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.159622 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:41 crc kubenswrapper[4843]: E0318 12:13:41.159779 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:41.659752137 +0000 UTC m=+255.375577661 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.160206 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:41 crc kubenswrapper[4843]: E0318 12:13:41.160508 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:41.660494179 +0000 UTC m=+255.376319703 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.309592 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:41 crc kubenswrapper[4843]: E0318 12:13:41.309997 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:41.809978167 +0000 UTC m=+255.525803691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.428213 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:41 crc kubenswrapper[4843]: E0318 12:13:41.428719 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:41.928676408 +0000 UTC m=+255.644501932 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.441845 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4xm45" event={"ID":"f73981c7-0a7b-4a4d-87f8-9a73e4d794d6","Type":"ContainerStarted","Data":"47495d432e4a67160f85a3283e77b24d63d2a521e5831fcd836c397e33639c78"} Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.529632 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:41 crc kubenswrapper[4843]: E0318 12:13:41.529834 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:42.029802753 +0000 UTC m=+255.745628277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.530115 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:41 crc kubenswrapper[4843]: E0318 12:13:41.530462 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:42.030445261 +0000 UTC m=+255.746270785 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.614575 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sxdqp" podStartSLOduration=191.614559166 podStartE2EDuration="3m11.614559166s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:38.576821025 +0000 UTC m=+252.292646549" watchObservedRunningTime="2026-03-18 12:13:41.614559166 +0000 UTC m=+255.330384690" Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.616220 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-4xm45" podStartSLOduration=18.616209433 podStartE2EDuration="18.616209433s" podCreationTimestamp="2026-03-18 12:13:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:41.614234446 +0000 UTC m=+255.330059970" watchObservedRunningTime="2026-03-18 12:13:41.616209433 +0000 UTC m=+255.332034957" Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.644065 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:41 crc kubenswrapper[4843]: E0318 12:13:41.644255 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:42.144228841 +0000 UTC m=+255.860054365 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.745339 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:41 crc kubenswrapper[4843]: E0318 12:13:41.745830 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:42.245805398 +0000 UTC m=+255.961630922 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.829350 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:13:41 crc kubenswrapper[4843]: [-]has-synced failed: reason withheld Mar 18 12:13:41 crc kubenswrapper[4843]: [+]process-running ok Mar 18 12:13:41 crc kubenswrapper[4843]: healthz check failed Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.829426 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.854568 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:41 crc kubenswrapper[4843]: E0318 12:13:41.855070 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 12:13:42.355049877 +0000 UTC m=+256.070875401 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.904579 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.905739 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.915384 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.915680 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.957817 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.957905 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.958038 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:41 crc kubenswrapper[4843]: E0318 12:13:41.958450 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:42.458432616 +0000 UTC m=+256.174258140 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:41 crc kubenswrapper[4843]: I0318 12:13:41.958820 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.070509 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.070881 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.070932 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.071012 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:13:42 crc kubenswrapper[4843]: E0318 12:13:42.071774 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:42.571747942 +0000 UTC m=+256.287573466 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.172317 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:42 crc kubenswrapper[4843]: E0318 12:13:42.172916 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:42.672892228 +0000 UTC m=+256.388717752 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.196005 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.243752 4843 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.273201 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:42 crc kubenswrapper[4843]: E0318 12:13:42.273722 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:42.773703673 +0000 UTC m=+256.489529197 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.353126 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.376602 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:42 crc kubenswrapper[4843]: E0318 12:13:42.377395 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:42.87737905 +0000 UTC m=+256.593204574 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.459750 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rfg7x"] Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.466273 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rfg7x" Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.478533 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:42 crc kubenswrapper[4843]: E0318 12:13:42.479427 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:42.979404041 +0000 UTC m=+256.695229555 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.538633 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.634578 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/431329ed-c93d-4d44-bb49-ebed3a083f1a-utilities\") pod \"certified-operators-rfg7x\" (UID: \"431329ed-c93d-4d44-bb49-ebed3a083f1a\") " pod="openshift-marketplace/certified-operators-rfg7x" Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.634608 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/431329ed-c93d-4d44-bb49-ebed3a083f1a-catalog-content\") pod \"certified-operators-rfg7x\" (UID: \"431329ed-c93d-4d44-bb49-ebed3a083f1a\") " pod="openshift-marketplace/certified-operators-rfg7x" Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.634675 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.634816 4843 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh8qh\" (UniqueName: \"kubernetes.io/projected/431329ed-c93d-4d44-bb49-ebed3a083f1a-kube-api-access-hh8qh\") pod \"certified-operators-rfg7x\" (UID: \"431329ed-c93d-4d44-bb49-ebed3a083f1a\") " pod="openshift-marketplace/certified-operators-rfg7x" Mar 18 12:13:42 crc kubenswrapper[4843]: E0318 12:13:42.634934 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:43.134923093 +0000 UTC m=+256.850748607 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.667748 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lcsxl"] Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.668836 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lcsxl" Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.743225 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.744387 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.744667 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/431329ed-c93d-4d44-bb49-ebed3a083f1a-utilities\") pod \"certified-operators-rfg7x\" (UID: \"431329ed-c93d-4d44-bb49-ebed3a083f1a\") " pod="openshift-marketplace/certified-operators-rfg7x" Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.744727 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/431329ed-c93d-4d44-bb49-ebed3a083f1a-catalog-content\") pod \"certified-operators-rfg7x\" (UID: \"431329ed-c93d-4d44-bb49-ebed3a083f1a\") " pod="openshift-marketplace/certified-operators-rfg7x" Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.744756 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cef8c60-1bee-4798-9147-4d7f360ae4b2-utilities\") pod \"community-operators-lcsxl\" (UID: \"3cef8c60-1bee-4798-9147-4d7f360ae4b2\") " pod="openshift-marketplace/community-operators-lcsxl" Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.744973 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-9dp4b\" (UniqueName: \"kubernetes.io/projected/3cef8c60-1bee-4798-9147-4d7f360ae4b2-kube-api-access-9dp4b\") pod \"community-operators-lcsxl\" (UID: \"3cef8c60-1bee-4798-9147-4d7f360ae4b2\") " pod="openshift-marketplace/community-operators-lcsxl" Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.745012 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cef8c60-1bee-4798-9147-4d7f360ae4b2-catalog-content\") pod \"community-operators-lcsxl\" (UID: \"3cef8c60-1bee-4798-9147-4d7f360ae4b2\") " pod="openshift-marketplace/community-operators-lcsxl" Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.745153 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh8qh\" (UniqueName: \"kubernetes.io/projected/431329ed-c93d-4d44-bb49-ebed3a083f1a-kube-api-access-hh8qh\") pod \"certified-operators-rfg7x\" (UID: \"431329ed-c93d-4d44-bb49-ebed3a083f1a\") " pod="openshift-marketplace/certified-operators-rfg7x" Mar 18 12:13:42 crc kubenswrapper[4843]: E0318 12:13:42.745629 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:43.245610934 +0000 UTC m=+256.961436458 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.746224 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/431329ed-c93d-4d44-bb49-ebed3a083f1a-utilities\") pod \"certified-operators-rfg7x\" (UID: \"431329ed-c93d-4d44-bb49-ebed3a083f1a\") " pod="openshift-marketplace/certified-operators-rfg7x" Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.746555 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/431329ed-c93d-4d44-bb49-ebed3a083f1a-catalog-content\") pod \"certified-operators-rfg7x\" (UID: \"431329ed-c93d-4d44-bb49-ebed3a083f1a\") " pod="openshift-marketplace/certified-operators-rfg7x" Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.985461 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:13:42 crc kubenswrapper[4843]: [-]has-synced failed: reason withheld Mar 18 12:13:42 crc kubenswrapper[4843]: [+]process-running ok Mar 18 12:13:42 crc kubenswrapper[4843]: healthz check failed Mar 18 12:13:42 crc kubenswrapper[4843]: I0318 12:13:42.985540 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.105719 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cef8c60-1bee-4798-9147-4d7f360ae4b2-utilities\") pod \"community-operators-lcsxl\" (UID: \"3cef8c60-1bee-4798-9147-4d7f360ae4b2\") " pod="openshift-marketplace/community-operators-lcsxl" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.105760 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dp4b\" (UniqueName: \"kubernetes.io/projected/3cef8c60-1bee-4798-9147-4d7f360ae4b2-kube-api-access-9dp4b\") pod \"community-operators-lcsxl\" (UID: \"3cef8c60-1bee-4798-9147-4d7f360ae4b2\") " pod="openshift-marketplace/community-operators-lcsxl" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.105789 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cef8c60-1bee-4798-9147-4d7f360ae4b2-catalog-content\") pod \"community-operators-lcsxl\" (UID: \"3cef8c60-1bee-4798-9147-4d7f360ae4b2\") " pod="openshift-marketplace/community-operators-lcsxl" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.105822 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.106920 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cef8c60-1bee-4798-9147-4d7f360ae4b2-utilities\") pod \"community-operators-lcsxl\" (UID: 
\"3cef8c60-1bee-4798-9147-4d7f360ae4b2\") " pod="openshift-marketplace/community-operators-lcsxl" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.107262 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cef8c60-1bee-4798-9147-4d7f360ae4b2-catalog-content\") pod \"community-operators-lcsxl\" (UID: \"3cef8c60-1bee-4798-9147-4d7f360ae4b2\") " pod="openshift-marketplace/community-operators-lcsxl" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.107630 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lcsxl"] Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.107685 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rfg7x"] Mar 18 12:13:43 crc kubenswrapper[4843]: E0318 12:13:43.107972 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:43.607953647 +0000 UTC m=+257.323779171 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.127455 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh8qh\" (UniqueName: \"kubernetes.io/projected/431329ed-c93d-4d44-bb49-ebed3a083f1a-kube-api-access-hh8qh\") pod \"certified-operators-rfg7x\" (UID: \"431329ed-c93d-4d44-bb49-ebed3a083f1a\") " pod="openshift-marketplace/certified-operators-rfg7x" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.154429 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xs5wj"] Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.155807 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xs5wj" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.165038 4843 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-18T12:13:42.243781401Z","Handler":null,"Name":""} Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.165428 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rfg7x" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.204037 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t62l2"] Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.205524 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t62l2" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.208957 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.209202 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffad2911-2fae-4030-a3ad-d36a5f95fc07-utilities\") pod \"certified-operators-t62l2\" (UID: \"ffad2911-2fae-4030-a3ad-d36a5f95fc07\") " pod="openshift-marketplace/certified-operators-t62l2" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.209268 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa-catalog-content\") pod \"community-operators-xs5wj\" (UID: \"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa\") " pod="openshift-marketplace/community-operators-xs5wj" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.209326 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42kbw\" (UniqueName: \"kubernetes.io/projected/ffad2911-2fae-4030-a3ad-d36a5f95fc07-kube-api-access-42kbw\") pod \"certified-operators-t62l2\" (UID: \"ffad2911-2fae-4030-a3ad-d36a5f95fc07\") " pod="openshift-marketplace/certified-operators-t62l2" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.209342 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa-utilities\") pod 
\"community-operators-xs5wj\" (UID: \"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa\") " pod="openshift-marketplace/community-operators-xs5wj" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.209362 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmf2j\" (UniqueName: \"kubernetes.io/projected/93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa-kube-api-access-zmf2j\") pod \"community-operators-xs5wj\" (UID: \"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa\") " pod="openshift-marketplace/community-operators-xs5wj" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.209428 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffad2911-2fae-4030-a3ad-d36a5f95fc07-catalog-content\") pod \"certified-operators-t62l2\" (UID: \"ffad2911-2fae-4030-a3ad-d36a5f95fc07\") " pod="openshift-marketplace/certified-operators-t62l2" Mar 18 12:13:43 crc kubenswrapper[4843]: E0318 12:13:43.209608 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:43.709589336 +0000 UTC m=+257.425414860 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.322018 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42kbw\" (UniqueName: \"kubernetes.io/projected/ffad2911-2fae-4030-a3ad-d36a5f95fc07-kube-api-access-42kbw\") pod \"certified-operators-t62l2\" (UID: \"ffad2911-2fae-4030-a3ad-d36a5f95fc07\") " pod="openshift-marketplace/certified-operators-t62l2" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.322048 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa-utilities\") pod \"community-operators-xs5wj\" (UID: \"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa\") " pod="openshift-marketplace/community-operators-xs5wj" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.322063 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmf2j\" (UniqueName: \"kubernetes.io/projected/93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa-kube-api-access-zmf2j\") pod \"community-operators-xs5wj\" (UID: \"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa\") " pod="openshift-marketplace/community-operators-xs5wj" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.322155 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: 
\"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.322193 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffad2911-2fae-4030-a3ad-d36a5f95fc07-catalog-content\") pod \"certified-operators-t62l2\" (UID: \"ffad2911-2fae-4030-a3ad-d36a5f95fc07\") " pod="openshift-marketplace/certified-operators-t62l2" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.322294 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffad2911-2fae-4030-a3ad-d36a5f95fc07-utilities\") pod \"certified-operators-t62l2\" (UID: \"ffad2911-2fae-4030-a3ad-d36a5f95fc07\") " pod="openshift-marketplace/certified-operators-t62l2" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.322328 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa-catalog-content\") pod \"community-operators-xs5wj\" (UID: \"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa\") " pod="openshift-marketplace/community-operators-xs5wj" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.323170 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa-catalog-content\") pod \"community-operators-xs5wj\" (UID: \"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa\") " pod="openshift-marketplace/community-operators-xs5wj" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.323743 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa-utilities\") pod \"community-operators-xs5wj\" (UID: \"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa\") " 
pod="openshift-marketplace/community-operators-xs5wj" Mar 18 12:13:43 crc kubenswrapper[4843]: E0318 12:13:43.324319 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:43.824302662 +0000 UTC m=+257.540128186 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9g67x" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.324971 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffad2911-2fae-4030-a3ad-d36a5f95fc07-catalog-content\") pod \"certified-operators-t62l2\" (UID: \"ffad2911-2fae-4030-a3ad-d36a5f95fc07\") " pod="openshift-marketplace/certified-operators-t62l2" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.325324 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffad2911-2fae-4030-a3ad-d36a5f95fc07-utilities\") pod \"certified-operators-t62l2\" (UID: \"ffad2911-2fae-4030-a3ad-d36a5f95fc07\") " pod="openshift-marketplace/certified-operators-t62l2" Mar 18 12:13:43 crc kubenswrapper[4843]: E0318 12:13:43.375834 4843 projected.go:194] Error preparing data for projected volume kube-api-access-42kbw for pod openshift-marketplace/certified-operators-t62l2: failed to fetch token: serviceaccounts "certified-operators" is forbidden: node requested token bound to a pod scheduled on a different node Mar 18 
12:13:43 crc kubenswrapper[4843]: E0318 12:13:43.375923 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ffad2911-2fae-4030-a3ad-d36a5f95fc07-kube-api-access-42kbw podName:ffad2911-2fae-4030-a3ad-d36a5f95fc07 nodeName:}" failed. No retries permitted until 2026-03-18 12:13:43.87590448 +0000 UTC m=+257.591730004 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-42kbw" (UniqueName: "kubernetes.io/projected/ffad2911-2fae-4030-a3ad-d36a5f95fc07-kube-api-access-42kbw") pod "certified-operators-t62l2" (UID: "ffad2911-2fae-4030-a3ad-d36a5f95fc07") : failed to fetch token: serviceaccounts "certified-operators" is forbidden: node requested token bound to a pod scheduled on a different node Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.498064 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:43 crc kubenswrapper[4843]: E0318 12:13:43.498865 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:43.998828512 +0000 UTC m=+257.714654176 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.504805 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dp4b\" (UniqueName: \"kubernetes.io/projected/3cef8c60-1bee-4798-9147-4d7f360ae4b2-kube-api-access-9dp4b\") pod \"community-operators-lcsxl\" (UID: \"3cef8c60-1bee-4798-9147-4d7f360ae4b2\") " pod="openshift-marketplace/community-operators-lcsxl" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.579870 4843 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.579911 4843 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.603554 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.628619 4843 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.628681 4843 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.754192 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lcsxl" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.781721 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xs5wj"] Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.790510 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmf2j\" (UniqueName: \"kubernetes.io/projected/93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa-kube-api-access-zmf2j\") pod \"community-operators-xs5wj\" (UID: \"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa\") " pod="openshift-marketplace/community-operators-xs5wj" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.797463 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:13:43 crc kubenswrapper[4843]: [-]has-synced failed: reason withheld Mar 18 12:13:43 crc kubenswrapper[4843]: [+]process-running ok Mar 18 12:13:43 crc kubenswrapper[4843]: healthz check failed Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.797539 4843 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.797596 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xs5wj" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.830768 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9g67x\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") " pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.936160 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42kbw\" (UniqueName: \"kubernetes.io/projected/ffad2911-2fae-4030-a3ad-d36a5f95fc07-kube-api-access-42kbw\") pod \"certified-operators-t62l2\" (UID: \"ffad2911-2fae-4030-a3ad-d36a5f95fc07\") " pod="openshift-marketplace/certified-operators-t62l2" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.965170 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k6c87" Mar 18 12:13:43 crc kubenswrapper[4843]: I0318 12:13:43.966916 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t62l2"] Mar 18 12:13:44 crc kubenswrapper[4843]: I0318 12:13:44.040872 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:44 crc kubenswrapper[4843]: I0318 12:13:44.165470 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sn986"] Mar 18 12:13:44 crc kubenswrapper[4843]: I0318 12:13:44.529857 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 12:13:44 crc kubenswrapper[4843]: I0318 12:13:44.534361 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 12:13:44 crc kubenswrapper[4843]: I0318 12:13:44.535487 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:44 crc kubenswrapper[4843]: I0318 12:13:44.542823 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42kbw\" (UniqueName: \"kubernetes.io/projected/ffad2911-2fae-4030-a3ad-d36a5f95fc07-kube-api-access-42kbw\") pod \"certified-operators-t62l2\" (UID: \"ffad2911-2fae-4030-a3ad-d36a5f95fc07\") " pod="openshift-marketplace/certified-operators-t62l2" Mar 18 12:13:44 crc kubenswrapper[4843]: I0318 12:13:44.695184 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 12:13:44 crc kubenswrapper[4843]: I0318 12:13:44.696280 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 12:13:44 crc kubenswrapper[4843]: I0318 12:13:44.696377 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:13:44 crc kubenswrapper[4843]: I0318 12:13:44.808200 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t62l2" Mar 18 12:13:44 crc kubenswrapper[4843]: I0318 12:13:44.809872 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:13:44 crc kubenswrapper[4843]: [-]has-synced failed: reason withheld Mar 18 12:13:44 crc kubenswrapper[4843]: [+]process-running ok Mar 18 12:13:44 crc kubenswrapper[4843]: healthz check failed Mar 18 12:13:44 crc kubenswrapper[4843]: I0318 12:13:44.809931 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:13:44 crc kubenswrapper[4843]: I0318 12:13:44.833822 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 12:13:44 crc kubenswrapper[4843]: I0318 12:13:44.843444 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 12:13:44 crc kubenswrapper[4843]: I0318 12:13:44.906082 4843 ???:1] "http: TLS handshake error from 192.168.126.11:42610: no serving certificate available for the kubelet" Mar 18 12:13:44 crc kubenswrapper[4843]: I0318 12:13:44.911195 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9082af15-0a2d-4006-a24c-211de4dd2a73-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9082af15-0a2d-4006-a24c-211de4dd2a73\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:13:44 crc kubenswrapper[4843]: I0318 12:13:44.911260 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9082af15-0a2d-4006-a24c-211de4dd2a73-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9082af15-0a2d-4006-a24c-211de4dd2a73\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.140791 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9082af15-0a2d-4006-a24c-211de4dd2a73-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9082af15-0a2d-4006-a24c-211de4dd2a73\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.140871 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9082af15-0a2d-4006-a24c-211de4dd2a73-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9082af15-0a2d-4006-a24c-211de4dd2a73\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.141284 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9082af15-0a2d-4006-a24c-211de4dd2a73-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9082af15-0a2d-4006-a24c-211de4dd2a73\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.161073 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.175630 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-sn986" event={"ID":"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c","Type":"ContainerStarted","Data":"d426f3150154442f538e6dcf995dd26383bc6260e34893bf6fd1643fec0e0568"} Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.182405 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8hpvl" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.182460 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dgr8z"] Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.190978 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dgr8z" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.285163 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.289716 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9082af15-0a2d-4006-a24c-211de4dd2a73-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9082af15-0a2d-4006-a24c-211de4dd2a73\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.316744 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dgr8z"] Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.317142 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.321099 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-g7fkz"] Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.321369 4843 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" podUID="061ecf07-8167-4652-8182-5779e5502bbf" containerName="controller-manager" containerID="cri-o://9c74208936b3d5da5bbf785fd6df4ad4876d32e81b24a1586af5360b2f7e2bc4" gracePeriod=30 Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.341953 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj"] Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.342212 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" podUID="1da258ba-cd08-4cb9-90bb-18675d625fd1" containerName="route-controller-manager" containerID="cri-o://c03c17b0a5cc07693a1d6178fc57c492bd0c9d00136f81b2cd4bc07238b6e1e3" gracePeriod=30 Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.392046 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feb1dd70-302b-4217-b17c-211aea971073-catalog-content\") pod \"redhat-operators-dgr8z\" (UID: \"feb1dd70-302b-4217-b17c-211aea971073\") " pod="openshift-marketplace/redhat-operators-dgr8z" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.392206 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lwgq\" (UniqueName: \"kubernetes.io/projected/feb1dd70-302b-4217-b17c-211aea971073-kube-api-access-4lwgq\") pod \"redhat-operators-dgr8z\" (UID: \"feb1dd70-302b-4217-b17c-211aea971073\") " pod="openshift-marketplace/redhat-operators-dgr8z" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.392262 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb1dd70-302b-4217-b17c-211aea971073-utilities\") pod \"redhat-operators-dgr8z\" (UID: 
\"feb1dd70-302b-4217-b17c-211aea971073\") " pod="openshift-marketplace/redhat-operators-dgr8z" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.436478 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j2p2b"] Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.439638 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2p2b" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.460056 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.463676 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.479878 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2p2b"] Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.495429 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lwgq\" (UniqueName: \"kubernetes.io/projected/feb1dd70-302b-4217-b17c-211aea971073-kube-api-access-4lwgq\") pod \"redhat-operators-dgr8z\" (UID: \"feb1dd70-302b-4217-b17c-211aea971073\") " pod="openshift-marketplace/redhat-operators-dgr8z" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.495505 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb1dd70-302b-4217-b17c-211aea971073-utilities\") pod \"redhat-operators-dgr8z\" (UID: \"feb1dd70-302b-4217-b17c-211aea971073\") " pod="openshift-marketplace/redhat-operators-dgr8z" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.495578 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/feb1dd70-302b-4217-b17c-211aea971073-catalog-content\") pod \"redhat-operators-dgr8z\" (UID: \"feb1dd70-302b-4217-b17c-211aea971073\") " pod="openshift-marketplace/redhat-operators-dgr8z" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.496137 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feb1dd70-302b-4217-b17c-211aea971073-catalog-content\") pod \"redhat-operators-dgr8z\" (UID: \"feb1dd70-302b-4217-b17c-211aea971073\") " pod="openshift-marketplace/redhat-operators-dgr8z" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.496717 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb1dd70-302b-4217-b17c-211aea971073-utilities\") pod \"redhat-operators-dgr8z\" (UID: \"feb1dd70-302b-4217-b17c-211aea971073\") " pod="openshift-marketplace/redhat-operators-dgr8z" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.535576 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lwgq\" (UniqueName: \"kubernetes.io/projected/feb1dd70-302b-4217-b17c-211aea971073-kube-api-access-4lwgq\") pod \"redhat-operators-dgr8z\" (UID: \"feb1dd70-302b-4217-b17c-211aea971073\") " pod="openshift-marketplace/redhat-operators-dgr8z" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.605552 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dc0843b-8f5c-434e-986e-3aab182caad3-catalog-content\") pod \"redhat-marketplace-j2p2b\" (UID: \"0dc0843b-8f5c-434e-986e-3aab182caad3\") " pod="openshift-marketplace/redhat-marketplace-j2p2b" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.605647 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt446\" (UniqueName: 
\"kubernetes.io/projected/0dc0843b-8f5c-434e-986e-3aab182caad3-kube-api-access-bt446\") pod \"redhat-marketplace-j2p2b\" (UID: \"0dc0843b-8f5c-434e-986e-3aab182caad3\") " pod="openshift-marketplace/redhat-marketplace-j2p2b" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.605712 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc0843b-8f5c-434e-986e-3aab182caad3-utilities\") pod \"redhat-marketplace-j2p2b\" (UID: \"0dc0843b-8f5c-434e-986e-3aab182caad3\") " pod="openshift-marketplace/redhat-marketplace-j2p2b" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.633377 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dgr8z" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.643754 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mrwd9"] Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.645529 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mrwd9" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.655850 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mrwd9"] Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.709445 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dc0843b-8f5c-434e-986e-3aab182caad3-catalog-content\") pod \"redhat-marketplace-j2p2b\" (UID: \"0dc0843b-8f5c-434e-986e-3aab182caad3\") " pod="openshift-marketplace/redhat-marketplace-j2p2b" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.709508 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt446\" (UniqueName: \"kubernetes.io/projected/0dc0843b-8f5c-434e-986e-3aab182caad3-kube-api-access-bt446\") pod \"redhat-marketplace-j2p2b\" (UID: \"0dc0843b-8f5c-434e-986e-3aab182caad3\") " pod="openshift-marketplace/redhat-marketplace-j2p2b" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.709545 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc0843b-8f5c-434e-986e-3aab182caad3-utilities\") pod \"redhat-marketplace-j2p2b\" (UID: \"0dc0843b-8f5c-434e-986e-3aab182caad3\") " pod="openshift-marketplace/redhat-marketplace-j2p2b" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.710322 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc0843b-8f5c-434e-986e-3aab182caad3-utilities\") pod \"redhat-marketplace-j2p2b\" (UID: \"0dc0843b-8f5c-434e-986e-3aab182caad3\") " pod="openshift-marketplace/redhat-marketplace-j2p2b" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.710528 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0dc0843b-8f5c-434e-986e-3aab182caad3-catalog-content\") pod \"redhat-marketplace-j2p2b\" (UID: \"0dc0843b-8f5c-434e-986e-3aab182caad3\") " pod="openshift-marketplace/redhat-marketplace-j2p2b" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.741531 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rfg7x"] Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.775283 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt446\" (UniqueName: \"kubernetes.io/projected/0dc0843b-8f5c-434e-986e-3aab182caad3-kube-api-access-bt446\") pod \"redhat-marketplace-j2p2b\" (UID: \"0dc0843b-8f5c-434e-986e-3aab182caad3\") " pod="openshift-marketplace/redhat-marketplace-j2p2b" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.792362 4843 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-7n8zj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/healthz\": dial tcp 10.217.0.44:8443: connect: connection refused" start-of-body= Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.792412 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" podUID="1da258ba-cd08-4cb9-90bb-18675d625fd1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": dial tcp 10.217.0.44:8443: connect: connection refused" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.794229 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:13:45 crc kubenswrapper[4843]: [-]has-synced failed: reason withheld Mar 18 12:13:45 crc 
kubenswrapper[4843]: [+]process-running ok Mar 18 12:13:45 crc kubenswrapper[4843]: healthz check failed Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.794269 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.797675 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-52bm4"] Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.820087 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-52bm4" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.822142 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv5gs\" (UniqueName: \"kubernetes.io/projected/7aa438f5-fa8b-43f7-93b7-67e592ea698c-kube-api-access-vv5gs\") pod \"redhat-operators-mrwd9\" (UID: \"7aa438f5-fa8b-43f7-93b7-67e592ea698c\") " pod="openshift-marketplace/redhat-operators-mrwd9" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.822185 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa438f5-fa8b-43f7-93b7-67e592ea698c-catalog-content\") pod \"redhat-operators-mrwd9\" (UID: \"7aa438f5-fa8b-43f7-93b7-67e592ea698c\") " pod="openshift-marketplace/redhat-operators-mrwd9" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.822221 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa438f5-fa8b-43f7-93b7-67e592ea698c-utilities\") pod \"redhat-operators-mrwd9\" (UID: \"7aa438f5-fa8b-43f7-93b7-67e592ea698c\") " 
pod="openshift-marketplace/redhat-operators-mrwd9" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.828743 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lcsxl"] Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.851003 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-52bm4"] Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.868534 4843 patch_prober.go:28] interesting pod/console-f9d7485db-ppqpl container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.869300 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ppqpl" podUID="b999a5c0-f4e8-499b-8f81-283c3a2cf495" containerName="console" probeResult="failure" output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.936186 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95b0d7ed-60bd-467a-a1b4-86f2c32096ab-utilities\") pod \"redhat-marketplace-52bm4\" (UID: \"95b0d7ed-60bd-467a-a1b4-86f2c32096ab\") " pod="openshift-marketplace/redhat-marketplace-52bm4" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.936260 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv5gs\" (UniqueName: \"kubernetes.io/projected/7aa438f5-fa8b-43f7-93b7-67e592ea698c-kube-api-access-vv5gs\") pod \"redhat-operators-mrwd9\" (UID: \"7aa438f5-fa8b-43f7-93b7-67e592ea698c\") " pod="openshift-marketplace/redhat-operators-mrwd9" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.936292 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa438f5-fa8b-43f7-93b7-67e592ea698c-catalog-content\") pod \"redhat-operators-mrwd9\" (UID: \"7aa438f5-fa8b-43f7-93b7-67e592ea698c\") " pod="openshift-marketplace/redhat-operators-mrwd9" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.936321 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa438f5-fa8b-43f7-93b7-67e592ea698c-utilities\") pod \"redhat-operators-mrwd9\" (UID: \"7aa438f5-fa8b-43f7-93b7-67e592ea698c\") " pod="openshift-marketplace/redhat-operators-mrwd9" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.936370 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95b0d7ed-60bd-467a-a1b4-86f2c32096ab-catalog-content\") pod \"redhat-marketplace-52bm4\" (UID: \"95b0d7ed-60bd-467a-a1b4-86f2c32096ab\") " pod="openshift-marketplace/redhat-marketplace-52bm4" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.936393 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc4ck\" (UniqueName: \"kubernetes.io/projected/95b0d7ed-60bd-467a-a1b4-86f2c32096ab-kube-api-access-zc4ck\") pod \"redhat-marketplace-52bm4\" (UID: \"95b0d7ed-60bd-467a-a1b4-86f2c32096ab\") " pod="openshift-marketplace/redhat-marketplace-52bm4" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.937187 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa438f5-fa8b-43f7-93b7-67e592ea698c-catalog-content\") pod \"redhat-operators-mrwd9\" (UID: \"7aa438f5-fa8b-43f7-93b7-67e592ea698c\") " pod="openshift-marketplace/redhat-operators-mrwd9" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.937421 4843 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa438f5-fa8b-43f7-93b7-67e592ea698c-utilities\") pod \"redhat-operators-mrwd9\" (UID: \"7aa438f5-fa8b-43f7-93b7-67e592ea698c\") " pod="openshift-marketplace/redhat-operators-mrwd9" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.937467 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xs5wj"] Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.946327 4843 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-g7fkz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.946388 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" podUID="061ecf07-8167-4652-8182-5779e5502bbf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.958465 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:45 crc kubenswrapper[4843]: W0318 12:13:45.977218 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cef8c60_1bee_4798_9147_4d7f360ae4b2.slice/crio-d956f0fa72ebbed74079ef69b2db5051d1194160057bf61685b5a073ed89b3c9 WatchSource:0}: Error finding container d956f0fa72ebbed74079ef69b2db5051d1194160057bf61685b5a073ed89b3c9: Status 404 returned error can't find the container with id d956f0fa72ebbed74079ef69b2db5051d1194160057bf61685b5a073ed89b3c9 Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.978746 
4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-hqt5c" Mar 18 12:13:45 crc kubenswrapper[4843]: W0318 12:13:45.980538 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93ed2d4f_4e0a_4de3_9bc3_b7b5cc93defa.slice/crio-b441ec1c8d0e83f2fc294282a146aebf545026f7692de6b04c6efb65c3fe80c8 WatchSource:0}: Error finding container b441ec1c8d0e83f2fc294282a146aebf545026f7692de6b04c6efb65c3fe80c8: Status 404 returned error can't find the container with id b441ec1c8d0e83f2fc294282a146aebf545026f7692de6b04c6efb65c3fe80c8 Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.991867 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2p2b" Mar 18 12:13:45 crc kubenswrapper[4843]: I0318 12:13:45.992897 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv5gs\" (UniqueName: \"kubernetes.io/projected/7aa438f5-fa8b-43f7-93b7-67e592ea698c-kube-api-access-vv5gs\") pod \"redhat-operators-mrwd9\" (UID: \"7aa438f5-fa8b-43f7-93b7-67e592ea698c\") " pod="openshift-marketplace/redhat-operators-mrwd9" Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.022325 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mrwd9" Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.044488 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95b0d7ed-60bd-467a-a1b4-86f2c32096ab-catalog-content\") pod \"redhat-marketplace-52bm4\" (UID: \"95b0d7ed-60bd-467a-a1b4-86f2c32096ab\") " pod="openshift-marketplace/redhat-marketplace-52bm4" Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.044549 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc4ck\" (UniqueName: \"kubernetes.io/projected/95b0d7ed-60bd-467a-a1b4-86f2c32096ab-kube-api-access-zc4ck\") pod \"redhat-marketplace-52bm4\" (UID: \"95b0d7ed-60bd-467a-a1b4-86f2c32096ab\") " pod="openshift-marketplace/redhat-marketplace-52bm4" Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.044620 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95b0d7ed-60bd-467a-a1b4-86f2c32096ab-utilities\") pod \"redhat-marketplace-52bm4\" (UID: \"95b0d7ed-60bd-467a-a1b4-86f2c32096ab\") " pod="openshift-marketplace/redhat-marketplace-52bm4" Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.044963 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95b0d7ed-60bd-467a-a1b4-86f2c32096ab-catalog-content\") pod \"redhat-marketplace-52bm4\" (UID: \"95b0d7ed-60bd-467a-a1b4-86f2c32096ab\") " pod="openshift-marketplace/redhat-marketplace-52bm4" Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.049823 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95b0d7ed-60bd-467a-a1b4-86f2c32096ab-utilities\") pod \"redhat-marketplace-52bm4\" (UID: \"95b0d7ed-60bd-467a-a1b4-86f2c32096ab\") " 
pod="openshift-marketplace/redhat-marketplace-52bm4" Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.088173 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc4ck\" (UniqueName: \"kubernetes.io/projected/95b0d7ed-60bd-467a-a1b4-86f2c32096ab-kube-api-access-zc4ck\") pod \"redhat-marketplace-52bm4\" (UID: \"95b0d7ed-60bd-467a-a1b4-86f2c32096ab\") " pod="openshift-marketplace/redhat-marketplace-52bm4" Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.347488 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-52bm4" Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.388836 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.388925 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.389331 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.389350 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 
10.217.0.23:8080: connect: connection refused" Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.470619 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.521295 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73","Type":"ContainerStarted","Data":"e3ed85ee754bd67cb0a4c4564aaf29ce5ed90cc760aab8658117e1489ba62c0d"} Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.525526 4843 generic.go:334] "Generic (PLEG): container finished" podID="1da258ba-cd08-4cb9-90bb-18675d625fd1" containerID="c03c17b0a5cc07693a1d6178fc57c492bd0c9d00136f81b2cd4bc07238b6e1e3" exitCode=0 Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.525579 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" event={"ID":"1da258ba-cd08-4cb9-90bb-18675d625fd1","Type":"ContainerDied","Data":"c03c17b0a5cc07693a1d6178fc57c492bd0c9d00136f81b2cd4bc07238b6e1e3"} Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.549980 4843 generic.go:334] "Generic (PLEG): container finished" podID="061ecf07-8167-4652-8182-5779e5502bbf" containerID="9c74208936b3d5da5bbf785fd6df4ad4876d32e81b24a1586af5360b2f7e2bc4" exitCode=0 Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.550107 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" event={"ID":"061ecf07-8167-4652-8182-5779e5502bbf","Type":"ContainerDied","Data":"9c74208936b3d5da5bbf785fd6df4ad4876d32e81b24a1586af5360b2f7e2bc4"} Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.554084 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfg7x" 
event={"ID":"431329ed-c93d-4d44-bb49-ebed3a083f1a","Type":"ContainerStarted","Data":"f4345fc1eacef1bfb0aa089076a77a76a1704e5cccd4b32880640c99eeab4386"} Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.561865 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs5wj" event={"ID":"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa","Type":"ContainerStarted","Data":"b441ec1c8d0e83f2fc294282a146aebf545026f7692de6b04c6efb65c3fe80c8"} Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.564227 4843 generic.go:334] "Generic (PLEG): container finished" podID="b1442992-fe43-43be-a43b-48f80db66418" containerID="bcdf1634e918c197bd848d59680a4a1bd7ebec66d9798689e2470cd359e4a791" exitCode=0 Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.564312 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx" event={"ID":"b1442992-fe43-43be-a43b-48f80db66418","Type":"ContainerDied","Data":"bcdf1634e918c197bd848d59680a4a1bd7ebec66d9798689e2470cd359e4a791"} Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.568483 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcsxl" event={"ID":"3cef8c60-1bee-4798-9147-4d7f360ae4b2","Type":"ContainerStarted","Data":"d956f0fa72ebbed74079ef69b2db5051d1194160057bf61685b5a073ed89b3c9"} Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.569895 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9g67x"] Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.793550 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:13:46 crc kubenswrapper[4843]: [-]has-synced failed: reason withheld Mar 18 12:13:46 crc 
kubenswrapper[4843]: [+]process-running ok Mar 18 12:13:46 crc kubenswrapper[4843]: healthz check failed Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.793952 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.821565 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t62l2"] Mar 18 12:13:46 crc kubenswrapper[4843]: I0318 12:13:46.878079 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.050019 4843 ???:1] "http: TLS handshake error from 192.168.126.11:42624: no serving certificate available for the kubelet" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.050560 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1da258ba-cd08-4cb9-90bb-18675d625fd1-client-ca\") pod \"1da258ba-cd08-4cb9-90bb-18675d625fd1\" (UID: \"1da258ba-cd08-4cb9-90bb-18675d625fd1\") " Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.050657 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1da258ba-cd08-4cb9-90bb-18675d625fd1-config\") pod \"1da258ba-cd08-4cb9-90bb-18675d625fd1\" (UID: \"1da258ba-cd08-4cb9-90bb-18675d625fd1\") " Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.050756 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1da258ba-cd08-4cb9-90bb-18675d625fd1-serving-cert\") pod \"1da258ba-cd08-4cb9-90bb-18675d625fd1\" (UID: 
\"1da258ba-cd08-4cb9-90bb-18675d625fd1\") " Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.050829 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fjqr\" (UniqueName: \"kubernetes.io/projected/1da258ba-cd08-4cb9-90bb-18675d625fd1-kube-api-access-4fjqr\") pod \"1da258ba-cd08-4cb9-90bb-18675d625fd1\" (UID: \"1da258ba-cd08-4cb9-90bb-18675d625fd1\") " Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.051835 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1da258ba-cd08-4cb9-90bb-18675d625fd1-config" (OuterVolumeSpecName: "config") pod "1da258ba-cd08-4cb9-90bb-18675d625fd1" (UID: "1da258ba-cd08-4cb9-90bb-18675d625fd1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.052027 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1da258ba-cd08-4cb9-90bb-18675d625fd1-client-ca" (OuterVolumeSpecName: "client-ca") pod "1da258ba-cd08-4cb9-90bb-18675d625fd1" (UID: "1da258ba-cd08-4cb9-90bb-18675d625fd1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.103152 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1da258ba-cd08-4cb9-90bb-18675d625fd1-kube-api-access-4fjqr" (OuterVolumeSpecName: "kube-api-access-4fjqr") pod "1da258ba-cd08-4cb9-90bb-18675d625fd1" (UID: "1da258ba-cd08-4cb9-90bb-18675d625fd1"). InnerVolumeSpecName "kube-api-access-4fjqr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.171445 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fjqr\" (UniqueName: \"kubernetes.io/projected/1da258ba-cd08-4cb9-90bb-18675d625fd1-kube-api-access-4fjqr\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.171600 4843 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1da258ba-cd08-4cb9-90bb-18675d625fd1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.176334 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1da258ba-cd08-4cb9-90bb-18675d625fd1-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.176608 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1da258ba-cd08-4cb9-90bb-18675d625fd1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1da258ba-cd08-4cb9-90bb-18675d625fd1" (UID: "1da258ba-cd08-4cb9-90bb-18675d625fd1"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.194252 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.194300 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xp4kv" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.194325 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kg7xw" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.194348 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-brsfz" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.256302 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c"] Mar 18 12:13:47 crc kubenswrapper[4843]: E0318 12:13:47.256691 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da258ba-cd08-4cb9-90bb-18675d625fd1" containerName="route-controller-manager" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.256710 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da258ba-cd08-4cb9-90bb-18675d625fd1" containerName="route-controller-manager" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.256902 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da258ba-cd08-4cb9-90bb-18675d625fd1" containerName="route-controller-manager" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.257474 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.278156 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-config\") pod \"route-controller-manager-69b5466b65-pwv7c\" (UID: \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\") " pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.278443 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8p7f\" (UniqueName: \"kubernetes.io/projected/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-kube-api-access-v8p7f\") pod \"route-controller-manager-69b5466b65-pwv7c\" (UID: \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\") " pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.278724 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-serving-cert\") pod \"route-controller-manager-69b5466b65-pwv7c\" (UID: \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\") " pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.278858 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-client-ca\") pod \"route-controller-manager-69b5466b65-pwv7c\" (UID: \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\") " pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.279026 4843 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1da258ba-cd08-4cb9-90bb-18675d625fd1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.384424 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-config\") pod \"route-controller-manager-69b5466b65-pwv7c\" (UID: \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\") " pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.384643 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8p7f\" (UniqueName: \"kubernetes.io/projected/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-kube-api-access-v8p7f\") pod \"route-controller-manager-69b5466b65-pwv7c\" (UID: \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\") " pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.384695 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-serving-cert\") pod \"route-controller-manager-69b5466b65-pwv7c\" (UID: \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\") " pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.384716 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-client-ca\") pod \"route-controller-manager-69b5466b65-pwv7c\" (UID: \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\") " pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.386104 
4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-client-ca\") pod \"route-controller-manager-69b5466b65-pwv7c\" (UID: \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\") " pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.635898 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8p7f\" (UniqueName: \"kubernetes.io/projected/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-kube-api-access-v8p7f\") pod \"route-controller-manager-69b5466b65-pwv7c\" (UID: \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\") " pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.638387 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-config\") pod \"route-controller-manager-69b5466b65-pwv7c\" (UID: \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\") " pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.647772 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-serving-cert\") pod \"route-controller-manager-69b5466b65-pwv7c\" (UID: \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\") " pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.649718 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c"] Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.758941 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.784958 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sn986" event={"ID":"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c","Type":"ContainerStarted","Data":"8fcef0d78f79034268bc6d42e8853fa5bee8f94d2c6b3c66300c0b2d9c46f30a"} Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.793752 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:13:47 crc kubenswrapper[4843]: [-]has-synced failed: reason withheld Mar 18 12:13:47 crc kubenswrapper[4843]: [+]process-running ok Mar 18 12:13:47 crc kubenswrapper[4843]: healthz check failed Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.793800 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.794970 4843 generic.go:334] "Generic (PLEG): container finished" podID="431329ed-c93d-4d44-bb49-ebed3a083f1a" containerID="36322762d4c9361b7defc234d8f0c2ef1e90e49e143b2c1c30da89b0c1937d0b" exitCode=0 Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.795102 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfg7x" event={"ID":"431329ed-c93d-4d44-bb49-ebed3a083f1a","Type":"ContainerDied","Data":"36322762d4c9361b7defc234d8f0c2ef1e90e49e143b2c1c30da89b0c1937d0b"} Mar 18 12:13:47 crc kubenswrapper[4843]: I0318 12:13:47.804894 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.108961 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73","Type":"ContainerStarted","Data":"61a4cc002e1560605153f2e774fcd7f47e2f38245c15f5234c3452e6c56b0e6e"} Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.132235 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" event={"ID":"13295157-4a57-4e7f-9ff4-13c2a4381c27","Type":"ContainerStarted","Data":"941d3851100d62a06548ff236679ceefd6db270e5856456a99912e60241302c1"} Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.133223 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.159768 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" event={"ID":"1da258ba-cd08-4cb9-90bb-18675d625fd1","Type":"ContainerDied","Data":"d44e7664994e9cbe0c971f8b7e75a01aaa5eb1e52661a150ee2d459bb0c3547d"} Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.159827 4843 scope.go:117] "RemoveContainer" containerID="c03c17b0a5cc07693a1d6178fc57c492bd0c9d00136f81b2cd4bc07238b6e1e3" Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.159962 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj" Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.172160 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t62l2" event={"ID":"ffad2911-2fae-4030-a3ad-d36a5f95fc07","Type":"ContainerStarted","Data":"c810d230efbe57b5b85506c28282397425602a35cbf86d7794d21d0df1aac1c0"} Mar 18 12:13:48 crc kubenswrapper[4843]: W0318 12:13:48.191886 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9082af15_0a2d_4006_a24c_211de4dd2a73.slice/crio-124da7dcf369a9546db8658c631f90c8b487cad04bb0f16d3f1e55cf64077d9e WatchSource:0}: Error finding container 124da7dcf369a9546db8658c631f90c8b487cad04bb0f16d3f1e55cf64077d9e: Status 404 returned error can't find the container with id 124da7dcf369a9546db8658c631f90c8b487cad04bb0f16d3f1e55cf64077d9e Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.316624 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dgr8z"] Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.434790 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2p2b"] Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.459890 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" podStartSLOduration=198.459864176 podStartE2EDuration="3m18.459864176s" podCreationTimestamp="2026-03-18 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:48.454284135 +0000 UTC m=+262.170109659" watchObservedRunningTime="2026-03-18 12:13:48.459864176 +0000 UTC m=+262.175689700" Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.542864 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-52bm4"] Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.552320 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=7.55229904 podStartE2EDuration="7.55229904s" podCreationTimestamp="2026-03-18 12:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:48.546839182 +0000 UTC m=+262.262664706" watchObservedRunningTime="2026-03-18 12:13:48.55229904 +0000 UTC m=+262.268124564" Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.571422 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mrwd9"] Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.649600 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.667018 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj"] Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.668913 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7n8zj"] Mar 18 12:13:48 crc kubenswrapper[4843]: W0318 12:13:48.719423 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7aa438f5_fa8b_43f7_93b7_67e592ea698c.slice/crio-be65cb05c3eed2feb6571f468eb09586652a93c779f4894d07f32893f50bb666 WatchSource:0}: Error finding container be65cb05c3eed2feb6571f468eb09586652a93c779f4894d07f32893f50bb666: Status 404 returned error can't find the container with id be65cb05c3eed2feb6571f468eb09586652a93c779f4894d07f32893f50bb666 Mar 18 12:13:48 crc 
kubenswrapper[4843]: I0318 12:13:48.782437 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c"] Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.792905 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx" Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.793469 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:13:48 crc kubenswrapper[4843]: [-]has-synced failed: reason withheld Mar 18 12:13:48 crc kubenswrapper[4843]: [+]process-running ok Mar 18 12:13:48 crc kubenswrapper[4843]: healthz check failed Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.793504 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.819324 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/061ecf07-8167-4652-8182-5779e5502bbf-proxy-ca-bundles\") pod \"061ecf07-8167-4652-8182-5779e5502bbf\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.820380 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/061ecf07-8167-4652-8182-5779e5502bbf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "061ecf07-8167-4652-8182-5779e5502bbf" (UID: "061ecf07-8167-4652-8182-5779e5502bbf"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.819630 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/061ecf07-8167-4652-8182-5779e5502bbf-config\") pod \"061ecf07-8167-4652-8182-5779e5502bbf\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.820444 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/061ecf07-8167-4652-8182-5779e5502bbf-config" (OuterVolumeSpecName: "config") pod "061ecf07-8167-4652-8182-5779e5502bbf" (UID: "061ecf07-8167-4652-8182-5779e5502bbf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.820524 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/061ecf07-8167-4652-8182-5779e5502bbf-client-ca\") pod \"061ecf07-8167-4652-8182-5779e5502bbf\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.820584 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqq7l\" (UniqueName: \"kubernetes.io/projected/061ecf07-8167-4652-8182-5779e5502bbf-kube-api-access-vqq7l\") pod \"061ecf07-8167-4652-8182-5779e5502bbf\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.820616 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/061ecf07-8167-4652-8182-5779e5502bbf-serving-cert\") pod \"061ecf07-8167-4652-8182-5779e5502bbf\" (UID: \"061ecf07-8167-4652-8182-5779e5502bbf\") " Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.821185 4843 reconciler_common.go:293] "Volume detached for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/061ecf07-8167-4652-8182-5779e5502bbf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.821208 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/061ecf07-8167-4652-8182-5779e5502bbf-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.822591 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/061ecf07-8167-4652-8182-5779e5502bbf-client-ca" (OuterVolumeSpecName: "client-ca") pod "061ecf07-8167-4652-8182-5779e5502bbf" (UID: "061ecf07-8167-4652-8182-5779e5502bbf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.827611 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/061ecf07-8167-4652-8182-5779e5502bbf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "061ecf07-8167-4652-8182-5779e5502bbf" (UID: "061ecf07-8167-4652-8182-5779e5502bbf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.829021 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/061ecf07-8167-4652-8182-5779e5502bbf-kube-api-access-vqq7l" (OuterVolumeSpecName: "kube-api-access-vqq7l") pod "061ecf07-8167-4652-8182-5779e5502bbf" (UID: "061ecf07-8167-4652-8182-5779e5502bbf"). InnerVolumeSpecName "kube-api-access-vqq7l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.921683 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1442992-fe43-43be-a43b-48f80db66418-config-volume\") pod \"b1442992-fe43-43be-a43b-48f80db66418\" (UID: \"b1442992-fe43-43be-a43b-48f80db66418\") " Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.921745 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xnkn\" (UniqueName: \"kubernetes.io/projected/b1442992-fe43-43be-a43b-48f80db66418-kube-api-access-5xnkn\") pod \"b1442992-fe43-43be-a43b-48f80db66418\" (UID: \"b1442992-fe43-43be-a43b-48f80db66418\") " Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.921903 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b1442992-fe43-43be-a43b-48f80db66418-secret-volume\") pod \"b1442992-fe43-43be-a43b-48f80db66418\" (UID: \"b1442992-fe43-43be-a43b-48f80db66418\") " Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.922193 4843 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/061ecf07-8167-4652-8182-5779e5502bbf-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.922212 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqq7l\" (UniqueName: \"kubernetes.io/projected/061ecf07-8167-4652-8182-5779e5502bbf-kube-api-access-vqq7l\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.922226 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/061ecf07-8167-4652-8182-5779e5502bbf-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:48 crc kubenswrapper[4843]: I0318 12:13:48.922962 4843 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1442992-fe43-43be-a43b-48f80db66418-config-volume" (OuterVolumeSpecName: "config-volume") pod "b1442992-fe43-43be-a43b-48f80db66418" (UID: "b1442992-fe43-43be-a43b-48f80db66418"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.027637 4843 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1442992-fe43-43be-a43b-48f80db66418-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.065964 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1da258ba-cd08-4cb9-90bb-18675d625fd1" path="/var/lib/kubelet/pods/1da258ba-cd08-4cb9-90bb-18675d625fd1/volumes" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.105940 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1442992-fe43-43be-a43b-48f80db66418-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b1442992-fe43-43be-a43b-48f80db66418" (UID: "b1442992-fe43-43be-a43b-48f80db66418"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.128599 4843 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b1442992-fe43-43be-a43b-48f80db66418-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.201552 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1442992-fe43-43be-a43b-48f80db66418-kube-api-access-5xnkn" (OuterVolumeSpecName: "kube-api-access-5xnkn") pod "b1442992-fe43-43be-a43b-48f80db66418" (UID: "b1442992-fe43-43be-a43b-48f80db66418"). InnerVolumeSpecName "kube-api-access-5xnkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.229434 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.229442 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx" event={"ID":"b1442992-fe43-43be-a43b-48f80db66418","Type":"ContainerDied","Data":"a866923e25a96bc5766c265e0ad47ed03548ba10818e5a220304049ff7f0b5d3"} Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.229492 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a866923e25a96bc5766c265e0ad47ed03548ba10818e5a220304049ff7f0b5d3" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.230459 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xnkn\" (UniqueName: \"kubernetes.io/projected/b1442992-fe43-43be-a43b-48f80db66418-kube-api-access-5xnkn\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.232377 4843 generic.go:334] "Generic (PLEG): container finished" podID="3cef8c60-1bee-4798-9147-4d7f360ae4b2" containerID="be74ec7f4cdc3f4dbb14e146f7c29360a43a82bcf7f7ff5f935c417a497c53b2" exitCode=0 Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.232466 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcsxl" event={"ID":"3cef8c60-1bee-4798-9147-4d7f360ae4b2","Type":"ContainerDied","Data":"be74ec7f4cdc3f4dbb14e146f7c29360a43a82bcf7f7ff5f935c417a497c53b2"} Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.254623 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sn986" 
event={"ID":"62e4aeb8-cdaa-44e9-8c3a-a7da67ca037c","Type":"ContainerStarted","Data":"2483ef289566ed04254a515b3644fa3ac2716958bfebf10c22d8e64fdaa073b4"} Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.301930 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrwd9" event={"ID":"7aa438f5-fa8b-43f7-93b7-67e592ea698c","Type":"ContainerStarted","Data":"be65cb05c3eed2feb6571f468eb09586652a93c779f4894d07f32893f50bb666"} Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.399201 4843 generic.go:334] "Generic (PLEG): container finished" podID="0dc0843b-8f5c-434e-986e-3aab182caad3" containerID="4697ff225f5b17311de3b67b15466aa056abc008617715defe9df7c22e7fc06c" exitCode=0 Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.399799 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2p2b" event={"ID":"0dc0843b-8f5c-434e-986e-3aab182caad3","Type":"ContainerDied","Data":"4697ff225f5b17311de3b67b15466aa056abc008617715defe9df7c22e7fc06c"} Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.399934 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2p2b" event={"ID":"0dc0843b-8f5c-434e-986e-3aab182caad3","Type":"ContainerStarted","Data":"271a34b99b3b195fe59f39c3cc71821b24ccfa1d415a7271277350604398fe6e"} Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.411701 4843 generic.go:334] "Generic (PLEG): container finished" podID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" containerID="71e453afbd09fc8b0f42a5ebd24e7bfe39d37bb2f7182a38d72b253b91b3c210" exitCode=0 Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.411860 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t62l2" event={"ID":"ffad2911-2fae-4030-a3ad-d36a5f95fc07","Type":"ContainerDied","Data":"71e453afbd09fc8b0f42a5ebd24e7bfe39d37bb2f7182a38d72b253b91b3c210"} Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 
12:13:49.425067 4843 generic.go:334] "Generic (PLEG): container finished" podID="93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa" containerID="73d0c20023e3fdaa8b51f84bc1b1a2ebcb64c55fef243ef0d5300b17b1d8f0e1" exitCode=0 Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.425188 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs5wj" event={"ID":"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa","Type":"ContainerDied","Data":"73d0c20023e3fdaa8b51f84bc1b1a2ebcb64c55fef243ef0d5300b17b1d8f0e1"} Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.465068 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7d97898b4d-tqmxb"] Mar 18 12:13:49 crc kubenswrapper[4843]: E0318 12:13:49.465456 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="061ecf07-8167-4652-8182-5779e5502bbf" containerName="controller-manager" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.465475 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="061ecf07-8167-4652-8182-5779e5502bbf" containerName="controller-manager" Mar 18 12:13:49 crc kubenswrapper[4843]: E0318 12:13:49.466133 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1442992-fe43-43be-a43b-48f80db66418" containerName="collect-profiles" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.466146 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1442992-fe43-43be-a43b-48f80db66418" containerName="collect-profiles" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.466342 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1442992-fe43-43be-a43b-48f80db66418" containerName="collect-profiles" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.466588 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="061ecf07-8167-4652-8182-5779e5502bbf" containerName="controller-manager" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.486688 4843 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-sn986" podStartSLOduration=200.486637008 podStartE2EDuration="3m20.486637008s" podCreationTimestamp="2026-03-18 12:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:49.474113987 +0000 UTC m=+263.189939521" watchObservedRunningTime="2026-03-18 12:13:49.486637008 +0000 UTC m=+263.202462532" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.488376 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52bm4" event={"ID":"95b0d7ed-60bd-467a-a1b4-86f2c32096ab","Type":"ContainerStarted","Data":"7836da3f8aca48fd4cca7333f3b44430842c9618822020ea23eb7bf226156589"} Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.488632 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.489805 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" event={"ID":"061ecf07-8167-4652-8182-5779e5502bbf","Type":"ContainerDied","Data":"acea307aa487831f35c2e60a269b10c3dbb17a5ab334de85876da1d32083fc33"} Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.489845 4843 scope.go:117] "RemoveContainer" containerID="9c74208936b3d5da5bbf785fd6df4ad4876d32e81b24a1586af5360b2f7e2bc4" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.489855 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-g7fkz" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.515200 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9082af15-0a2d-4006-a24c-211de4dd2a73","Type":"ContainerStarted","Data":"6666e2973e33d26f8f4bb8f596619a580ac3ddc055845808a39c27e655dd9d9e"} Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.515253 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9082af15-0a2d-4006-a24c-211de4dd2a73","Type":"ContainerStarted","Data":"124da7dcf369a9546db8658c631f90c8b487cad04bb0f16d3f1e55cf64077d9e"} Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.517801 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d97898b4d-tqmxb"] Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.517849 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" event={"ID":"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1","Type":"ContainerStarted","Data":"b0b18dbee18652ec5ad5bfbf58a624a3f43b6cf4e728393dd5b2b9336fc221c5"} Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.525978 4843 generic.go:334] "Generic (PLEG): container finished" podID="feb1dd70-302b-4217-b17c-211aea971073" containerID="3e36dbb19f5b18461a56a84bb106487e92925feaafbe6b7c685b98a88b1c3d5b" exitCode=0 Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.526642 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dgr8z" event={"ID":"feb1dd70-302b-4217-b17c-211aea971073","Type":"ContainerDied","Data":"3e36dbb19f5b18461a56a84bb106487e92925feaafbe6b7c685b98a88b1c3d5b"} Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.526779 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-dgr8z" event={"ID":"feb1dd70-302b-4217-b17c-211aea971073","Type":"ContainerStarted","Data":"bf0d10651cf85304026ad33c5deebf391ff15abde3e28ab8bcf5071834b865e7"} Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.539513 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" event={"ID":"13295157-4a57-4e7f-9ff4-13c2a4381c27","Type":"ContainerStarted","Data":"40553dc05de5155b32f1542c3e39c5a553f8008987939eabcf80cbf72ad9c002"} Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.594918 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-client-ca\") pod \"controller-manager-7d97898b4d-tqmxb\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.594992 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s7jz\" (UniqueName: \"kubernetes.io/projected/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-kube-api-access-2s7jz\") pod \"controller-manager-7d97898b4d-tqmxb\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.595368 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-config\") pod \"controller-manager-7d97898b4d-tqmxb\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.596367 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-serving-cert\") pod \"controller-manager-7d97898b4d-tqmxb\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.596434 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-proxy-ca-bundles\") pod \"controller-manager-7d97898b4d-tqmxb\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.612193 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=5.612173956 podStartE2EDuration="5.612173956s" podCreationTimestamp="2026-03-18 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:49.61162627 +0000 UTC m=+263.327451794" watchObservedRunningTime="2026-03-18 12:13:49.612173956 +0000 UTC m=+263.327999480" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.698814 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-serving-cert\") pod \"controller-manager-7d97898b4d-tqmxb\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.698913 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-proxy-ca-bundles\") pod 
\"controller-manager-7d97898b4d-tqmxb\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.699101 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-client-ca\") pod \"controller-manager-7d97898b4d-tqmxb\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.699155 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s7jz\" (UniqueName: \"kubernetes.io/projected/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-kube-api-access-2s7jz\") pod \"controller-manager-7d97898b4d-tqmxb\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.699256 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-config\") pod \"controller-manager-7d97898b4d-tqmxb\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.705767 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-client-ca\") pod \"controller-manager-7d97898b4d-tqmxb\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.731066 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-serving-cert\") pod \"controller-manager-7d97898b4d-tqmxb\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb"
Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.735125 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-config\") pod \"controller-manager-7d97898b4d-tqmxb\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb"
Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.741995 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-proxy-ca-bundles\") pod \"controller-manager-7d97898b4d-tqmxb\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb"
Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.744639 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-g7fkz"]
Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.752796 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-g7fkz"]
Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.767877 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s7jz\" (UniqueName: \"kubernetes.io/projected/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-kube-api-access-2s7jz\") pod \"controller-manager-7d97898b4d-tqmxb\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb"
Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.891220 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb"
Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.892962 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 12:13:49 crc kubenswrapper[4843]: [-]has-synced failed: reason withheld
Mar 18 12:13:49 crc kubenswrapper[4843]: [+]process-running ok
Mar 18 12:13:49 crc kubenswrapper[4843]: healthz check failed
Mar 18 12:13:49 crc kubenswrapper[4843]: I0318 12:13:49.892991 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 12:13:50 crc kubenswrapper[4843]: I0318 12:13:50.035372 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:13:50 crc kubenswrapper[4843]: I0318 12:13:50.035429 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:13:50 crc kubenswrapper[4843]: I0318 12:13:50.560690 4843 generic.go:334] "Generic (PLEG): container finished" podID="9082af15-0a2d-4006-a24c-211de4dd2a73" containerID="6666e2973e33d26f8f4bb8f596619a580ac3ddc055845808a39c27e655dd9d9e" exitCode=0
Mar 18 12:13:50 crc kubenswrapper[4843]: I0318 12:13:50.560773 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9082af15-0a2d-4006-a24c-211de4dd2a73","Type":"ContainerDied","Data":"6666e2973e33d26f8f4bb8f596619a580ac3ddc055845808a39c27e655dd9d9e"}
Mar 18 12:13:50 crc kubenswrapper[4843]: I0318 12:13:50.564476 4843 generic.go:334] "Generic (PLEG): container finished" podID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" containerID="7e43f0dd207ab0a15dbc5c3e5b7d992a7fcce8194846826e81d2935102b4b930" exitCode=0
Mar 18 12:13:50 crc kubenswrapper[4843]: I0318 12:13:50.564561 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52bm4" event={"ID":"95b0d7ed-60bd-467a-a1b4-86f2c32096ab","Type":"ContainerDied","Data":"7e43f0dd207ab0a15dbc5c3e5b7d992a7fcce8194846826e81d2935102b4b930"}
Mar 18 12:13:50 crc kubenswrapper[4843]: I0318 12:13:50.568480 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" event={"ID":"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1","Type":"ContainerStarted","Data":"8f7cbd9e5a70a34e09ff95964ad4daccf0d788696d03868c4e43158d08d41891"}
Mar 18 12:13:50 crc kubenswrapper[4843]: I0318 12:13:50.569004 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c"
Mar 18 12:13:50 crc kubenswrapper[4843]: I0318 12:13:50.571797 4843 generic.go:334] "Generic (PLEG): container finished" podID="8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73" containerID="61a4cc002e1560605153f2e774fcd7f47e2f38245c15f5234c3452e6c56b0e6e" exitCode=0
Mar 18 12:13:50 crc kubenswrapper[4843]: I0318 12:13:50.571900 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73","Type":"ContainerDied","Data":"61a4cc002e1560605153f2e774fcd7f47e2f38245c15f5234c3452e6c56b0e6e"}
Mar 18 12:13:50 crc kubenswrapper[4843]: I0318 12:13:50.581204 4843 generic.go:334] "Generic (PLEG): container finished" podID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" containerID="297bf8b562fbb8821a53dd8001ec896cd26fdf2dae429dfa2d511194f3361d8d" exitCode=0
Mar 18 12:13:50 crc kubenswrapper[4843]: I0318 12:13:50.581441 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrwd9" event={"ID":"7aa438f5-fa8b-43f7-93b7-67e592ea698c","Type":"ContainerDied","Data":"297bf8b562fbb8821a53dd8001ec896cd26fdf2dae429dfa2d511194f3361d8d"}
Mar 18 12:13:50 crc kubenswrapper[4843]: I0318 12:13:50.797964 4843 patch_prober.go:28] interesting pod/router-default-5444994796-csz7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 12:13:50 crc kubenswrapper[4843]: [+]has-synced ok
Mar 18 12:13:50 crc kubenswrapper[4843]: [+]process-running ok
Mar 18 12:13:50 crc kubenswrapper[4843]: healthz check failed
Mar 18 12:13:50 crc kubenswrapper[4843]: I0318 12:13:50.799622 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-csz7z" podUID="9d8e49ec-9849-46fe-9a4b-6bbec11b2736" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 12:13:50 crc kubenswrapper[4843]: I0318 12:13:50.888733 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" podStartSLOduration=5.888714478 podStartE2EDuration="5.888714478s" podCreationTimestamp="2026-03-18 12:13:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:50.791431424 +0000 UTC m=+264.507256948" watchObservedRunningTime="2026-03-18 12:13:50.888714478 +0000 UTC m=+264.604540002"
Mar 18 12:13:51 crc kubenswrapper[4843]: I0318 12:13:51.073922 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="061ecf07-8167-4652-8182-5779e5502bbf" path="/var/lib/kubelet/pods/061ecf07-8167-4652-8182-5779e5502bbf/volumes"
Mar 18 12:13:51 crc kubenswrapper[4843]: I0318 12:13:51.074909 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c"
Mar 18 12:13:51 crc kubenswrapper[4843]: I0318 12:13:51.700863 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d97898b4d-tqmxb"]
Mar 18 12:13:51 crc kubenswrapper[4843]: I0318 12:13:51.965903 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-csz7z"
Mar 18 12:13:52 crc kubenswrapper[4843]: I0318 12:13:52.012341 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-csz7z"
Mar 18 12:13:52 crc kubenswrapper[4843]: I0318 12:13:52.817396 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" event={"ID":"9e80c536-1b7a-41dd-af6c-5eecc484dd6d","Type":"ContainerStarted","Data":"ddd5f073d9861331788e65868fec01c17918a6f5b63226c9b908a531eb36c045"}
Mar 18 12:13:52 crc kubenswrapper[4843]: I0318 12:13:52.817425 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" event={"ID":"9e80c536-1b7a-41dd-af6c-5eecc484dd6d","Type":"ContainerStarted","Data":"cfcbb529fce3178a462a273511e669498b723b1d347ee30c56aa7c6c266c0d21"}
Mar 18 12:13:52 crc kubenswrapper[4843]: I0318 12:13:52.818126 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb"
Mar 18 12:13:52 crc kubenswrapper[4843]: I0318 12:13:52.828518 4843 patch_prober.go:28] interesting pod/controller-manager-7d97898b4d-tqmxb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body=
Mar 18 12:13:52 crc kubenswrapper[4843]: I0318 12:13:52.828594 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" podUID="9e80c536-1b7a-41dd-af6c-5eecc484dd6d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused"
Mar 18 12:13:52 crc kubenswrapper[4843]: I0318 12:13:52.858873 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" podStartSLOduration=7.8588494 podStartE2EDuration="7.8588494s" podCreationTimestamp="2026-03-18 12:13:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:52.848150281 +0000 UTC m=+266.563975805" watchObservedRunningTime="2026-03-18 12:13:52.8588494 +0000 UTC m=+266.574675104"
Mar 18 12:13:53 crc kubenswrapper[4843]: I0318 12:13:53.006474 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 18 12:13:53 crc kubenswrapper[4843]: I0318 12:13:53.149822 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 12:13:53 crc kubenswrapper[4843]: I0318 12:13:53.153123 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9082af15-0a2d-4006-a24c-211de4dd2a73-kube-api-access\") pod \"9082af15-0a2d-4006-a24c-211de4dd2a73\" (UID: \"9082af15-0a2d-4006-a24c-211de4dd2a73\") "
Mar 18 12:13:53 crc kubenswrapper[4843]: I0318 12:13:53.153216 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9082af15-0a2d-4006-a24c-211de4dd2a73-kubelet-dir\") pod \"9082af15-0a2d-4006-a24c-211de4dd2a73\" (UID: \"9082af15-0a2d-4006-a24c-211de4dd2a73\") "
Mar 18 12:13:53 crc kubenswrapper[4843]: I0318 12:13:53.153907 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9082af15-0a2d-4006-a24c-211de4dd2a73-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9082af15-0a2d-4006-a24c-211de4dd2a73" (UID: "9082af15-0a2d-4006-a24c-211de4dd2a73"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:13:53 crc kubenswrapper[4843]: I0318 12:13:53.162225 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9082af15-0a2d-4006-a24c-211de4dd2a73-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9082af15-0a2d-4006-a24c-211de4dd2a73" (UID: "9082af15-0a2d-4006-a24c-211de4dd2a73"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:13:53 crc kubenswrapper[4843]: I0318 12:13:53.354814 4843 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9082af15-0a2d-4006-a24c-211de4dd2a73-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 18 12:13:53 crc kubenswrapper[4843]: I0318 12:13:53.354847 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9082af15-0a2d-4006-a24c-211de4dd2a73-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 12:13:53 crc kubenswrapper[4843]: I0318 12:13:53.459264 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73-kube-api-access\") pod \"8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73\" (UID: \"8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73\") "
Mar 18 12:13:53 crc kubenswrapper[4843]: I0318 12:13:53.459335 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73-kubelet-dir\") pod \"8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73\" (UID: \"8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73\") "
Mar 18 12:13:53 crc kubenswrapper[4843]: I0318 12:13:53.459899 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73" (UID: "8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:13:53 crc kubenswrapper[4843]: I0318 12:13:53.483328 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73" (UID: "8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:13:53 crc kubenswrapper[4843]: I0318 12:13:53.603031 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 12:13:53 crc kubenswrapper[4843]: I0318 12:13:53.603069 4843 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 18 12:13:54 crc kubenswrapper[4843]: I0318 12:13:54.231162 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9082af15-0a2d-4006-a24c-211de4dd2a73","Type":"ContainerDied","Data":"124da7dcf369a9546db8658c631f90c8b487cad04bb0f16d3f1e55cf64077d9e"}
Mar 18 12:13:54 crc kubenswrapper[4843]: I0318 12:13:54.231225 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="124da7dcf369a9546db8658c631f90c8b487cad04bb0f16d3f1e55cf64077d9e"
Mar 18 12:13:54 crc kubenswrapper[4843]: I0318 12:13:54.231321 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 18 12:13:54 crc kubenswrapper[4843]: I0318 12:13:54.369799 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73","Type":"ContainerDied","Data":"e3ed85ee754bd67cb0a4c4564aaf29ce5ed90cc760aab8658117e1489ba62c0d"}
Mar 18 12:13:54 crc kubenswrapper[4843]: I0318 12:13:54.369866 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3ed85ee754bd67cb0a4c4564aaf29ce5ed90cc760aab8658117e1489ba62c0d"
Mar 18 12:13:54 crc kubenswrapper[4843]: I0318 12:13:54.369886 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 12:13:54 crc kubenswrapper[4843]: I0318 12:13:54.384865 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb"
Mar 18 12:13:55 crc kubenswrapper[4843]: I0318 12:13:55.329725 4843 ???:1] "http: TLS handshake error from 192.168.126.11:51368: no serving certificate available for the kubelet"
Mar 18 12:13:56 crc kubenswrapper[4843]: I0318 12:13:56.000101 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-ppqpl"
Mar 18 12:13:56 crc kubenswrapper[4843]: I0318 12:13:56.005683 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-ppqpl"
Mar 18 12:13:56 crc kubenswrapper[4843]: I0318 12:13:56.207573 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body=
Mar 18 12:13:56 crc kubenswrapper[4843]: I0318 12:13:56.207677 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body=
Mar 18 12:13:56 crc kubenswrapper[4843]: I0318 12:13:56.207776 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused"
Mar 18 12:13:56 crc kubenswrapper[4843]: I0318 12:13:56.207848 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-8lfcz"
Mar 18 12:13:56 crc kubenswrapper[4843]: I0318 12:13:56.207676 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused"
Mar 18 12:13:56 crc kubenswrapper[4843]: I0318 12:13:56.208673 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body=
Mar 18 12:13:56 crc kubenswrapper[4843]: I0318 12:13:56.208697 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused"
Mar 18 12:13:56 crc kubenswrapper[4843]: I0318 12:13:56.210328 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"747400a427647f29da7773f37c6dae5b34a9e4b2b0e7d45da2591f2ec708955a"} pod="openshift-console/downloads-7954f5f757-8lfcz" containerMessage="Container download-server failed liveness probe, will be restarted"
Mar 18 12:13:56 crc kubenswrapper[4843]: I0318 12:13:56.210392 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" containerID="cri-o://747400a427647f29da7773f37c6dae5b34a9e4b2b0e7d45da2591f2ec708955a" gracePeriod=2
Mar 18 12:13:56 crc kubenswrapper[4843]: I0318 12:13:56.582058 4843 generic.go:334] "Generic (PLEG): container finished" podID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerID="747400a427647f29da7773f37c6dae5b34a9e4b2b0e7d45da2591f2ec708955a" exitCode=0
Mar 18 12:13:56 crc kubenswrapper[4843]: I0318 12:13:56.582531 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8lfcz" event={"ID":"dc56f3ce-caf3-4d53-9cf3-d909ec3edd16","Type":"ContainerDied","Data":"747400a427647f29da7773f37c6dae5b34a9e4b2b0e7d45da2591f2ec708955a"}
Mar 18 12:13:57 crc kubenswrapper[4843]: I0318 12:13:57.594115 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8lfcz" event={"ID":"dc56f3ce-caf3-4d53-9cf3-d909ec3edd16","Type":"ContainerStarted","Data":"b98997f63dc1d0419f0d9c36f269826159cf0b0153cd569869e1bf145fc88548"}
Mar 18 12:13:57 crc kubenswrapper[4843]: I0318 12:13:57.595742 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8lfcz"
Mar 18 12:13:57 crc kubenswrapper[4843]: I0318 12:13:57.688920 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body=
Mar 18 12:13:57 crc kubenswrapper[4843]: I0318 12:13:57.688981 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused"
Mar 18 12:13:58 crc kubenswrapper[4843]: I0318 12:13:58.610041 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body=
Mar 18 12:13:58 crc kubenswrapper[4843]: I0318 12:13:58.610091 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused"
Mar 18 12:13:59 crc kubenswrapper[4843]: I0318 12:13:59.682298 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body=
Mar 18 12:13:59 crc kubenswrapper[4843]: I0318 12:13:59.682727 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused"
Mar 18 12:14:00 crc kubenswrapper[4843]: I0318 12:14:00.135035 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563934-9j5hk"]
Mar 18 12:14:00 crc kubenswrapper[4843]: E0318 12:14:00.135319 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9082af15-0a2d-4006-a24c-211de4dd2a73" containerName="pruner"
Mar 18 12:14:00 crc kubenswrapper[4843]: I0318 12:14:00.135339 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="9082af15-0a2d-4006-a24c-211de4dd2a73" containerName="pruner"
Mar 18 12:14:00 crc kubenswrapper[4843]: E0318 12:14:00.135361 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73" containerName="pruner"
Mar 18 12:14:00 crc kubenswrapper[4843]: I0318 12:14:00.135369 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73" containerName="pruner"
Mar 18 12:14:00 crc kubenswrapper[4843]: I0318 12:14:00.135484 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cdb7c14-d1fe-4081-aca5-3a5c8cd59c73" containerName="pruner"
Mar 18 12:14:00 crc kubenswrapper[4843]: I0318 12:14:00.135500 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="9082af15-0a2d-4006-a24c-211de4dd2a73" containerName="pruner"
Mar 18 12:14:00 crc kubenswrapper[4843]: I0318 12:14:00.136013 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563934-9j5hk"
Mar 18 12:14:00 crc kubenswrapper[4843]: I0318 12:14:00.143366 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk"
Mar 18 12:14:00 crc kubenswrapper[4843]: I0318 12:14:00.186908 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563934-9j5hk"]
Mar 18 12:14:00 crc kubenswrapper[4843]: I0318 12:14:00.533038 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb5nn\" (UniqueName: \"kubernetes.io/projected/4f32540f-28b9-46d1-a943-f04368d4cae2-kube-api-access-fb5nn\") pod \"auto-csr-approver-29563934-9j5hk\" (UID: \"4f32540f-28b9-46d1-a943-f04368d4cae2\") " pod="openshift-infra/auto-csr-approver-29563934-9j5hk"
Mar 18 12:14:00 crc kubenswrapper[4843]: I0318 12:14:00.662421 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb5nn\" (UniqueName: \"kubernetes.io/projected/4f32540f-28b9-46d1-a943-f04368d4cae2-kube-api-access-fb5nn\") pod \"auto-csr-approver-29563934-9j5hk\" (UID: \"4f32540f-28b9-46d1-a943-f04368d4cae2\") " pod="openshift-infra/auto-csr-approver-29563934-9j5hk"
Mar 18 12:14:00 crc kubenswrapper[4843]: I0318 12:14:00.688530 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb5nn\" (UniqueName: \"kubernetes.io/projected/4f32540f-28b9-46d1-a943-f04368d4cae2-kube-api-access-fb5nn\") pod \"auto-csr-approver-29563934-9j5hk\" (UID: \"4f32540f-28b9-46d1-a943-f04368d4cae2\") " pod="openshift-infra/auto-csr-approver-29563934-9j5hk"
Mar 18 12:14:00 crc kubenswrapper[4843]: I0318 12:14:00.798751 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563934-9j5hk"
Mar 18 12:14:02 crc kubenswrapper[4843]: I0318 12:14:02.150840 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563934-9j5hk"]
Mar 18 12:14:02 crc kubenswrapper[4843]: I0318 12:14:02.199501 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7d97898b4d-tqmxb"]
Mar 18 12:14:02 crc kubenswrapper[4843]: I0318 12:14:02.199806 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" podUID="9e80c536-1b7a-41dd-af6c-5eecc484dd6d" containerName="controller-manager" containerID="cri-o://ddd5f073d9861331788e65868fec01c17918a6f5b63226c9b908a531eb36c045" gracePeriod=30
Mar 18 12:14:02 crc kubenswrapper[4843]: W0318 12:14:02.203380 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f32540f_28b9_46d1_a943_f04368d4cae2.slice/crio-38cacacff8e32f34521742741680d20cb23c272fe46917a15ee7fcabca071898 WatchSource:0}: Error finding container 38cacacff8e32f34521742741680d20cb23c272fe46917a15ee7fcabca071898: Status 404 returned error can't find the container with id 38cacacff8e32f34521742741680d20cb23c272fe46917a15ee7fcabca071898
Mar 18 12:14:02 crc kubenswrapper[4843]: I0318 12:14:02.261144 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c"]
Mar 18 12:14:02 crc kubenswrapper[4843]: I0318 12:14:02.261404 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" podUID="68fc3c3a-6864-4185-a7b5-1fb508f7bfe1" containerName="route-controller-manager" containerID="cri-o://8f7cbd9e5a70a34e09ff95964ad4daccf0d788696d03868c4e43158d08d41891" gracePeriod=30
Mar 18 12:14:02 crc kubenswrapper[4843]: I0318 12:14:02.770236 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563934-9j5hk" event={"ID":"4f32540f-28b9-46d1-a943-f04368d4cae2","Type":"ContainerStarted","Data":"38cacacff8e32f34521742741680d20cb23c272fe46917a15ee7fcabca071898"}
Mar 18 12:14:02 crc kubenswrapper[4843]: I0318 12:14:02.772701 4843 generic.go:334] "Generic (PLEG): container finished" podID="9e80c536-1b7a-41dd-af6c-5eecc484dd6d" containerID="ddd5f073d9861331788e65868fec01c17918a6f5b63226c9b908a531eb36c045" exitCode=0
Mar 18 12:14:02 crc kubenswrapper[4843]: I0318 12:14:02.772765 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" event={"ID":"9e80c536-1b7a-41dd-af6c-5eecc484dd6d","Type":"ContainerDied","Data":"ddd5f073d9861331788e65868fec01c17918a6f5b63226c9b908a531eb36c045"}
Mar 18 12:14:02 crc kubenswrapper[4843]: I0318 12:14:02.774574 4843 generic.go:334] "Generic (PLEG): container finished" podID="68fc3c3a-6864-4185-a7b5-1fb508f7bfe1" containerID="8f7cbd9e5a70a34e09ff95964ad4daccf0d788696d03868c4e43158d08d41891" exitCode=0
Mar 18 12:14:02 crc kubenswrapper[4843]: I0318 12:14:02.774604 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" event={"ID":"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1","Type":"ContainerDied","Data":"8f7cbd9e5a70a34e09ff95964ad4daccf0d788696d03868c4e43158d08d41891"}
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.689953 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.779544 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c"
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.832267 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f"]
Mar 18 12:14:04 crc kubenswrapper[4843]: E0318 12:14:04.835701 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68fc3c3a-6864-4185-a7b5-1fb508f7bfe1" containerName="route-controller-manager"
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.835814 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="68fc3c3a-6864-4185-a7b5-1fb508f7bfe1" containerName="route-controller-manager"
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.836050 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="68fc3c3a-6864-4185-a7b5-1fb508f7bfe1" containerName="route-controller-manager"
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.836723 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f"
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.844307 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f"]
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.893735 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c" event={"ID":"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1","Type":"ContainerDied","Data":"b0b18dbee18652ec5ad5bfbf58a624a3f43b6cf4e728393dd5b2b9336fc221c5"}
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.893797 4843 scope.go:117] "RemoveContainer" containerID="8f7cbd9e5a70a34e09ff95964ad4daccf0d788696d03868c4e43158d08d41891"
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.893944 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c"
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.985492 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8p7f\" (UniqueName: \"kubernetes.io/projected/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-kube-api-access-v8p7f\") pod \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\" (UID: \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\") "
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.985760 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-client-ca\") pod \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\" (UID: \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\") "
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.985820 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-serving-cert\") pod \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\" (UID: \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\") "
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.985843 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-config\") pod \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\" (UID: \"68fc3c3a-6864-4185-a7b5-1fb508f7bfe1\") "
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.986020 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4f52adb-8c9a-4044-a040-e907b51a89fd-serving-cert\") pod \"route-controller-manager-5f65587dd-ptg5f\" (UID: \"d4f52adb-8c9a-4044-a040-e907b51a89fd\") " pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f"
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.986044 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4f52adb-8c9a-4044-a040-e907b51a89fd-config\") pod \"route-controller-manager-5f65587dd-ptg5f\" (UID: \"d4f52adb-8c9a-4044-a040-e907b51a89fd\") " pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f"
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.986085 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4f52adb-8c9a-4044-a040-e907b51a89fd-client-ca\") pod \"route-controller-manager-5f65587dd-ptg5f\" (UID: \"d4f52adb-8c9a-4044-a040-e907b51a89fd\") " pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f"
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.986108 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwrqk\" (UniqueName: \"kubernetes.io/projected/d4f52adb-8c9a-4044-a040-e907b51a89fd-kube-api-access-qwrqk\") pod \"route-controller-manager-5f65587dd-ptg5f\" (UID: \"d4f52adb-8c9a-4044-a040-e907b51a89fd\") " pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f"
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.986937 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-config" (OuterVolumeSpecName: "config") pod "68fc3c3a-6864-4185-a7b5-1fb508f7bfe1" (UID: "68fc3c3a-6864-4185-a7b5-1fb508f7bfe1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:14:04 crc kubenswrapper[4843]: I0318 12:14:04.987859 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-client-ca" (OuterVolumeSpecName: "client-ca") pod "68fc3c3a-6864-4185-a7b5-1fb508f7bfe1" (UID: "68fc3c3a-6864-4185-a7b5-1fb508f7bfe1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.034371 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "68fc3c3a-6864-4185-a7b5-1fb508f7bfe1" (UID: "68fc3c3a-6864-4185-a7b5-1fb508f7bfe1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.037078 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-kube-api-access-v8p7f" (OuterVolumeSpecName: "kube-api-access-v8p7f") pod "68fc3c3a-6864-4185-a7b5-1fb508f7bfe1" (UID: "68fc3c3a-6864-4185-a7b5-1fb508f7bfe1"). InnerVolumeSpecName "kube-api-access-v8p7f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.059734 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb"
Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.221180 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwrqk\" (UniqueName: \"kubernetes.io/projected/d4f52adb-8c9a-4044-a040-e907b51a89fd-kube-api-access-qwrqk\") pod \"route-controller-manager-5f65587dd-ptg5f\" (UID: \"d4f52adb-8c9a-4044-a040-e907b51a89fd\") " pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f"
Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.221353 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4f52adb-8c9a-4044-a040-e907b51a89fd-serving-cert\") pod \"route-controller-manager-5f65587dd-ptg5f\" (UID: \"d4f52adb-8c9a-4044-a040-e907b51a89fd\") " pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f"
Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.221387 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4f52adb-8c9a-4044-a040-e907b51a89fd-config\") pod \"route-controller-manager-5f65587dd-ptg5f\" (UID: \"d4f52adb-8c9a-4044-a040-e907b51a89fd\") " pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f"
Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.221444 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4f52adb-8c9a-4044-a040-e907b51a89fd-client-ca\") pod \"route-controller-manager-5f65587dd-ptg5f\" (UID: \"d4f52adb-8c9a-4044-a040-e907b51a89fd\") "
pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f" Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.221898 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8p7f\" (UniqueName: \"kubernetes.io/projected/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-kube-api-access-v8p7f\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.221940 4843 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.221955 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.221966 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.222532 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4f52adb-8c9a-4044-a040-e907b51a89fd-client-ca\") pod \"route-controller-manager-5f65587dd-ptg5f\" (UID: \"d4f52adb-8c9a-4044-a040-e907b51a89fd\") " pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f" Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.223954 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4f52adb-8c9a-4044-a040-e907b51a89fd-config\") pod \"route-controller-manager-5f65587dd-ptg5f\" (UID: \"d4f52adb-8c9a-4044-a040-e907b51a89fd\") " pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f" Mar 18 12:14:05 crc 
kubenswrapper[4843]: I0318 12:14:05.232283 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4f52adb-8c9a-4044-a040-e907b51a89fd-serving-cert\") pod \"route-controller-manager-5f65587dd-ptg5f\" (UID: \"d4f52adb-8c9a-4044-a040-e907b51a89fd\") " pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f" Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.243985 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwrqk\" (UniqueName: \"kubernetes.io/projected/d4f52adb-8c9a-4044-a040-e907b51a89fd-kube-api-access-qwrqk\") pod \"route-controller-manager-5f65587dd-ptg5f\" (UID: \"d4f52adb-8c9a-4044-a040-e907b51a89fd\") " pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f" Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.257315 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c"] Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.262630 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69b5466b65-pwv7c"] Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.322842 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s7jz\" (UniqueName: \"kubernetes.io/projected/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-kube-api-access-2s7jz\") pod \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.322964 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-config\") pod \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " Mar 18 12:14:05 crc kubenswrapper[4843]: 
I0318 12:14:05.323155 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-client-ca\") pod \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.323233 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-proxy-ca-bundles\") pod \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.323267 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-serving-cert\") pod \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\" (UID: \"9e80c536-1b7a-41dd-af6c-5eecc484dd6d\") " Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.324193 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-config" (OuterVolumeSpecName: "config") pod "9e80c536-1b7a-41dd-af6c-5eecc484dd6d" (UID: "9e80c536-1b7a-41dd-af6c-5eecc484dd6d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.324719 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-client-ca" (OuterVolumeSpecName: "client-ca") pod "9e80c536-1b7a-41dd-af6c-5eecc484dd6d" (UID: "9e80c536-1b7a-41dd-af6c-5eecc484dd6d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.325111 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9e80c536-1b7a-41dd-af6c-5eecc484dd6d" (UID: "9e80c536-1b7a-41dd-af6c-5eecc484dd6d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.327817 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9e80c536-1b7a-41dd-af6c-5eecc484dd6d" (UID: "9e80c536-1b7a-41dd-af6c-5eecc484dd6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.331564 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-kube-api-access-2s7jz" (OuterVolumeSpecName: "kube-api-access-2s7jz") pod "9e80c536-1b7a-41dd-af6c-5eecc484dd6d" (UID: "9e80c536-1b7a-41dd-af6c-5eecc484dd6d"). InnerVolumeSpecName "kube-api-access-2s7jz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.467677 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.467718 4843 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.467736 4843 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.467754 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.467771 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s7jz\" (UniqueName: \"kubernetes.io/projected/9e80c536-1b7a-41dd-af6c-5eecc484dd6d-kube-api-access-2s7jz\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:05 crc kubenswrapper[4843]: I0318 12:14:05.472378 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f" Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:05.961320 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" event={"ID":"9e80c536-1b7a-41dd-af6c-5eecc484dd6d","Type":"ContainerDied","Data":"cfcbb529fce3178a462a273511e669498b723b1d347ee30c56aa7c6c266c0d21"} Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:05.961641 4843 scope.go:117] "RemoveContainer" containerID="ddd5f073d9861331788e65868fec01c17918a6f5b63226c9b908a531eb36c045" Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:05.961754 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d97898b4d-tqmxb" Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.133125 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.133203 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.140118 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7d97898b4d-tqmxb"] Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.146545 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7d97898b4d-tqmxb"] Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.146703 4843 
patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.146747 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.782617 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f"] Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.857929 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5bb49457bf-jnhf6"] Mar 18 12:14:06 crc kubenswrapper[4843]: E0318 12:14:06.858207 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e80c536-1b7a-41dd-af6c-5eecc484dd6d" containerName="controller-manager" Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.858228 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e80c536-1b7a-41dd-af6c-5eecc484dd6d" containerName="controller-manager" Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.858366 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e80c536-1b7a-41dd-af6c-5eecc484dd6d" containerName="controller-manager" Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.859112 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.862595 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.863455 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.863522 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.863616 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.864831 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.865371 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.868928 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bb49457bf-jnhf6"] Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.871522 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.930175 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-client-ca\") pod \"controller-manager-5bb49457bf-jnhf6\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " 
pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.930255 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-serving-cert\") pod \"controller-manager-5bb49457bf-jnhf6\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.930291 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-proxy-ca-bundles\") pod \"controller-manager-5bb49457bf-jnhf6\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" Mar 18 12:14:06 crc kubenswrapper[4843]: I0318 12:14:06.930312 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-config\") pod \"controller-manager-5bb49457bf-jnhf6\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" Mar 18 12:14:07 crc kubenswrapper[4843]: I0318 12:14:07.016109 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68fc3c3a-6864-4185-a7b5-1fb508f7bfe1" path="/var/lib/kubelet/pods/68fc3c3a-6864-4185-a7b5-1fb508f7bfe1/volumes" Mar 18 12:14:07 crc kubenswrapper[4843]: I0318 12:14:07.017003 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e80c536-1b7a-41dd-af6c-5eecc484dd6d" path="/var/lib/kubelet/pods/9e80c536-1b7a-41dd-af6c-5eecc484dd6d/volumes" Mar 18 12:14:07 crc kubenswrapper[4843]: I0318 12:14:07.032269 4843 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-client-ca\") pod \"controller-manager-5bb49457bf-jnhf6\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" Mar 18 12:14:07 crc kubenswrapper[4843]: I0318 12:14:07.033147 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbdh4\" (UniqueName: \"kubernetes.io/projected/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-kube-api-access-zbdh4\") pod \"controller-manager-5bb49457bf-jnhf6\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" Mar 18 12:14:07 crc kubenswrapper[4843]: I0318 12:14:07.033101 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-client-ca\") pod \"controller-manager-5bb49457bf-jnhf6\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" Mar 18 12:14:07 crc kubenswrapper[4843]: I0318 12:14:07.033249 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-serving-cert\") pod \"controller-manager-5bb49457bf-jnhf6\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" Mar 18 12:14:07 crc kubenswrapper[4843]: I0318 12:14:07.033963 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-proxy-ca-bundles\") pod \"controller-manager-5bb49457bf-jnhf6\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" Mar 18 
12:14:07 crc kubenswrapper[4843]: I0318 12:14:07.033992 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-config\") pod \"controller-manager-5bb49457bf-jnhf6\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" Mar 18 12:14:07 crc kubenswrapper[4843]: I0318 12:14:07.035067 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-config\") pod \"controller-manager-5bb49457bf-jnhf6\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" Mar 18 12:14:07 crc kubenswrapper[4843]: I0318 12:14:07.035713 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-proxy-ca-bundles\") pod \"controller-manager-5bb49457bf-jnhf6\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" Mar 18 12:14:07 crc kubenswrapper[4843]: I0318 12:14:07.060948 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-serving-cert\") pod \"controller-manager-5bb49457bf-jnhf6\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" Mar 18 12:14:07 crc kubenswrapper[4843]: I0318 12:14:07.135538 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbdh4\" (UniqueName: \"kubernetes.io/projected/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-kube-api-access-zbdh4\") pod \"controller-manager-5bb49457bf-jnhf6\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " 
pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" Mar 18 12:14:07 crc kubenswrapper[4843]: I0318 12:14:07.193334 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbdh4\" (UniqueName: \"kubernetes.io/projected/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-kube-api-access-zbdh4\") pod \"controller-manager-5bb49457bf-jnhf6\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" Mar 18 12:14:07 crc kubenswrapper[4843]: I0318 12:14:07.578463 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" Mar 18 12:14:17 crc kubenswrapper[4843]: I0318 12:14:16.157296 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 18 12:14:17 crc kubenswrapper[4843]: I0318 12:14:16.157943 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 18 12:14:17 crc kubenswrapper[4843]: I0318 12:14:16.157318 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 18 12:14:17 crc kubenswrapper[4843]: I0318 12:14:16.158226 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 18 12:14:17 crc kubenswrapper[4843]: I0318 12:14:16.953551 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dzg6s" Mar 18 12:14:17 crc kubenswrapper[4843]: I0318 12:14:17.911018 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 12:14:17 crc kubenswrapper[4843]: I0318 12:14:17.912130 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:14:17 crc kubenswrapper[4843]: I0318 12:14:17.918117 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 12:14:17 crc kubenswrapper[4843]: I0318 12:14:17.919298 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 12:14:17 crc kubenswrapper[4843]: I0318 12:14:17.919678 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 12:14:18 crc kubenswrapper[4843]: I0318 12:14:18.007906 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fb45905-6d0b-4397-8f3c-e0aad1a88609-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6fb45905-6d0b-4397-8f3c-e0aad1a88609\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:14:18 crc kubenswrapper[4843]: I0318 12:14:18.008007 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fb45905-6d0b-4397-8f3c-e0aad1a88609-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6fb45905-6d0b-4397-8f3c-e0aad1a88609\") 
" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:14:18 crc kubenswrapper[4843]: I0318 12:14:18.184200 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fb45905-6d0b-4397-8f3c-e0aad1a88609-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6fb45905-6d0b-4397-8f3c-e0aad1a88609\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:14:18 crc kubenswrapper[4843]: I0318 12:14:18.184289 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fb45905-6d0b-4397-8f3c-e0aad1a88609-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6fb45905-6d0b-4397-8f3c-e0aad1a88609\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:14:18 crc kubenswrapper[4843]: I0318 12:14:18.185520 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fb45905-6d0b-4397-8f3c-e0aad1a88609-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6fb45905-6d0b-4397-8f3c-e0aad1a88609\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:14:18 crc kubenswrapper[4843]: I0318 12:14:18.214397 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fb45905-6d0b-4397-8f3c-e0aad1a88609-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6fb45905-6d0b-4397-8f3c-e0aad1a88609\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:14:18 crc kubenswrapper[4843]: I0318 12:14:18.293413 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:14:20 crc kubenswrapper[4843]: I0318 12:14:20.086522 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:14:20 crc kubenswrapper[4843]: I0318 12:14:20.086923 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:14:20 crc kubenswrapper[4843]: I0318 12:14:20.087004 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:14:20 crc kubenswrapper[4843]: I0318 12:14:20.087595 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df70830e0f112a5a3efc115651"} pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:14:20 crc kubenswrapper[4843]: I0318 12:14:20.087669 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" containerID="cri-o://6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df70830e0f112a5a3efc115651" gracePeriod=600 Mar 18 12:14:21 crc kubenswrapper[4843]: I0318 12:14:21.286686 4843 generic.go:334] "Generic (PLEG): container finished" 
podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df70830e0f112a5a3efc115651" exitCode=0 Mar 18 12:14:21 crc kubenswrapper[4843]: I0318 12:14:21.286755 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df70830e0f112a5a3efc115651"} Mar 18 12:14:21 crc kubenswrapper[4843]: I0318 12:14:21.966547 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 12:14:21 crc kubenswrapper[4843]: I0318 12:14:21.967355 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:21 crc kubenswrapper[4843]: I0318 12:14:21.983101 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 12:14:22 crc kubenswrapper[4843]: I0318 12:14:22.061988 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ec08b84-0b49-4fa8-932e-2b18dc727e06-kube-api-access\") pod \"installer-9-crc\" (UID: \"1ec08b84-0b49-4fa8-932e-2b18dc727e06\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:22 crc kubenswrapper[4843]: I0318 12:14:22.062232 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ec08b84-0b49-4fa8-932e-2b18dc727e06-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1ec08b84-0b49-4fa8-932e-2b18dc727e06\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:22 crc kubenswrapper[4843]: I0318 12:14:22.062348 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/1ec08b84-0b49-4fa8-932e-2b18dc727e06-var-lock\") pod \"installer-9-crc\" (UID: \"1ec08b84-0b49-4fa8-932e-2b18dc727e06\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:22 crc kubenswrapper[4843]: I0318 12:14:22.067725 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bb49457bf-jnhf6"] Mar 18 12:14:22 crc kubenswrapper[4843]: I0318 12:14:22.163021 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1ec08b84-0b49-4fa8-932e-2b18dc727e06-var-lock\") pod \"installer-9-crc\" (UID: \"1ec08b84-0b49-4fa8-932e-2b18dc727e06\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:22 crc kubenswrapper[4843]: I0318 12:14:22.163109 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ec08b84-0b49-4fa8-932e-2b18dc727e06-kube-api-access\") pod \"installer-9-crc\" (UID: \"1ec08b84-0b49-4fa8-932e-2b18dc727e06\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:22 crc kubenswrapper[4843]: I0318 12:14:22.163187 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1ec08b84-0b49-4fa8-932e-2b18dc727e06-var-lock\") pod \"installer-9-crc\" (UID: \"1ec08b84-0b49-4fa8-932e-2b18dc727e06\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:22 crc kubenswrapper[4843]: I0318 12:14:22.163210 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ec08b84-0b49-4fa8-932e-2b18dc727e06-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1ec08b84-0b49-4fa8-932e-2b18dc727e06\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:22 crc kubenswrapper[4843]: I0318 12:14:22.163275 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ec08b84-0b49-4fa8-932e-2b18dc727e06-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1ec08b84-0b49-4fa8-932e-2b18dc727e06\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:22 crc kubenswrapper[4843]: I0318 12:14:22.183639 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ec08b84-0b49-4fa8-932e-2b18dc727e06-kube-api-access\") pod \"installer-9-crc\" (UID: \"1ec08b84-0b49-4fa8-932e-2b18dc727e06\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:22 crc kubenswrapper[4843]: I0318 12:14:22.247913 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f"] Mar 18 12:14:22 crc kubenswrapper[4843]: I0318 12:14:22.362818 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:25 crc kubenswrapper[4843]: I0318 12:14:25.528256 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f" event={"ID":"d4f52adb-8c9a-4044-a040-e907b51a89fd","Type":"ContainerStarted","Data":"484ae232ec6dd06c6af748d131238fa22fb3a2e354abed9036934b07695e835a"} Mar 18 12:14:26 crc kubenswrapper[4843]: I0318 12:14:26.125416 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 18 12:14:26 crc kubenswrapper[4843]: I0318 12:14:26.125740 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 
10.217.0.23:8080: connect: connection refused" Mar 18 12:14:26 crc kubenswrapper[4843]: I0318 12:14:26.126053 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 18 12:14:26 crc kubenswrapper[4843]: I0318 12:14:26.126081 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 18 12:14:26 crc kubenswrapper[4843]: I0318 12:14:26.126178 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-8lfcz" Mar 18 12:14:26 crc kubenswrapper[4843]: I0318 12:14:26.126592 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"b98997f63dc1d0419f0d9c36f269826159cf0b0153cd569869e1bf145fc88548"} pod="openshift-console/downloads-7954f5f757-8lfcz" containerMessage="Container download-server failed liveness probe, will be restarted" Mar 18 12:14:26 crc kubenswrapper[4843]: I0318 12:14:26.126618 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" containerID="cri-o://b98997f63dc1d0419f0d9c36f269826159cf0b0153cd569869e1bf145fc88548" gracePeriod=2 Mar 18 12:14:26 crc kubenswrapper[4843]: I0318 12:14:26.127643 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: 
connect: connection refused" start-of-body= Mar 18 12:14:26 crc kubenswrapper[4843]: I0318 12:14:26.127685 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 18 12:14:26 crc kubenswrapper[4843]: I0318 12:14:26.538262 4843 generic.go:334] "Generic (PLEG): container finished" podID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerID="b98997f63dc1d0419f0d9c36f269826159cf0b0153cd569869e1bf145fc88548" exitCode=0 Mar 18 12:14:26 crc kubenswrapper[4843]: I0318 12:14:26.538336 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8lfcz" event={"ID":"dc56f3ce-caf3-4d53-9cf3-d909ec3edd16","Type":"ContainerDied","Data":"b98997f63dc1d0419f0d9c36f269826159cf0b0153cd569869e1bf145fc88548"} Mar 18 12:14:26 crc kubenswrapper[4843]: I0318 12:14:26.538397 4843 scope.go:117] "RemoveContainer" containerID="747400a427647f29da7773f37c6dae5b34a9e4b2b0e7d45da2591f2ec708955a" Mar 18 12:14:28 crc kubenswrapper[4843]: E0318 12:14:28.914324 4843 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Mar 18 12:14:36 crc kubenswrapper[4843]: I0318 12:14:36.098308 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 18 12:14:36 crc kubenswrapper[4843]: I0318 12:14:36.098963 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 18 12:14:36 crc kubenswrapper[4843]: I0318 12:14:36.372040 4843 ???:1] "http: TLS handshake error from 192.168.126.11:39768: no serving certificate available for the kubelet" Mar 18 12:14:40 crc kubenswrapper[4843]: E0318 12:14:40.944972 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 18 12:14:40 crc kubenswrapper[4843]: E0318 12:14:40.945364 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zmf2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:n
il,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-xs5wj_openshift-marketplace(93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:14:40 crc kubenswrapper[4843]: E0318 12:14:40.946549 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-xs5wj" podUID="93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa" Mar 18 12:14:46 crc kubenswrapper[4843]: I0318 12:14:46.097781 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 18 12:14:46 crc kubenswrapper[4843]: I0318 12:14:46.098129 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 18 12:14:50 crc kubenswrapper[4843]: E0318 12:14:50.175178 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 12:14:50 crc kubenswrapper[4843]: E0318 
12:14:50.175688 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4lwgq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-dgr8z_openshift-marketplace(feb1dd70-302b-4217-b17c-211aea971073): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:14:50 crc kubenswrapper[4843]: E0318 12:14:50.176838 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-dgr8z" podUID="feb1dd70-302b-4217-b17c-211aea971073" Mar 18 12:14:52 crc kubenswrapper[4843]: E0318 12:14:52.063293 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-dgr8z" podUID="feb1dd70-302b-4217-b17c-211aea971073" Mar 18 12:14:52 crc kubenswrapper[4843]: E0318 12:14:52.427811 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 18 12:14:52 crc kubenswrapper[4843]: E0318 12:14:52.427961 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-42kbw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-t62l2_openshift-marketplace(ffad2911-2fae-4030-a3ad-d36a5f95fc07): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:14:52 crc kubenswrapper[4843]: E0318 12:14:52.429100 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-t62l2" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" Mar 18 12:14:54 crc 
kubenswrapper[4843]: E0318 12:14:54.209077 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-t62l2" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" Mar 18 12:14:54 crc kubenswrapper[4843]: E0318 12:14:54.608071 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 12:14:54 crc kubenswrapper[4843]: E0318 12:14:54.608269 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zc4ck,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-52bm4_openshift-marketplace(95b0d7ed-60bd-467a-a1b4-86f2c32096ab): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:14:54 crc kubenswrapper[4843]: E0318 12:14:54.609475 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-52bm4" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" Mar 18 12:14:54 crc 
kubenswrapper[4843]: E0318 12:14:54.705471 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-52bm4" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" Mar 18 12:14:54 crc kubenswrapper[4843]: E0318 12:14:54.789611 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 12:14:54 crc kubenswrapper[4843]: E0318 12:14:54.789980 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 12:14:54 crc kubenswrapper[4843]: E0318 12:14:54.790024 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vv5gs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-mrwd9_openshift-marketplace(7aa438f5-fa8b-43f7-93b7-67e592ea698c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:14:54 crc kubenswrapper[4843]: E0318 12:14:54.790120 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bt446,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-j2p2b_openshift-marketplace(0dc0843b-8f5c-434e-986e-3aab182caad3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:14:54 crc kubenswrapper[4843]: E0318 12:14:54.791278 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-mrwd9" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" Mar 18 12:14:54 crc 
kubenswrapper[4843]: E0318 12:14:54.791314 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-j2p2b" podUID="0dc0843b-8f5c-434e-986e-3aab182caad3" Mar 18 12:14:54 crc kubenswrapper[4843]: E0318 12:14:54.823116 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 18 12:14:54 crc kubenswrapper[4843]: E0318 12:14:54.823274 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9dp4b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-lcsxl_openshift-marketplace(3cef8c60-1bee-4798-9147-4d7f360ae4b2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:14:54 crc kubenswrapper[4843]: E0318 12:14:54.824437 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-lcsxl" podUID="3cef8c60-1bee-4798-9147-4d7f360ae4b2" Mar 18 12:14:54 crc 
kubenswrapper[4843]: E0318 12:14:54.848697 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 18 12:14:54 crc kubenswrapper[4843]: E0318 12:14:54.849208 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hh8qh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-rfg7x_openshift-marketplace(431329ed-c93d-4d44-bb49-ebed3a083f1a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:14:54 crc kubenswrapper[4843]: E0318 12:14:54.850633 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-rfg7x" podUID="431329ed-c93d-4d44-bb49-ebed3a083f1a" Mar 18 12:14:55 crc kubenswrapper[4843]: E0318 12:14:55.750590 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-mrwd9" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" Mar 18 12:14:55 crc kubenswrapper[4843]: E0318 12:14:55.751549 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-rfg7x" podUID="431329ed-c93d-4d44-bb49-ebed3a083f1a" Mar 18 12:14:55 crc kubenswrapper[4843]: E0318 12:14:55.751851 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-j2p2b" podUID="0dc0843b-8f5c-434e-986e-3aab182caad3" Mar 18 12:14:55 crc kubenswrapper[4843]: E0318 12:14:55.779068 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from 
manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 18 12:14:55 crc kubenswrapper[4843]: E0318 12:14:55.779295 4843 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:14:55 crc kubenswrapper[4843]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 18 12:14:55 crc kubenswrapper[4843]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tcqm7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29563932-bzf8p_openshift-infra(803876de-64f6-4347-8ea5-6d2d8f87e828): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 18 12:14:55 crc kubenswrapper[4843]: > logger="UnhandledError" Mar 18 12:14:55 crc kubenswrapper[4843]: E0318 12:14:55.780379 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29563932-bzf8p" podUID="803876de-64f6-4347-8ea5-6d2d8f87e828" Mar 18 12:14:55 crc 
kubenswrapper[4843]: E0318 12:14:55.804826 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 18 12:14:55 crc kubenswrapper[4843]: E0318 12:14:55.804978 4843 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:14:55 crc kubenswrapper[4843]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 18 12:14:55 crc kubenswrapper[4843]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fb5nn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29563934-9j5hk_openshift-infra(4f32540f-28b9-46d1-a943-f04368d4cae2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 18 12:14:55 crc kubenswrapper[4843]: > logger="UnhandledError" Mar 18 12:14:55 crc kubenswrapper[4843]: E0318 12:14:55.806512 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled\"" pod="openshift-infra/auto-csr-approver-29563934-9j5hk" podUID="4f32540f-28b9-46d1-a943-f04368d4cae2" Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.040967 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bb49457bf-jnhf6"] Mar 18 12:14:56 crc kubenswrapper[4843]: W0318 12:14:56.064268 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e3f316c_7c17_41b8_8a6b_ea894eeefea3.slice/crio-a8356641c878472c38c3b6b704ccda69fe6cb76edbf8803b1f69f727ea57444a WatchSource:0}: Error finding container a8356641c878472c38c3b6b704ccda69fe6cb76edbf8803b1f69f727ea57444a: Status 404 returned error can't find the container with id a8356641c878472c38c3b6b704ccda69fe6cb76edbf8803b1f69f727ea57444a Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.098207 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.098348 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.288143 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 12:14:56 crc kubenswrapper[4843]: W0318 12:14:56.291916 4843 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod6fb45905_6d0b_4397_8f3c_e0aad1a88609.slice/crio-3c8ffa72513380907e79fbfe74e6ae61b952526ce8dc849fd1e02916e24fad8f WatchSource:0}: Error finding container 3c8ffa72513380907e79fbfe74e6ae61b952526ce8dc849fd1e02916e24fad8f: Status 404 returned error can't find the container with id 3c8ffa72513380907e79fbfe74e6ae61b952526ce8dc849fd1e02916e24fad8f Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.326591 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 12:14:56 crc kubenswrapper[4843]: W0318 12:14:56.333492 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1ec08b84_0b49_4fa8_932e_2b18dc727e06.slice/crio-df650527a6cff843fc27e6913b44cc8af8ba92a7fbeefec81240fe9bd94d61db WatchSource:0}: Error finding container df650527a6cff843fc27e6913b44cc8af8ba92a7fbeefec81240fe9bd94d61db: Status 404 returned error can't find the container with id df650527a6cff843fc27e6913b44cc8af8ba92a7fbeefec81240fe9bd94d61db Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.695579 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"935c5c8bac02b5ebb6013b7620e469aeb3c950d7626ae59a60e0b050f1d51353"} Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.697852 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs5wj" event={"ID":"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa","Type":"ContainerStarted","Data":"a5f57d2b83168469ebb2c9704bc28b0006eb3b4a9aac5dee7a05aac0888cbf7a"} Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.702081 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f" 
event={"ID":"d4f52adb-8c9a-4044-a040-e907b51a89fd","Type":"ContainerStarted","Data":"bb776b55ec3057902e17cada531f3cf9c59141fc5d52856007929edc944b7a52"} Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.702118 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f" podUID="d4f52adb-8c9a-4044-a040-e907b51a89fd" containerName="route-controller-manager" containerID="cri-o://bb776b55ec3057902e17cada531f3cf9c59141fc5d52856007929edc944b7a52" gracePeriod=30 Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.702681 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f" Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.710055 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1ec08b84-0b49-4fa8-932e-2b18dc727e06","Type":"ContainerStarted","Data":"df650527a6cff843fc27e6913b44cc8af8ba92a7fbeefec81240fe9bd94d61db"} Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.711934 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8lfcz" event={"ID":"dc56f3ce-caf3-4d53-9cf3-d909ec3edd16","Type":"ContainerStarted","Data":"8de1e63e2b14ee9a7b0604ff12271ab7b1d04d5e782d0e473c050e9de4959f09"} Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.712411 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8lfcz" Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.712436 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.712535 4843 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.713721 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6fb45905-6d0b-4397-8f3c-e0aad1a88609","Type":"ContainerStarted","Data":"3c8ffa72513380907e79fbfe74e6ae61b952526ce8dc849fd1e02916e24fad8f"} Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.716257 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" podUID="7e3f316c-7c17-41b8-8a6b-ea894eeefea3" containerName="controller-manager" containerID="cri-o://6c64d3f76a150ca8db16cf025fc00e14cef4e7bdbfed83b3815a217d4f4ffe10" gracePeriod=30 Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.716637 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" event={"ID":"7e3f316c-7c17-41b8-8a6b-ea894eeefea3","Type":"ContainerStarted","Data":"6c64d3f76a150ca8db16cf025fc00e14cef4e7bdbfed83b3815a217d4f4ffe10"} Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.716771 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" event={"ID":"7e3f316c-7c17-41b8-8a6b-ea894eeefea3","Type":"ContainerStarted","Data":"a8356641c878472c38c3b6b704ccda69fe6cb76edbf8803b1f69f727ea57444a"} Mar 18 12:14:56 crc kubenswrapper[4843]: E0318 12:14:56.729797 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" 
pod="openshift-infra/auto-csr-approver-29563934-9j5hk" podUID="4f32540f-28b9-46d1-a943-f04368d4cae2" Mar 18 12:14:56 crc kubenswrapper[4843]: E0318 12:14:56.731830 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29563932-bzf8p" podUID="803876de-64f6-4347-8ea5-6d2d8f87e828" Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.732244 4843 patch_prober.go:28] interesting pod/route-controller-manager-5f65587dd-ptg5f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": read tcp 10.217.0.2:41804->10.217.0.60:8443: read: connection reset by peer" start-of-body= Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.732845 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f" podUID="d4f52adb-8c9a-4044-a040-e907b51a89fd" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": read tcp 10.217.0.2:41804->10.217.0.60:8443: read: connection reset by peer" Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.735066 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.777336 4843 patch_prober.go:28] interesting pod/controller-manager-5bb49457bf-jnhf6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": read tcp 10.217.0.2:40960->10.217.0.61:8443: read: connection reset by peer" start-of-body= Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.779308 4843 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" podUID="7e3f316c-7c17-41b8-8a6b-ea894eeefea3" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": read tcp 10.217.0.2:40960->10.217.0.61:8443: read: connection reset by peer" Mar 18 12:14:56 crc kubenswrapper[4843]: I0318 12:14:56.860132 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f" podStartSLOduration=54.860115893 podStartE2EDuration="54.860115893s" podCreationTimestamp="2026-03-18 12:14:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:14:56.859547557 +0000 UTC m=+330.575373111" watchObservedRunningTime="2026-03-18 12:14:56.860115893 +0000 UTC m=+330.575941417" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.005449 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" podStartSLOduration=55.005430054 podStartE2EDuration="55.005430054s" podCreationTimestamp="2026-03-18 12:14:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:14:56.883575974 +0000 UTC m=+330.599401498" watchObservedRunningTime="2026-03-18 12:14:57.005430054 +0000 UTC m=+330.721255578" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.131205 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-5f65587dd-ptg5f_d4f52adb-8c9a-4044-a040-e907b51a89fd/route-controller-manager/0.log" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.131496 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.137724 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.172979 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8"] Mar 18 12:14:57 crc kubenswrapper[4843]: E0318 12:14:57.173286 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e3f316c-7c17-41b8-8a6b-ea894eeefea3" containerName="controller-manager" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.173299 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e3f316c-7c17-41b8-8a6b-ea894eeefea3" containerName="controller-manager" Mar 18 12:14:57 crc kubenswrapper[4843]: E0318 12:14:57.173321 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4f52adb-8c9a-4044-a040-e907b51a89fd" containerName="route-controller-manager" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.173328 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4f52adb-8c9a-4044-a040-e907b51a89fd" containerName="route-controller-manager" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.173424 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4f52adb-8c9a-4044-a040-e907b51a89fd" containerName="route-controller-manager" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.173439 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e3f316c-7c17-41b8-8a6b-ea894eeefea3" containerName="controller-manager" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.173846 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.181178 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8"] Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.240056 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-serving-cert\") pod \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.240105 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbdh4\" (UniqueName: \"kubernetes.io/projected/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-kube-api-access-zbdh4\") pod \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.240166 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-client-ca\") pod \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.240195 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwrqk\" (UniqueName: \"kubernetes.io/projected/d4f52adb-8c9a-4044-a040-e907b51a89fd-kube-api-access-qwrqk\") pod \"d4f52adb-8c9a-4044-a040-e907b51a89fd\" (UID: \"d4f52adb-8c9a-4044-a040-e907b51a89fd\") " Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.240211 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-proxy-ca-bundles\") 
pod \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.240234 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4f52adb-8c9a-4044-a040-e907b51a89fd-config\") pod \"d4f52adb-8c9a-4044-a040-e907b51a89fd\" (UID: \"d4f52adb-8c9a-4044-a040-e907b51a89fd\") " Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.240262 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4f52adb-8c9a-4044-a040-e907b51a89fd-serving-cert\") pod \"d4f52adb-8c9a-4044-a040-e907b51a89fd\" (UID: \"d4f52adb-8c9a-4044-a040-e907b51a89fd\") " Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.240309 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-config\") pod \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\" (UID: \"7e3f316c-7c17-41b8-8a6b-ea894eeefea3\") " Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.240340 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4f52adb-8c9a-4044-a040-e907b51a89fd-client-ca\") pod \"d4f52adb-8c9a-4044-a040-e907b51a89fd\" (UID: \"d4f52adb-8c9a-4044-a040-e907b51a89fd\") " Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.241555 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7e3f316c-7c17-41b8-8a6b-ea894eeefea3" (UID: "7e3f316c-7c17-41b8-8a6b-ea894eeefea3"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.241703 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-client-ca" (OuterVolumeSpecName: "client-ca") pod "7e3f316c-7c17-41b8-8a6b-ea894eeefea3" (UID: "7e3f316c-7c17-41b8-8a6b-ea894eeefea3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.241953 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4f52adb-8c9a-4044-a040-e907b51a89fd-config" (OuterVolumeSpecName: "config") pod "d4f52adb-8c9a-4044-a040-e907b51a89fd" (UID: "d4f52adb-8c9a-4044-a040-e907b51a89fd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.242226 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-config" (OuterVolumeSpecName: "config") pod "7e3f316c-7c17-41b8-8a6b-ea894eeefea3" (UID: "7e3f316c-7c17-41b8-8a6b-ea894eeefea3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.242259 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4f52adb-8c9a-4044-a040-e907b51a89fd-client-ca" (OuterVolumeSpecName: "client-ca") pod "d4f52adb-8c9a-4044-a040-e907b51a89fd" (UID: "d4f52adb-8c9a-4044-a040-e907b51a89fd"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.246093 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4f52adb-8c9a-4044-a040-e907b51a89fd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d4f52adb-8c9a-4044-a040-e907b51a89fd" (UID: "d4f52adb-8c9a-4044-a040-e907b51a89fd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.246195 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7e3f316c-7c17-41b8-8a6b-ea894eeefea3" (UID: "7e3f316c-7c17-41b8-8a6b-ea894eeefea3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.246350 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4f52adb-8c9a-4044-a040-e907b51a89fd-kube-api-access-qwrqk" (OuterVolumeSpecName: "kube-api-access-qwrqk") pod "d4f52adb-8c9a-4044-a040-e907b51a89fd" (UID: "d4f52adb-8c9a-4044-a040-e907b51a89fd"). InnerVolumeSpecName "kube-api-access-qwrqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.246490 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-kube-api-access-zbdh4" (OuterVolumeSpecName: "kube-api-access-zbdh4") pod "7e3f316c-7c17-41b8-8a6b-ea894eeefea3" (UID: "7e3f316c-7c17-41b8-8a6b-ea894eeefea3"). InnerVolumeSpecName "kube-api-access-zbdh4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.342000 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c3f964-ba9f-418c-b916-c51034442ff4-config\") pod \"route-controller-manager-66b5d79dbf-b6rn8\" (UID: \"81c3f964-ba9f-418c-b916-c51034442ff4\") " pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.342388 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm72h\" (UniqueName: \"kubernetes.io/projected/81c3f964-ba9f-418c-b916-c51034442ff4-kube-api-access-rm72h\") pod \"route-controller-manager-66b5d79dbf-b6rn8\" (UID: \"81c3f964-ba9f-418c-b916-c51034442ff4\") " pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.342586 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/81c3f964-ba9f-418c-b916-c51034442ff4-client-ca\") pod \"route-controller-manager-66b5d79dbf-b6rn8\" (UID: \"81c3f964-ba9f-418c-b916-c51034442ff4\") " pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.342760 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81c3f964-ba9f-418c-b916-c51034442ff4-serving-cert\") pod \"route-controller-manager-66b5d79dbf-b6rn8\" (UID: \"81c3f964-ba9f-418c-b916-c51034442ff4\") " pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.342925 4843 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.343034 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbdh4\" (UniqueName: \"kubernetes.io/projected/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-kube-api-access-zbdh4\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.343139 4843 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.343220 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwrqk\" (UniqueName: \"kubernetes.io/projected/d4f52adb-8c9a-4044-a040-e907b51a89fd-kube-api-access-qwrqk\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.343305 4843 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.343408 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4f52adb-8c9a-4044-a040-e907b51a89fd-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.343513 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4f52adb-8c9a-4044-a040-e907b51a89fd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.343600 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3f316c-7c17-41b8-8a6b-ea894eeefea3-config\") on node \"crc\" DevicePath \"\"" 
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.343830 4843 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4f52adb-8c9a-4044-a040-e907b51a89fd-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.445809 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/81c3f964-ba9f-418c-b916-c51034442ff4-client-ca\") pod \"route-controller-manager-66b5d79dbf-b6rn8\" (UID: \"81c3f964-ba9f-418c-b916-c51034442ff4\") " pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.445892 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81c3f964-ba9f-418c-b916-c51034442ff4-serving-cert\") pod \"route-controller-manager-66b5d79dbf-b6rn8\" (UID: \"81c3f964-ba9f-418c-b916-c51034442ff4\") " pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.445952 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c3f964-ba9f-418c-b916-c51034442ff4-config\") pod \"route-controller-manager-66b5d79dbf-b6rn8\" (UID: \"81c3f964-ba9f-418c-b916-c51034442ff4\") " pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.445989 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm72h\" (UniqueName: \"kubernetes.io/projected/81c3f964-ba9f-418c-b916-c51034442ff4-kube-api-access-rm72h\") pod \"route-controller-manager-66b5d79dbf-b6rn8\" (UID: \"81c3f964-ba9f-418c-b916-c51034442ff4\") " pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.447436 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/81c3f964-ba9f-418c-b916-c51034442ff4-client-ca\") pod \"route-controller-manager-66b5d79dbf-b6rn8\" (UID: \"81c3f964-ba9f-418c-b916-c51034442ff4\") " pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.447588 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c3f964-ba9f-418c-b916-c51034442ff4-config\") pod \"route-controller-manager-66b5d79dbf-b6rn8\" (UID: \"81c3f964-ba9f-418c-b916-c51034442ff4\") " pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.450272 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81c3f964-ba9f-418c-b916-c51034442ff4-serving-cert\") pod \"route-controller-manager-66b5d79dbf-b6rn8\" (UID: \"81c3f964-ba9f-418c-b916-c51034442ff4\") " pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.466963 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm72h\" (UniqueName: \"kubernetes.io/projected/81c3f964-ba9f-418c-b916-c51034442ff4-kube-api-access-rm72h\") pod \"route-controller-manager-66b5d79dbf-b6rn8\" (UID: \"81c3f964-ba9f-418c-b916-c51034442ff4\") " pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.500921 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.691696 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8"]
Mar 18 12:14:57 crc kubenswrapper[4843]: W0318 12:14:57.700777 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81c3f964_ba9f_418c_b916_c51034442ff4.slice/crio-5ca63f19f7d5b838eaf72f84a24beac71b0aa428a61d4179fe700d277e62e2b7 WatchSource:0}: Error finding container 5ca63f19f7d5b838eaf72f84a24beac71b0aa428a61d4179fe700d277e62e2b7: Status 404 returned error can't find the container with id 5ca63f19f7d5b838eaf72f84a24beac71b0aa428a61d4179fe700d277e62e2b7
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.720985 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-5f65587dd-ptg5f_d4f52adb-8c9a-4044-a040-e907b51a89fd/route-controller-manager/0.log"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.721029 4843 generic.go:334] "Generic (PLEG): container finished" podID="d4f52adb-8c9a-4044-a040-e907b51a89fd" containerID="bb776b55ec3057902e17cada531f3cf9c59141fc5d52856007929edc944b7a52" exitCode=255
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.721080 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f" event={"ID":"d4f52adb-8c9a-4044-a040-e907b51a89fd","Type":"ContainerDied","Data":"bb776b55ec3057902e17cada531f3cf9c59141fc5d52856007929edc944b7a52"}
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.721110 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f" event={"ID":"d4f52adb-8c9a-4044-a040-e907b51a89fd","Type":"ContainerDied","Data":"484ae232ec6dd06c6af748d131238fa22fb3a2e354abed9036934b07695e835a"}
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.721125 4843 scope.go:117] "RemoveContainer" containerID="bb776b55ec3057902e17cada531f3cf9c59141fc5d52856007929edc944b7a52"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.721141 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.726755 4843 generic.go:334] "Generic (PLEG): container finished" podID="7e3f316c-7c17-41b8-8a6b-ea894eeefea3" containerID="6c64d3f76a150ca8db16cf025fc00e14cef4e7bdbfed83b3815a217d4f4ffe10" exitCode=0
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.726809 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.726815 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" event={"ID":"7e3f316c-7c17-41b8-8a6b-ea894eeefea3","Type":"ContainerDied","Data":"6c64d3f76a150ca8db16cf025fc00e14cef4e7bdbfed83b3815a217d4f4ffe10"}
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.726913 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bb49457bf-jnhf6" event={"ID":"7e3f316c-7c17-41b8-8a6b-ea894eeefea3","Type":"ContainerDied","Data":"a8356641c878472c38c3b6b704ccda69fe6cb76edbf8803b1f69f727ea57444a"}
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.735417 4843 generic.go:334] "Generic (PLEG): container finished" podID="93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa" containerID="a5f57d2b83168469ebb2c9704bc28b0006eb3b4a9aac5dee7a05aac0888cbf7a" exitCode=0
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.735463 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs5wj" event={"ID":"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa","Type":"ContainerDied","Data":"a5f57d2b83168469ebb2c9704bc28b0006eb3b4a9aac5dee7a05aac0888cbf7a"}
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.740231 4843 scope.go:117] "RemoveContainer" containerID="bb776b55ec3057902e17cada531f3cf9c59141fc5d52856007929edc944b7a52"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.740554 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1ec08b84-0b49-4fa8-932e-2b18dc727e06","Type":"ContainerStarted","Data":"c3a21dbdc1baa16ce652ee6f854ea6b04d4bbb6f0faad7cd2d74e6c15fcff402"}
Mar 18 12:14:57 crc kubenswrapper[4843]: E0318 12:14:57.741014 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb776b55ec3057902e17cada531f3cf9c59141fc5d52856007929edc944b7a52\": container with ID starting with bb776b55ec3057902e17cada531f3cf9c59141fc5d52856007929edc944b7a52 not found: ID does not exist" containerID="bb776b55ec3057902e17cada531f3cf9c59141fc5d52856007929edc944b7a52"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.741044 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb776b55ec3057902e17cada531f3cf9c59141fc5d52856007929edc944b7a52"} err="failed to get container status \"bb776b55ec3057902e17cada531f3cf9c59141fc5d52856007929edc944b7a52\": rpc error: code = NotFound desc = could not find container \"bb776b55ec3057902e17cada531f3cf9c59141fc5d52856007929edc944b7a52\": container with ID starting with bb776b55ec3057902e17cada531f3cf9c59141fc5d52856007929edc944b7a52 not found: ID does not exist"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.741069 4843 scope.go:117] "RemoveContainer" containerID="6c64d3f76a150ca8db16cf025fc00e14cef4e7bdbfed83b3815a217d4f4ffe10"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.744775 4843 generic.go:334] "Generic (PLEG): container finished" podID="6fb45905-6d0b-4397-8f3c-e0aad1a88609" containerID="4a1a9564d5bc21e1a2db699f94a05bde8e3c82370293698afa44c9940060bc75" exitCode=0
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.745185 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6fb45905-6d0b-4397-8f3c-e0aad1a88609","Type":"ContainerDied","Data":"4a1a9564d5bc21e1a2db699f94a05bde8e3c82370293698afa44c9940060bc75"}
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.746505 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8" event={"ID":"81c3f964-ba9f-418c-b916-c51034442ff4","Type":"ContainerStarted","Data":"5ca63f19f7d5b838eaf72f84a24beac71b0aa428a61d4179fe700d277e62e2b7"}
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.747548 4843 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lfcz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body=
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.747579 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lfcz" podUID="dc56f3ce-caf3-4d53-9cf3-d909ec3edd16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.761932 4843 scope.go:117] "RemoveContainer" containerID="6c64d3f76a150ca8db16cf025fc00e14cef4e7bdbfed83b3815a217d4f4ffe10"
Mar 18 12:14:57 crc kubenswrapper[4843]: E0318 12:14:57.762359 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c64d3f76a150ca8db16cf025fc00e14cef4e7bdbfed83b3815a217d4f4ffe10\": container with ID starting with 6c64d3f76a150ca8db16cf025fc00e14cef4e7bdbfed83b3815a217d4f4ffe10 not found: ID does not exist" containerID="6c64d3f76a150ca8db16cf025fc00e14cef4e7bdbfed83b3815a217d4f4ffe10"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.762394 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c64d3f76a150ca8db16cf025fc00e14cef4e7bdbfed83b3815a217d4f4ffe10"} err="failed to get container status \"6c64d3f76a150ca8db16cf025fc00e14cef4e7bdbfed83b3815a217d4f4ffe10\": rpc error: code = NotFound desc = could not find container \"6c64d3f76a150ca8db16cf025fc00e14cef4e7bdbfed83b3815a217d4f4ffe10\": container with ID starting with 6c64d3f76a150ca8db16cf025fc00e14cef4e7bdbfed83b3815a217d4f4ffe10 not found: ID does not exist"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.807863 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=36.807846239 podStartE2EDuration="36.807846239s" podCreationTimestamp="2026-03-18 12:14:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:14:57.800212957 +0000 UTC m=+331.516038491" watchObservedRunningTime="2026-03-18 12:14:57.807846239 +0000 UTC m=+331.523671763"
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.826715 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f"]
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.833176 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f65587dd-ptg5f"]
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.845311 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bb49457bf-jnhf6"]
Mar 18 12:14:57 crc kubenswrapper[4843]: I0318 12:14:57.870521 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5bb49457bf-jnhf6"]
Mar 18 12:14:58 crc kubenswrapper[4843]: I0318 12:14:58.754243 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8" event={"ID":"81c3f964-ba9f-418c-b916-c51034442ff4","Type":"ContainerStarted","Data":"9862b5ba7960c0a3a8b5f4286159b2b3da1ff204a40146351e70c20e5d4d03bc"}
Mar 18 12:14:58 crc kubenswrapper[4843]: I0318 12:14:58.754727 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8"
Mar 18 12:14:58 crc kubenswrapper[4843]: I0318 12:14:58.761116 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8"
Mar 18 12:14:58 crc kubenswrapper[4843]: I0318 12:14:58.781350 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8" podStartSLOduration=36.781330773 podStartE2EDuration="36.781330773s" podCreationTimestamp="2026-03-18 12:14:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:14:58.779243703 +0000 UTC m=+332.495069227" watchObservedRunningTime="2026-03-18 12:14:58.781330773 +0000 UTC m=+332.497156297"
Mar 18 12:14:58 crc kubenswrapper[4843]: I0318 12:14:58.993151 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e3f316c-7c17-41b8-8a6b-ea894eeefea3" path="/var/lib/kubelet/pods/7e3f316c-7c17-41b8-8a6b-ea894eeefea3/volumes"
Mar 18 12:14:58 crc kubenswrapper[4843]: I0318 12:14:58.994119 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4f52adb-8c9a-4044-a040-e907b51a89fd" path="/var/lib/kubelet/pods/d4f52adb-8c9a-4044-a040-e907b51a89fd/volumes"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.121921 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.184558 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-65997c5d4b-86knq"]
Mar 18 12:14:59 crc kubenswrapper[4843]: E0318 12:14:59.185314 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb45905-6d0b-4397-8f3c-e0aad1a88609" containerName="pruner"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.185338 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb45905-6d0b-4397-8f3c-e0aad1a88609" containerName="pruner"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.185711 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fb45905-6d0b-4397-8f3c-e0aad1a88609" containerName="pruner"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.186499 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.199623 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.201761 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.201932 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.202210 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.202533 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.202765 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.209969 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65997c5d4b-86knq"]
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.210448 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.274387 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fb45905-6d0b-4397-8f3c-e0aad1a88609-kube-api-access\") pod \"6fb45905-6d0b-4397-8f3c-e0aad1a88609\" (UID: \"6fb45905-6d0b-4397-8f3c-e0aad1a88609\") "
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.274522 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fb45905-6d0b-4397-8f3c-e0aad1a88609-kubelet-dir\") pod \"6fb45905-6d0b-4397-8f3c-e0aad1a88609\" (UID: \"6fb45905-6d0b-4397-8f3c-e0aad1a88609\") "
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.274768 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-proxy-ca-bundles\") pod \"controller-manager-65997c5d4b-86knq\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.274871 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-config\") pod \"controller-manager-65997c5d4b-86knq\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.274923 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-client-ca\") pod \"controller-manager-65997c5d4b-86knq\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.274969 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-serving-cert\") pod \"controller-manager-65997c5d4b-86knq\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.275007 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shqjg\" (UniqueName: \"kubernetes.io/projected/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-kube-api-access-shqjg\") pod \"controller-manager-65997c5d4b-86knq\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.275456 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fb45905-6d0b-4397-8f3c-e0aad1a88609-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6fb45905-6d0b-4397-8f3c-e0aad1a88609" (UID: "6fb45905-6d0b-4397-8f3c-e0aad1a88609"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.284422 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fb45905-6d0b-4397-8f3c-e0aad1a88609-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6fb45905-6d0b-4397-8f3c-e0aad1a88609" (UID: "6fb45905-6d0b-4397-8f3c-e0aad1a88609"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.376632 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-proxy-ca-bundles\") pod \"controller-manager-65997c5d4b-86knq\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.376820 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-config\") pod \"controller-manager-65997c5d4b-86knq\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.376865 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-client-ca\") pod \"controller-manager-65997c5d4b-86knq\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.376895 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-serving-cert\") pod \"controller-manager-65997c5d4b-86knq\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.376930 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shqjg\" (UniqueName: \"kubernetes.io/projected/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-kube-api-access-shqjg\") pod \"controller-manager-65997c5d4b-86knq\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.376985 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fb45905-6d0b-4397-8f3c-e0aad1a88609-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.377000 4843 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fb45905-6d0b-4397-8f3c-e0aad1a88609-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.378122 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-client-ca\") pod \"controller-manager-65997c5d4b-86knq\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.378266 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-proxy-ca-bundles\") pod \"controller-manager-65997c5d4b-86knq\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.378723 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-config\") pod \"controller-manager-65997c5d4b-86knq\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.380521 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-serving-cert\") pod \"controller-manager-65997c5d4b-86knq\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.398110 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shqjg\" (UniqueName: \"kubernetes.io/projected/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-kube-api-access-shqjg\") pod \"controller-manager-65997c5d4b-86knq\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.510448 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.723197 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65997c5d4b-86knq"]
Mar 18 12:14:59 crc kubenswrapper[4843]: W0318 12:14:59.729103 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60b00a12_ac0c_43c1_b140_2b1170d6a0bf.slice/crio-10567094ddf4f35b56151bb72495c3b191b1650d45ab95026b22129d9f1cd36d WatchSource:0}: Error finding container 10567094ddf4f35b56151bb72495c3b191b1650d45ab95026b22129d9f1cd36d: Status 404 returned error can't find the container with id 10567094ddf4f35b56151bb72495c3b191b1650d45ab95026b22129d9f1cd36d
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.767012 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq" event={"ID":"60b00a12-ac0c-43c1-b140-2b1170d6a0bf","Type":"ContainerStarted","Data":"10567094ddf4f35b56151bb72495c3b191b1650d45ab95026b22129d9f1cd36d"}
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.769959 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.770335 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6fb45905-6d0b-4397-8f3c-e0aad1a88609","Type":"ContainerDied","Data":"3c8ffa72513380907e79fbfe74e6ae61b952526ce8dc849fd1e02916e24fad8f"}
Mar 18 12:14:59 crc kubenswrapper[4843]: I0318 12:14:59.770368 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c8ffa72513380907e79fbfe74e6ae61b952526ce8dc849fd1e02916e24fad8f"
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.148578 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh"]
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.149478 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh"
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.151719 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.152741 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.160330 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh"]
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.292964 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t4bd\" (UniqueName: \"kubernetes.io/projected/05b4d009-a68d-4287-b15d-b0470d62d486-kube-api-access-2t4bd\") pod \"collect-profiles-29563935-r8bbh\" (UID: \"05b4d009-a68d-4287-b15d-b0470d62d486\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh"
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.293113 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/05b4d009-a68d-4287-b15d-b0470d62d486-secret-volume\") pod \"collect-profiles-29563935-r8bbh\" (UID: \"05b4d009-a68d-4287-b15d-b0470d62d486\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh"
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.293175 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05b4d009-a68d-4287-b15d-b0470d62d486-config-volume\") pod \"collect-profiles-29563935-r8bbh\" (UID: \"05b4d009-a68d-4287-b15d-b0470d62d486\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh"
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.394190 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t4bd\" (UniqueName: \"kubernetes.io/projected/05b4d009-a68d-4287-b15d-b0470d62d486-kube-api-access-2t4bd\") pod \"collect-profiles-29563935-r8bbh\" (UID: \"05b4d009-a68d-4287-b15d-b0470d62d486\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh"
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.394279 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/05b4d009-a68d-4287-b15d-b0470d62d486-secret-volume\") pod \"collect-profiles-29563935-r8bbh\" (UID: \"05b4d009-a68d-4287-b15d-b0470d62d486\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh"
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.394304 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05b4d009-a68d-4287-b15d-b0470d62d486-config-volume\") pod \"collect-profiles-29563935-r8bbh\" (UID: \"05b4d009-a68d-4287-b15d-b0470d62d486\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh"
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.395285 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05b4d009-a68d-4287-b15d-b0470d62d486-config-volume\") pod \"collect-profiles-29563935-r8bbh\" (UID: \"05b4d009-a68d-4287-b15d-b0470d62d486\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh"
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.405139 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/05b4d009-a68d-4287-b15d-b0470d62d486-secret-volume\") pod \"collect-profiles-29563935-r8bbh\" (UID: \"05b4d009-a68d-4287-b15d-b0470d62d486\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh"
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.413389 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t4bd\" (UniqueName: \"kubernetes.io/projected/05b4d009-a68d-4287-b15d-b0470d62d486-kube-api-access-2t4bd\") pod \"collect-profiles-29563935-r8bbh\" (UID: \"05b4d009-a68d-4287-b15d-b0470d62d486\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh"
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.468879 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh"
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.795149 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs5wj" event={"ID":"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa","Type":"ContainerStarted","Data":"c3e34a0f6e1cbd12562cc96c622d1119784108f54d2a7b769b16292db4fe4e09"}
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.802012 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq" event={"ID":"60b00a12-ac0c-43c1-b140-2b1170d6a0bf","Type":"ContainerStarted","Data":"939da6e97224a3cafce68e9c030a17f58f8ed9cc76d57b5a80f4590bd59f59be"}
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.802481 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq"
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.806435 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq"
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.820508 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xs5wj" podStartSLOduration=8.716006569 podStartE2EDuration="1m18.82048399s" podCreationTimestamp="2026-03-18 12:13:42 +0000 UTC" firstStartedPulling="2026-03-18 12:13:49.4283863 +0000 UTC m=+263.144211814" lastFinishedPulling="2026-03-18 12:14:59.532863711 +0000 UTC m=+333.248689235" observedRunningTime="2026-03-18 12:15:00.817126592 +0000 UTC m=+334.532952116" watchObservedRunningTime="2026-03-18 12:15:00.82048399 +0000 UTC m=+334.536309514"
Mar 18 12:15:00 crc kubenswrapper[4843]: I0318 12:15:00.839231 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq" podStartSLOduration=38.839212324 podStartE2EDuration="38.839212324s" podCreationTimestamp="2026-03-18 12:14:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:15:00.838135603 +0000 UTC m=+334.553961117" watchObservedRunningTime="2026-03-18 12:15:00.839212324 +0000 UTC m=+334.555037848"
Mar 18 12:15:01 crc kubenswrapper[4843]: I0318 12:15:01.019244 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh"]
Mar 18 12:15:01 crc kubenswrapper[4843]: I0318 12:15:01.808177 4843 generic.go:334] "Generic (PLEG): container finished" podID="05b4d009-a68d-4287-b15d-b0470d62d486" containerID="a2685add51d20aaddef4cde554d7370356bcd8fc29d993b19abb949b40f50eb5" exitCode=0
Mar 18 12:15:01 crc kubenswrapper[4843]: I0318 12:15:01.808407 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh" event={"ID":"05b4d009-a68d-4287-b15d-b0470d62d486","Type":"ContainerDied","Data":"a2685add51d20aaddef4cde554d7370356bcd8fc29d993b19abb949b40f50eb5"}
Mar 18 12:15:01 crc kubenswrapper[4843]: I0318 12:15:01.810096 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh" event={"ID":"05b4d009-a68d-4287-b15d-b0470d62d486","Type":"ContainerStarted","Data":"e518413656972e76486e94b561050815eb5649a923632795c4526e51096db719"}
Mar 18 12:15:01 crc kubenswrapper[4843]: I0318 12:15:01.983791 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-65997c5d4b-86knq"]
Mar 18 12:15:02 crc kubenswrapper[4843]: I0318 12:15:02.085848 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8"]
Mar 18 12:15:02 crc kubenswrapper[4843]: I0318 12:15:02.086120 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8" podUID="81c3f964-ba9f-418c-b916-c51034442ff4" containerName="route-controller-manager" containerID="cri-o://9862b5ba7960c0a3a8b5f4286159b2b3da1ff204a40146351e70c20e5d4d03bc" gracePeriod=30
Mar 18 12:15:02 crc kubenswrapper[4843]: I0318 12:15:02.817330 4843 generic.go:334] "Generic (PLEG): container finished" podID="81c3f964-ba9f-418c-b916-c51034442ff4" containerID="9862b5ba7960c0a3a8b5f4286159b2b3da1ff204a40146351e70c20e5d4d03bc" exitCode=0
Mar 18 12:15:02 crc kubenswrapper[4843]: I0318 12:15:02.817413 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8" event={"ID":"81c3f964-ba9f-418c-b916-c51034442ff4","Type":"ContainerDied","Data":"9862b5ba7960c0a3a8b5f4286159b2b3da1ff204a40146351e70c20e5d4d03bc"}
Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.093934 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh"
Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.235608 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8"
Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.249320 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t4bd\" (UniqueName: \"kubernetes.io/projected/05b4d009-a68d-4287-b15d-b0470d62d486-kube-api-access-2t4bd\") pod \"05b4d009-a68d-4287-b15d-b0470d62d486\" (UID: \"05b4d009-a68d-4287-b15d-b0470d62d486\") "
Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.249383 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05b4d009-a68d-4287-b15d-b0470d62d486-config-volume\") pod \"05b4d009-a68d-4287-b15d-b0470d62d486\" (UID: \"05b4d009-a68d-4287-b15d-b0470d62d486\") "
Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.249448 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/05b4d009-a68d-4287-b15d-b0470d62d486-secret-volume\") pod \"05b4d009-a68d-4287-b15d-b0470d62d486\" (UID: \"05b4d009-a68d-4287-b15d-b0470d62d486\") "
Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.250269 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05b4d009-a68d-4287-b15d-b0470d62d486-config-volume" (OuterVolumeSpecName: "config-volume") pod "05b4d009-a68d-4287-b15d-b0470d62d486" (UID: "05b4d009-a68d-4287-b15d-b0470d62d486"). InnerVolumeSpecName "config-volume".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.254616 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05b4d009-a68d-4287-b15d-b0470d62d486-kube-api-access-2t4bd" (OuterVolumeSpecName: "kube-api-access-2t4bd") pod "05b4d009-a68d-4287-b15d-b0470d62d486" (UID: "05b4d009-a68d-4287-b15d-b0470d62d486"). InnerVolumeSpecName "kube-api-access-2t4bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.257685 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05b4d009-a68d-4287-b15d-b0470d62d486-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "05b4d009-a68d-4287-b15d-b0470d62d486" (UID: "05b4d009-a68d-4287-b15d-b0470d62d486"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.350912 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/81c3f964-ba9f-418c-b916-c51034442ff4-client-ca\") pod \"81c3f964-ba9f-418c-b916-c51034442ff4\" (UID: \"81c3f964-ba9f-418c-b916-c51034442ff4\") " Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.351198 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm72h\" (UniqueName: \"kubernetes.io/projected/81c3f964-ba9f-418c-b916-c51034442ff4-kube-api-access-rm72h\") pod \"81c3f964-ba9f-418c-b916-c51034442ff4\" (UID: \"81c3f964-ba9f-418c-b916-c51034442ff4\") " Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.351369 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c3f964-ba9f-418c-b916-c51034442ff4-config\") pod \"81c3f964-ba9f-418c-b916-c51034442ff4\" (UID: \"81c3f964-ba9f-418c-b916-c51034442ff4\") 
" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.351512 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81c3f964-ba9f-418c-b916-c51034442ff4-serving-cert\") pod \"81c3f964-ba9f-418c-b916-c51034442ff4\" (UID: \"81c3f964-ba9f-418c-b916-c51034442ff4\") " Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.351713 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81c3f964-ba9f-418c-b916-c51034442ff4-client-ca" (OuterVolumeSpecName: "client-ca") pod "81c3f964-ba9f-418c-b916-c51034442ff4" (UID: "81c3f964-ba9f-418c-b916-c51034442ff4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.351858 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81c3f964-ba9f-418c-b916-c51034442ff4-config" (OuterVolumeSpecName: "config") pod "81c3f964-ba9f-418c-b916-c51034442ff4" (UID: "81c3f964-ba9f-418c-b916-c51034442ff4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.352159 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c3f964-ba9f-418c-b916-c51034442ff4-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.352254 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t4bd\" (UniqueName: \"kubernetes.io/projected/05b4d009-a68d-4287-b15d-b0470d62d486-kube-api-access-2t4bd\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.352370 4843 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05b4d009-a68d-4287-b15d-b0470d62d486-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.352447 4843 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/05b4d009-a68d-4287-b15d-b0470d62d486-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.352540 4843 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/81c3f964-ba9f-418c-b916-c51034442ff4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.355228 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81c3f964-ba9f-418c-b916-c51034442ff4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "81c3f964-ba9f-418c-b916-c51034442ff4" (UID: "81c3f964-ba9f-418c-b916-c51034442ff4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.356268 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81c3f964-ba9f-418c-b916-c51034442ff4-kube-api-access-rm72h" (OuterVolumeSpecName: "kube-api-access-rm72h") pod "81c3f964-ba9f-418c-b916-c51034442ff4" (UID: "81c3f964-ba9f-418c-b916-c51034442ff4"). InnerVolumeSpecName "kube-api-access-rm72h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.453904 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm72h\" (UniqueName: \"kubernetes.io/projected/81c3f964-ba9f-418c-b916-c51034442ff4-kube-api-access-rm72h\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.453941 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81c3f964-ba9f-418c-b916-c51034442ff4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.797959 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xs5wj" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.798022 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xs5wj" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.824974 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.826294 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8" event={"ID":"81c3f964-ba9f-418c-b916-c51034442ff4","Type":"ContainerDied","Data":"5ca63f19f7d5b838eaf72f84a24beac71b0aa428a61d4179fe700d277e62e2b7"} Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.826542 4843 scope.go:117] "RemoveContainer" containerID="9862b5ba7960c0a3a8b5f4286159b2b3da1ff204a40146351e70c20e5d4d03bc" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.830707 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq" podUID="60b00a12-ac0c-43c1-b140-2b1170d6a0bf" containerName="controller-manager" containerID="cri-o://939da6e97224a3cafce68e9c030a17f58f8ed9cc76d57b5a80f4590bd59f59be" gracePeriod=30 Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.830811 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh" event={"ID":"05b4d009-a68d-4287-b15d-b0470d62d486","Type":"ContainerDied","Data":"e518413656972e76486e94b561050815eb5649a923632795c4526e51096db719"} Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.831036 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e518413656972e76486e94b561050815eb5649a923632795c4526e51096db719" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.830832 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh" Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.879099 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8"] Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.882991 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66b5d79dbf-b6rn8"] Mar 18 12:15:03 crc kubenswrapper[4843]: I0318 12:15:03.955779 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xs5wj" Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.310449 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq" Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.477360 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-serving-cert\") pod \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.477524 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-config\") pod \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.477807 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shqjg\" (UniqueName: \"kubernetes.io/projected/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-kube-api-access-shqjg\") pod \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " Mar 18 12:15:04 crc 
kubenswrapper[4843]: I0318 12:15:04.478838 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-client-ca\") pod \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.478935 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-proxy-ca-bundles\") pod \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\" (UID: \"60b00a12-ac0c-43c1-b140-2b1170d6a0bf\") " Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.479120 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-config" (OuterVolumeSpecName: "config") pod "60b00a12-ac0c-43c1-b140-2b1170d6a0bf" (UID: "60b00a12-ac0c-43c1-b140-2b1170d6a0bf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.479450 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-client-ca" (OuterVolumeSpecName: "client-ca") pod "60b00a12-ac0c-43c1-b140-2b1170d6a0bf" (UID: "60b00a12-ac0c-43c1-b140-2b1170d6a0bf"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.479640 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.479697 4843 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.480322 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "60b00a12-ac0c-43c1-b140-2b1170d6a0bf" (UID: "60b00a12-ac0c-43c1-b140-2b1170d6a0bf"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.481698 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "60b00a12-ac0c-43c1-b140-2b1170d6a0bf" (UID: "60b00a12-ac0c-43c1-b140-2b1170d6a0bf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.481932 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-kube-api-access-shqjg" (OuterVolumeSpecName: "kube-api-access-shqjg") pod "60b00a12-ac0c-43c1-b140-2b1170d6a0bf" (UID: "60b00a12-ac0c-43c1-b140-2b1170d6a0bf"). InnerVolumeSpecName "kube-api-access-shqjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.581481 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shqjg\" (UniqueName: \"kubernetes.io/projected/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-kube-api-access-shqjg\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.581524 4843 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.581538 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60b00a12-ac0c-43c1-b140-2b1170d6a0bf-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.836375 4843 generic.go:334] "Generic (PLEG): container finished" podID="60b00a12-ac0c-43c1-b140-2b1170d6a0bf" containerID="939da6e97224a3cafce68e9c030a17f58f8ed9cc76d57b5a80f4590bd59f59be" exitCode=0 Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.836466 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq" event={"ID":"60b00a12-ac0c-43c1-b140-2b1170d6a0bf","Type":"ContainerDied","Data":"939da6e97224a3cafce68e9c030a17f58f8ed9cc76d57b5a80f4590bd59f59be"} Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.836522 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq" event={"ID":"60b00a12-ac0c-43c1-b140-2b1170d6a0bf","Type":"ContainerDied","Data":"10567094ddf4f35b56151bb72495c3b191b1650d45ab95026b22129d9f1cd36d"} Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.836543 4843 scope.go:117] "RemoveContainer" containerID="939da6e97224a3cafce68e9c030a17f58f8ed9cc76d57b5a80f4590bd59f59be" Mar 18 12:15:04 
crc kubenswrapper[4843]: I0318 12:15:04.837202 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65997c5d4b-86knq" Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.864177 4843 scope.go:117] "RemoveContainer" containerID="939da6e97224a3cafce68e9c030a17f58f8ed9cc76d57b5a80f4590bd59f59be" Mar 18 12:15:04 crc kubenswrapper[4843]: E0318 12:15:04.865077 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"939da6e97224a3cafce68e9c030a17f58f8ed9cc76d57b5a80f4590bd59f59be\": container with ID starting with 939da6e97224a3cafce68e9c030a17f58f8ed9cc76d57b5a80f4590bd59f59be not found: ID does not exist" containerID="939da6e97224a3cafce68e9c030a17f58f8ed9cc76d57b5a80f4590bd59f59be" Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.865120 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"939da6e97224a3cafce68e9c030a17f58f8ed9cc76d57b5a80f4590bd59f59be"} err="failed to get container status \"939da6e97224a3cafce68e9c030a17f58f8ed9cc76d57b5a80f4590bd59f59be\": rpc error: code = NotFound desc = could not find container \"939da6e97224a3cafce68e9c030a17f58f8ed9cc76d57b5a80f4590bd59f59be\": container with ID starting with 939da6e97224a3cafce68e9c030a17f58f8ed9cc76d57b5a80f4590bd59f59be not found: ID does not exist" Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.874077 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-65997c5d4b-86knq"] Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.876881 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-65997c5d4b-86knq"] Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.991923 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60b00a12-ac0c-43c1-b140-2b1170d6a0bf" 
path="/var/lib/kubelet/pods/60b00a12-ac0c-43c1-b140-2b1170d6a0bf/volumes" Mar 18 12:15:04 crc kubenswrapper[4843]: I0318 12:15:04.992805 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81c3f964-ba9f-418c-b916-c51034442ff4" path="/var/lib/kubelet/pods/81c3f964-ba9f-418c-b916-c51034442ff4/volumes" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.189043 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq"] Mar 18 12:15:05 crc kubenswrapper[4843]: E0318 12:15:05.189845 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b00a12-ac0c-43c1-b140-2b1170d6a0bf" containerName="controller-manager" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.189955 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b00a12-ac0c-43c1-b140-2b1170d6a0bf" containerName="controller-manager" Mar 18 12:15:05 crc kubenswrapper[4843]: E0318 12:15:05.190078 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81c3f964-ba9f-418c-b916-c51034442ff4" containerName="route-controller-manager" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.190190 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="81c3f964-ba9f-418c-b916-c51034442ff4" containerName="route-controller-manager" Mar 18 12:15:05 crc kubenswrapper[4843]: E0318 12:15:05.190296 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b4d009-a68d-4287-b15d-b0470d62d486" containerName="collect-profiles" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.190373 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b4d009-a68d-4287-b15d-b0470d62d486" containerName="collect-profiles" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.190601 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="05b4d009-a68d-4287-b15d-b0470d62d486" containerName="collect-profiles" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 
12:15:05.190804 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="81c3f964-ba9f-418c-b916-c51034442ff4" containerName="route-controller-manager" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.190889 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b00a12-ac0c-43c1-b140-2b1170d6a0bf" containerName="controller-manager" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.191450 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.194600 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.194669 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.194688 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.194730 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.194773 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.194781 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.195374 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.195742 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.196214 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.196601 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.196832 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.197022 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.198381 4843 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.200623 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh"] Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.201680 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.203396 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.203642 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.204342 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.207294 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.207389 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.212959 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.231548 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq"] Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.231668 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh"] Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.233289 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.233976 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.237349 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.239962 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.241318 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.242480 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.295871 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a71fca9e-8650-446f-a3a8-e8b534f879a9-client-ca\") pod \"controller-manager-64fbb8d49c-mmnvh\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.296006 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e643ac6e-2430-4a8e-8fb1-386333d22f87-config\") pod \"route-controller-manager-5b886f8d5b-plzlq\" (UID: \"e643ac6e-2430-4a8e-8fb1-386333d22f87\") " pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.296047 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a71fca9e-8650-446f-a3a8-e8b534f879a9-config\") pod \"controller-manager-64fbb8d49c-mmnvh\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.296096 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e643ac6e-2430-4a8e-8fb1-386333d22f87-serving-cert\") pod \"route-controller-manager-5b886f8d5b-plzlq\" (UID: \"e643ac6e-2430-4a8e-8fb1-386333d22f87\") " pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.296171 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a71fca9e-8650-446f-a3a8-e8b534f879a9-proxy-ca-bundles\") pod \"controller-manager-64fbb8d49c-mmnvh\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.296254 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a71fca9e-8650-446f-a3a8-e8b534f879a9-serving-cert\") pod \"controller-manager-64fbb8d49c-mmnvh\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.296323 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e643ac6e-2430-4a8e-8fb1-386333d22f87-client-ca\") pod \"route-controller-manager-5b886f8d5b-plzlq\" (UID: \"e643ac6e-2430-4a8e-8fb1-386333d22f87\") " pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.296347 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qcbj\" (UniqueName: \"kubernetes.io/projected/a71fca9e-8650-446f-a3a8-e8b534f879a9-kube-api-access-4qcbj\") pod \"controller-manager-64fbb8d49c-mmnvh\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.296382 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwmvg\" (UniqueName: \"kubernetes.io/projected/e643ac6e-2430-4a8e-8fb1-386333d22f87-kube-api-access-dwmvg\") pod 
\"route-controller-manager-5b886f8d5b-plzlq\" (UID: \"e643ac6e-2430-4a8e-8fb1-386333d22f87\") " pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.397943 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a71fca9e-8650-446f-a3a8-e8b534f879a9-proxy-ca-bundles\") pod \"controller-manager-64fbb8d49c-mmnvh\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.398221 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a71fca9e-8650-446f-a3a8-e8b534f879a9-serving-cert\") pod \"controller-manager-64fbb8d49c-mmnvh\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.398344 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e643ac6e-2430-4a8e-8fb1-386333d22f87-client-ca\") pod \"route-controller-manager-5b886f8d5b-plzlq\" (UID: \"e643ac6e-2430-4a8e-8fb1-386333d22f87\") " pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.398421 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qcbj\" (UniqueName: \"kubernetes.io/projected/a71fca9e-8650-446f-a3a8-e8b534f879a9-kube-api-access-4qcbj\") pod \"controller-manager-64fbb8d49c-mmnvh\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.398503 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dwmvg\" (UniqueName: \"kubernetes.io/projected/e643ac6e-2430-4a8e-8fb1-386333d22f87-kube-api-access-dwmvg\") pod \"route-controller-manager-5b886f8d5b-plzlq\" (UID: \"e643ac6e-2430-4a8e-8fb1-386333d22f87\") " pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.398583 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a71fca9e-8650-446f-a3a8-e8b534f879a9-client-ca\") pod \"controller-manager-64fbb8d49c-mmnvh\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.398685 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e643ac6e-2430-4a8e-8fb1-386333d22f87-config\") pod \"route-controller-manager-5b886f8d5b-plzlq\" (UID: \"e643ac6e-2430-4a8e-8fb1-386333d22f87\") " pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.398790 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a71fca9e-8650-446f-a3a8-e8b534f879a9-config\") pod \"controller-manager-64fbb8d49c-mmnvh\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.398917 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e643ac6e-2430-4a8e-8fb1-386333d22f87-serving-cert\") pod \"route-controller-manager-5b886f8d5b-plzlq\" (UID: \"e643ac6e-2430-4a8e-8fb1-386333d22f87\") " 
pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.399770 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e643ac6e-2430-4a8e-8fb1-386333d22f87-client-ca\") pod \"route-controller-manager-5b886f8d5b-plzlq\" (UID: \"e643ac6e-2430-4a8e-8fb1-386333d22f87\") " pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.400091 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a71fca9e-8650-446f-a3a8-e8b534f879a9-proxy-ca-bundles\") pod \"controller-manager-64fbb8d49c-mmnvh\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.400132 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e643ac6e-2430-4a8e-8fb1-386333d22f87-config\") pod \"route-controller-manager-5b886f8d5b-plzlq\" (UID: \"e643ac6e-2430-4a8e-8fb1-386333d22f87\") " pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.400567 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a71fca9e-8650-446f-a3a8-e8b534f879a9-client-ca\") pod \"controller-manager-64fbb8d49c-mmnvh\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.400800 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a71fca9e-8650-446f-a3a8-e8b534f879a9-config\") pod 
\"controller-manager-64fbb8d49c-mmnvh\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.401949 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e643ac6e-2430-4a8e-8fb1-386333d22f87-serving-cert\") pod \"route-controller-manager-5b886f8d5b-plzlq\" (UID: \"e643ac6e-2430-4a8e-8fb1-386333d22f87\") " pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.403362 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a71fca9e-8650-446f-a3a8-e8b534f879a9-serving-cert\") pod \"controller-manager-64fbb8d49c-mmnvh\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.421396 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwmvg\" (UniqueName: \"kubernetes.io/projected/e643ac6e-2430-4a8e-8fb1-386333d22f87-kube-api-access-dwmvg\") pod \"route-controller-manager-5b886f8d5b-plzlq\" (UID: \"e643ac6e-2430-4a8e-8fb1-386333d22f87\") " pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.421410 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qcbj\" (UniqueName: \"kubernetes.io/projected/a71fca9e-8650-446f-a3a8-e8b534f879a9-kube-api-access-4qcbj\") pod \"controller-manager-64fbb8d49c-mmnvh\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.498202 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.506637 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.513995 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.576075 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" Mar 18 12:15:05 crc kubenswrapper[4843]: I0318 12:15:05.583952 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.137491 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-8lfcz" Mar 18 12:15:06 crc kubenswrapper[4843]: W0318 12:15:06.150529 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-c308a8912b55d1669c4ce252b95705d406350fd2d231daed761db35e96ec6e0c WatchSource:0}: Error finding container c308a8912b55d1669c4ce252b95705d406350fd2d231daed761db35e96ec6e0c: Status 404 returned error can't find the container with id c308a8912b55d1669c4ce252b95705d406350fd2d231daed761db35e96ec6e0c Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.342615 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq"] Mar 18 12:15:06 crc kubenswrapper[4843]: W0318 12:15:06.365319 4843 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode643ac6e_2430_4a8e_8fb1_386333d22f87.slice/crio-ab617ee3de9ee1f4280d59fa981e433ef322a9ca4ddedce274c3d6f4526dce52 WatchSource:0}: Error finding container ab617ee3de9ee1f4280d59fa981e433ef322a9ca4ddedce274c3d6f4526dce52: Status 404 returned error can't find the container with id ab617ee3de9ee1f4280d59fa981e433ef322a9ca4ddedce274c3d6f4526dce52 Mar 18 12:15:06 crc kubenswrapper[4843]: W0318 12:15:06.366663 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-bbd773139537a7e6a84110e522ebecff2f9267a983b93f76f2ab3710112f3fd6 WatchSource:0}: Error finding container bbd773139537a7e6a84110e522ebecff2f9267a983b93f76f2ab3710112f3fd6: Status 404 returned error can't find the container with id bbd773139537a7e6a84110e522ebecff2f9267a983b93f76f2ab3710112f3fd6 Mar 18 12:15:06 crc kubenswrapper[4843]: W0318 12:15:06.407033 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-699617b659c31e5fba88cfe7a2b64bbbb2cd14e51f53326e5867190a1a4f2175 WatchSource:0}: Error finding container 699617b659c31e5fba88cfe7a2b64bbbb2cd14e51f53326e5867190a1a4f2175: Status 404 returned error can't find the container with id 699617b659c31e5fba88cfe7a2b64bbbb2cd14e51f53326e5867190a1a4f2175 Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.506972 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh"] Mar 18 12:15:06 crc kubenswrapper[4843]: W0318 12:15:06.516576 4843 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda71fca9e_8650_446f_a3a8_e8b534f879a9.slice/crio-058de66d80057e91518b5d8b5131ab44b669d984170ed36a587e7da4386102e0 WatchSource:0}: Error finding container 058de66d80057e91518b5d8b5131ab44b669d984170ed36a587e7da4386102e0: Status 404 returned error can't find the container with id 058de66d80057e91518b5d8b5131ab44b669d984170ed36a587e7da4386102e0 Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.855253 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"55bc94dc64be1af07bb616376c049f891a37944c3eba131e4442250293c73b17"} Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.855550 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c308a8912b55d1669c4ce252b95705d406350fd2d231daed761db35e96ec6e0c"} Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.857513 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"51c1ad83fd4c66f0b0a3ddc45c0b6e36edf19a02eeba42c51af0e633548ad4a7"} Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.857589 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"699617b659c31e5fba88cfe7a2b64bbbb2cd14e51f53326e5867190a1a4f2175"} Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.860489 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" 
event={"ID":"e643ac6e-2430-4a8e-8fb1-386333d22f87","Type":"ContainerStarted","Data":"5aa52cbf23f0e9cca1fd5a1ad267919b9c7625d9a6ddbe84cefbfd225fc18fa3"} Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.860543 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" event={"ID":"e643ac6e-2430-4a8e-8fb1-386333d22f87","Type":"ContainerStarted","Data":"ab617ee3de9ee1f4280d59fa981e433ef322a9ca4ddedce274c3d6f4526dce52"} Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.860851 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.862687 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"46e563406f2049fef59fdb0d2168556481901f1ac1ca05c06187714c1c10b2cb"} Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.862725 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"bbd773139537a7e6a84110e522ebecff2f9267a983b93f76f2ab3710112f3fd6"} Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.862905 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.864166 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" event={"ID":"a71fca9e-8650-446f-a3a8-e8b534f879a9","Type":"ContainerStarted","Data":"31eb1b92027dea9ad0cb703095433bf46aa0eb88c2f735f796584c2862d92107"} Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.864200 4843 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" event={"ID":"a71fca9e-8650-446f-a3a8-e8b534f879a9","Type":"ContainerStarted","Data":"058de66d80057e91518b5d8b5131ab44b669d984170ed36a587e7da4386102e0"} Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.864362 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.865598 4843 patch_prober.go:28] interesting pod/controller-manager-64fbb8d49c-mmnvh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" start-of-body= Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.865638 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" podUID="a71fca9e-8650-446f-a3a8-e8b534f879a9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.914459 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" podStartSLOduration=4.914438265 podStartE2EDuration="4.914438265s" podCreationTimestamp="2026-03-18 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:15:06.91050687 +0000 UTC m=+340.626332394" watchObservedRunningTime="2026-03-18 12:15:06.914438265 +0000 UTC m=+340.630263789" Mar 18 12:15:06 crc kubenswrapper[4843]: I0318 12:15:06.970073 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" podStartSLOduration=4.97005565 podStartE2EDuration="4.97005565s" podCreationTimestamp="2026-03-18 12:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:15:06.945798365 +0000 UTC m=+340.661623889" watchObservedRunningTime="2026-03-18 12:15:06.97005565 +0000 UTC m=+340.685881174" Mar 18 12:15:07 crc kubenswrapper[4843]: I0318 12:15:07.577809 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" Mar 18 12:15:07 crc kubenswrapper[4843]: I0318 12:15:07.875060 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dgr8z" event={"ID":"feb1dd70-302b-4217-b17c-211aea971073","Type":"ContainerStarted","Data":"5a541bdeaa5d99cd556607bba9b15250166a71590ce65a087628f274f3f2e576"} Mar 18 12:15:07 crc kubenswrapper[4843]: I0318 12:15:07.942188 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:08 crc kubenswrapper[4843]: I0318 12:15:08.974945 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcsxl" event={"ID":"3cef8c60-1bee-4798-9147-4d7f360ae4b2","Type":"ContainerStarted","Data":"7e9620ecd6c684ecdf70955bdeecff16f2b85347b482a7bb2d5abf72cb010d21"} Mar 18 12:15:12 crc kubenswrapper[4843]: I0318 12:15:12.000572 4843 generic.go:334] "Generic (PLEG): container finished" podID="3cef8c60-1bee-4798-9147-4d7f360ae4b2" containerID="7e9620ecd6c684ecdf70955bdeecff16f2b85347b482a7bb2d5abf72cb010d21" exitCode=0 Mar 18 12:15:12 crc kubenswrapper[4843]: I0318 12:15:12.000620 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcsxl" 
event={"ID":"3cef8c60-1bee-4798-9147-4d7f360ae4b2","Type":"ContainerDied","Data":"7e9620ecd6c684ecdf70955bdeecff16f2b85347b482a7bb2d5abf72cb010d21"} Mar 18 12:15:13 crc kubenswrapper[4843]: I0318 12:15:13.009147 4843 generic.go:334] "Generic (PLEG): container finished" podID="feb1dd70-302b-4217-b17c-211aea971073" containerID="5a541bdeaa5d99cd556607bba9b15250166a71590ce65a087628f274f3f2e576" exitCode=0 Mar 18 12:15:13 crc kubenswrapper[4843]: I0318 12:15:13.009255 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dgr8z" event={"ID":"feb1dd70-302b-4217-b17c-211aea971073","Type":"ContainerDied","Data":"5a541bdeaa5d99cd556607bba9b15250166a71590ce65a087628f274f3f2e576"} Mar 18 12:15:13 crc kubenswrapper[4843]: I0318 12:15:13.849159 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xs5wj" Mar 18 12:15:15 crc kubenswrapper[4843]: I0318 12:15:15.030321 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfg7x" event={"ID":"431329ed-c93d-4d44-bb49-ebed3a083f1a","Type":"ContainerStarted","Data":"cf38075d7eeb7d44d64e6cc39d5e4b16a981cc72ae0f7f7ef7ce7875313983e0"} Mar 18 12:15:15 crc kubenswrapper[4843]: I0318 12:15:15.149119 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrwd9" event={"ID":"7aa438f5-fa8b-43f7-93b7-67e592ea698c","Type":"ContainerStarted","Data":"b802b2e548d9e3766d47ac21897325244d223dd2e4920b666b8542b21c723f89"} Mar 18 12:15:15 crc kubenswrapper[4843]: I0318 12:15:15.151922 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52bm4" event={"ID":"95b0d7ed-60bd-467a-a1b4-86f2c32096ab","Type":"ContainerStarted","Data":"ea2c6613dfdd829a4cd38d4f7015b18c4bc8c3ddbc488d4a7238901fbeac7989"} Mar 18 12:15:15 crc kubenswrapper[4843]: I0318 12:15:15.155085 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-lcsxl" event={"ID":"3cef8c60-1bee-4798-9147-4d7f360ae4b2","Type":"ContainerStarted","Data":"83c7b78c077140dfa9f6d8fa459d5803c06cce60e24cf550f8839531bcd9ec8f"} Mar 18 12:15:15 crc kubenswrapper[4843]: I0318 12:15:15.158146 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2p2b" event={"ID":"0dc0843b-8f5c-434e-986e-3aab182caad3","Type":"ContainerStarted","Data":"74926d0a5517c6b81e126a3a3709ef77fac93f750b64a7d3905e48ef93d423f6"} Mar 18 12:15:15 crc kubenswrapper[4843]: I0318 12:15:15.163570 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t62l2" event={"ID":"ffad2911-2fae-4030-a3ad-d36a5f95fc07","Type":"ContainerStarted","Data":"9b0012ca8ef7feaaaec9d8495679246c1c5c9aeebf13472565bec94af511ea85"} Mar 18 12:15:15 crc kubenswrapper[4843]: I0318 12:15:15.165450 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563932-bzf8p" event={"ID":"803876de-64f6-4347-8ea5-6d2d8f87e828","Type":"ContainerStarted","Data":"41a8bab979722aa364412d9f395252731eeafd17df8f23745e560c3fd7f9d30e"} Mar 18 12:15:15 crc kubenswrapper[4843]: I0318 12:15:15.166954 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563934-9j5hk" event={"ID":"4f32540f-28b9-46d1-a943-f04368d4cae2","Type":"ContainerStarted","Data":"817b68362f28f87c80accada443d0e386ec43d01922fa240f9be74fcf5abe4aa"} Mar 18 12:15:15 crc kubenswrapper[4843]: I0318 12:15:15.564050 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563932-bzf8p" podStartSLOduration=94.368879673 podStartE2EDuration="3m15.564033877s" podCreationTimestamp="2026-03-18 12:12:00 +0000 UTC" firstStartedPulling="2026-03-18 12:13:32.321258702 +0000 UTC m=+246.037084226" lastFinishedPulling="2026-03-18 12:15:13.516412906 +0000 UTC m=+347.232238430" 
observedRunningTime="2026-03-18 12:15:15.560054051 +0000 UTC m=+349.275879575" watchObservedRunningTime="2026-03-18 12:15:15.564033877 +0000 UTC m=+349.279859401" Mar 18 12:15:15 crc kubenswrapper[4843]: I0318 12:15:15.614180 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563934-9j5hk" podStartSLOduration=4.856307199 podStartE2EDuration="1m15.614159242s" podCreationTimestamp="2026-03-18 12:14:00 +0000 UTC" firstStartedPulling="2026-03-18 12:14:02.250117255 +0000 UTC m=+275.965942779" lastFinishedPulling="2026-03-18 12:15:13.007969298 +0000 UTC m=+346.723794822" observedRunningTime="2026-03-18 12:15:15.584589834 +0000 UTC m=+349.300415358" watchObservedRunningTime="2026-03-18 12:15:15.614159242 +0000 UTC m=+349.329984766" Mar 18 12:15:15 crc kubenswrapper[4843]: I0318 12:15:15.696583 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lcsxl" podStartSLOduration=9.158413271 podStartE2EDuration="1m33.696558216s" podCreationTimestamp="2026-03-18 12:13:42 +0000 UTC" firstStartedPulling="2026-03-18 12:13:49.319054129 +0000 UTC m=+263.034879653" lastFinishedPulling="2026-03-18 12:15:13.857199074 +0000 UTC m=+347.573024598" observedRunningTime="2026-03-18 12:15:15.63576865 +0000 UTC m=+349.351594174" watchObservedRunningTime="2026-03-18 12:15:15.696558216 +0000 UTC m=+349.412383740" Mar 18 12:15:16 crc kubenswrapper[4843]: I0318 12:15:16.514190 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dgr8z" event={"ID":"feb1dd70-302b-4217-b17c-211aea971073","Type":"ContainerStarted","Data":"450609aa37892fe84c5d85575e46da946e0bd025d8a1067bcff4d2d7d873ca3d"} Mar 18 12:15:17 crc kubenswrapper[4843]: I0318 12:15:17.281455 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dgr8z" podStartSLOduration=7.150700044 
podStartE2EDuration="1m32.281432968s" podCreationTimestamp="2026-03-18 12:13:45 +0000 UTC" firstStartedPulling="2026-03-18 12:13:49.529865814 +0000 UTC m=+263.245691338" lastFinishedPulling="2026-03-18 12:15:14.660598738 +0000 UTC m=+348.376424262" observedRunningTime="2026-03-18 12:15:16.621176021 +0000 UTC m=+350.337001545" watchObservedRunningTime="2026-03-18 12:15:17.281432968 +0000 UTC m=+350.997258492" Mar 18 12:15:17 crc kubenswrapper[4843]: I0318 12:15:17.285771 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xs5wj"] Mar 18 12:15:17 crc kubenswrapper[4843]: I0318 12:15:17.286058 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xs5wj" podUID="93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa" containerName="registry-server" containerID="cri-o://c3e34a0f6e1cbd12562cc96c622d1119784108f54d2a7b769b16292db4fe4e09" gracePeriod=2 Mar 18 12:15:17 crc kubenswrapper[4843]: I0318 12:15:17.521836 4843 generic.go:334] "Generic (PLEG): container finished" podID="0dc0843b-8f5c-434e-986e-3aab182caad3" containerID="74926d0a5517c6b81e126a3a3709ef77fac93f750b64a7d3905e48ef93d423f6" exitCode=0 Mar 18 12:15:17 crc kubenswrapper[4843]: I0318 12:15:17.521886 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2p2b" event={"ID":"0dc0843b-8f5c-434e-986e-3aab182caad3","Type":"ContainerDied","Data":"74926d0a5517c6b81e126a3a3709ef77fac93f750b64a7d3905e48ef93d423f6"} Mar 18 12:15:17 crc kubenswrapper[4843]: I0318 12:15:17.677827 4843 csr.go:261] certificate signing request csr-hckvl is approved, waiting to be issued Mar 18 12:15:17 crc kubenswrapper[4843]: I0318 12:15:17.721969 4843 csr.go:257] certificate signing request csr-hckvl is issued Mar 18 12:15:18 crc kubenswrapper[4843]: I0318 12:15:18.538183 4843 generic.go:334] "Generic (PLEG): container finished" podID="803876de-64f6-4347-8ea5-6d2d8f87e828" 
containerID="41a8bab979722aa364412d9f395252731eeafd17df8f23745e560c3fd7f9d30e" exitCode=0 Mar 18 12:15:18 crc kubenswrapper[4843]: I0318 12:15:18.538247 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563932-bzf8p" event={"ID":"803876de-64f6-4347-8ea5-6d2d8f87e828","Type":"ContainerDied","Data":"41a8bab979722aa364412d9f395252731eeafd17df8f23745e560c3fd7f9d30e"} Mar 18 12:15:18 crc kubenswrapper[4843]: I0318 12:15:18.540483 4843 generic.go:334] "Generic (PLEG): container finished" podID="4f32540f-28b9-46d1-a943-f04368d4cae2" containerID="817b68362f28f87c80accada443d0e386ec43d01922fa240f9be74fcf5abe4aa" exitCode=0 Mar 18 12:15:18 crc kubenswrapper[4843]: I0318 12:15:18.540542 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563934-9j5hk" event={"ID":"4f32540f-28b9-46d1-a943-f04368d4cae2","Type":"ContainerDied","Data":"817b68362f28f87c80accada443d0e386ec43d01922fa240f9be74fcf5abe4aa"} Mar 18 12:15:18 crc kubenswrapper[4843]: I0318 12:15:18.542225 4843 generic.go:334] "Generic (PLEG): container finished" podID="431329ed-c93d-4d44-bb49-ebed3a083f1a" containerID="cf38075d7eeb7d44d64e6cc39d5e4b16a981cc72ae0f7f7ef7ce7875313983e0" exitCode=0 Mar 18 12:15:18 crc kubenswrapper[4843]: I0318 12:15:18.542268 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfg7x" event={"ID":"431329ed-c93d-4d44-bb49-ebed3a083f1a","Type":"ContainerDied","Data":"cf38075d7eeb7d44d64e6cc39d5e4b16a981cc72ae0f7f7ef7ce7875313983e0"} Mar 18 12:15:18 crc kubenswrapper[4843]: I0318 12:15:18.546228 4843 generic.go:334] "Generic (PLEG): container finished" podID="93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa" containerID="c3e34a0f6e1cbd12562cc96c622d1119784108f54d2a7b769b16292db4fe4e09" exitCode=0 Mar 18 12:15:18 crc kubenswrapper[4843]: I0318 12:15:18.546284 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs5wj" 
event={"ID":"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa","Type":"ContainerDied","Data":"c3e34a0f6e1cbd12562cc96c622d1119784108f54d2a7b769b16292db4fe4e09"} Mar 18 12:15:18 crc kubenswrapper[4843]: I0318 12:15:18.548804 4843 generic.go:334] "Generic (PLEG): container finished" podID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" containerID="ea2c6613dfdd829a4cd38d4f7015b18c4bc8c3ddbc488d4a7238901fbeac7989" exitCode=0 Mar 18 12:15:18 crc kubenswrapper[4843]: I0318 12:15:18.548872 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52bm4" event={"ID":"95b0d7ed-60bd-467a-a1b4-86f2c32096ab","Type":"ContainerDied","Data":"ea2c6613dfdd829a4cd38d4f7015b18c4bc8c3ddbc488d4a7238901fbeac7989"} Mar 18 12:15:18 crc kubenswrapper[4843]: I0318 12:15:18.551033 4843 generic.go:334] "Generic (PLEG): container finished" podID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" containerID="9b0012ca8ef7feaaaec9d8495679246c1c5c9aeebf13472565bec94af511ea85" exitCode=0 Mar 18 12:15:18 crc kubenswrapper[4843]: I0318 12:15:18.551053 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t62l2" event={"ID":"ffad2911-2fae-4030-a3ad-d36a5f95fc07","Type":"ContainerDied","Data":"9b0012ca8ef7feaaaec9d8495679246c1c5c9aeebf13472565bec94af511ea85"} Mar 18 12:15:18 crc kubenswrapper[4843]: I0318 12:15:18.723892 4843 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-19 02:46:41.03328112 +0000 UTC Mar 18 12:15:18 crc kubenswrapper[4843]: I0318 12:15:18.723981 4843 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7358h31m22.309304934s for next certificate rotation Mar 18 12:15:19 crc kubenswrapper[4843]: I0318 12:15:19.334115 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xs5wj" Mar 18 12:15:19 crc kubenswrapper[4843]: I0318 12:15:19.649218 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa-utilities\") pod \"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa\" (UID: \"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa\") " Mar 18 12:15:19 crc kubenswrapper[4843]: I0318 12:15:19.649288 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa-catalog-content\") pod \"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa\" (UID: \"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa\") " Mar 18 12:15:19 crc kubenswrapper[4843]: I0318 12:15:19.649328 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmf2j\" (UniqueName: \"kubernetes.io/projected/93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa-kube-api-access-zmf2j\") pod \"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa\" (UID: \"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa\") " Mar 18 12:15:19 crc kubenswrapper[4843]: I0318 12:15:19.650250 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa-utilities" (OuterVolumeSpecName: "utilities") pod "93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa" (UID: "93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:15:19 crc kubenswrapper[4843]: I0318 12:15:19.651112 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:19 crc kubenswrapper[4843]: I0318 12:15:19.666763 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa-kube-api-access-zmf2j" (OuterVolumeSpecName: "kube-api-access-zmf2j") pod "93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa" (UID: "93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa"). InnerVolumeSpecName "kube-api-access-zmf2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:19 crc kubenswrapper[4843]: I0318 12:15:19.675695 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs5wj" event={"ID":"93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa","Type":"ContainerDied","Data":"b441ec1c8d0e83f2fc294282a146aebf545026f7692de6b04c6efb65c3fe80c8"} Mar 18 12:15:19 crc kubenswrapper[4843]: I0318 12:15:19.675764 4843 scope.go:117] "RemoveContainer" containerID="c3e34a0f6e1cbd12562cc96c622d1119784108f54d2a7b769b16292db4fe4e09" Mar 18 12:15:19 crc kubenswrapper[4843]: I0318 12:15:19.675777 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xs5wj" Mar 18 12:15:19 crc kubenswrapper[4843]: I0318 12:15:19.697855 4843 scope.go:117] "RemoveContainer" containerID="a5f57d2b83168469ebb2c9704bc28b0006eb3b4a9aac5dee7a05aac0888cbf7a" Mar 18 12:15:19 crc kubenswrapper[4843]: I0318 12:15:19.710308 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa" (UID: "93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:15:19 crc kubenswrapper[4843]: I0318 12:15:19.725246 4843 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-14 01:32:30.945286013 +0000 UTC Mar 18 12:15:19 crc kubenswrapper[4843]: I0318 12:15:19.725307 4843 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 5773h17m11.219982954s for next certificate rotation Mar 18 12:15:19 crc kubenswrapper[4843]: I0318 12:15:19.752469 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:19 crc kubenswrapper[4843]: I0318 12:15:19.752513 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmf2j\" (UniqueName: \"kubernetes.io/projected/93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa-kube-api-access-zmf2j\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:19 crc kubenswrapper[4843]: I0318 12:15:19.959210 4843 scope.go:117] "RemoveContainer" containerID="73d0c20023e3fdaa8b51f84bc1b1a2ebcb64c55fef243ef0d5300b17b1d8f0e1" Mar 18 12:15:20 crc kubenswrapper[4843]: I0318 12:15:20.015262 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-xs5wj"] Mar 18 12:15:20 crc kubenswrapper[4843]: I0318 12:15:20.018674 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xs5wj"] Mar 18 12:15:20 crc kubenswrapper[4843]: I0318 12:15:20.686280 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2p2b" event={"ID":"0dc0843b-8f5c-434e-986e-3aab182caad3","Type":"ContainerStarted","Data":"fdb4a1b706c6bd18833808d3da2c60aa762b0b897e35392d8784f873901612b2"} Mar 18 12:15:20 crc kubenswrapper[4843]: I0318 12:15:20.688504 4843 generic.go:334] "Generic (PLEG): container finished" podID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" containerID="b802b2e548d9e3766d47ac21897325244d223dd2e4920b666b8542b21c723f89" exitCode=0 Mar 18 12:15:20 crc kubenswrapper[4843]: I0318 12:15:20.688566 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrwd9" event={"ID":"7aa438f5-fa8b-43f7-93b7-67e592ea698c","Type":"ContainerDied","Data":"b802b2e548d9e3766d47ac21897325244d223dd2e4920b666b8542b21c723f89"} Mar 18 12:15:20 crc kubenswrapper[4843]: I0318 12:15:20.791795 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563934-9j5hk" Mar 18 12:15:20 crc kubenswrapper[4843]: I0318 12:15:20.799158 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563932-bzf8p" Mar 18 12:15:20 crc kubenswrapper[4843]: I0318 12:15:20.898623 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb5nn\" (UniqueName: \"kubernetes.io/projected/4f32540f-28b9-46d1-a943-f04368d4cae2-kube-api-access-fb5nn\") pod \"4f32540f-28b9-46d1-a943-f04368d4cae2\" (UID: \"4f32540f-28b9-46d1-a943-f04368d4cae2\") " Mar 18 12:15:20 crc kubenswrapper[4843]: I0318 12:15:20.898716 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcqm7\" (UniqueName: \"kubernetes.io/projected/803876de-64f6-4347-8ea5-6d2d8f87e828-kube-api-access-tcqm7\") pod \"803876de-64f6-4347-8ea5-6d2d8f87e828\" (UID: \"803876de-64f6-4347-8ea5-6d2d8f87e828\") " Mar 18 12:15:20 crc kubenswrapper[4843]: I0318 12:15:20.904977 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/803876de-64f6-4347-8ea5-6d2d8f87e828-kube-api-access-tcqm7" (OuterVolumeSpecName: "kube-api-access-tcqm7") pod "803876de-64f6-4347-8ea5-6d2d8f87e828" (UID: "803876de-64f6-4347-8ea5-6d2d8f87e828"). InnerVolumeSpecName "kube-api-access-tcqm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:20 crc kubenswrapper[4843]: I0318 12:15:20.905171 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f32540f-28b9-46d1-a943-f04368d4cae2-kube-api-access-fb5nn" (OuterVolumeSpecName: "kube-api-access-fb5nn") pod "4f32540f-28b9-46d1-a943-f04368d4cae2" (UID: "4f32540f-28b9-46d1-a943-f04368d4cae2"). InnerVolumeSpecName "kube-api-access-fb5nn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:20 crc kubenswrapper[4843]: I0318 12:15:20.990469 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa" path="/var/lib/kubelet/pods/93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa/volumes" Mar 18 12:15:21 crc kubenswrapper[4843]: I0318 12:15:21.000482 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb5nn\" (UniqueName: \"kubernetes.io/projected/4f32540f-28b9-46d1-a943-f04368d4cae2-kube-api-access-fb5nn\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:21 crc kubenswrapper[4843]: I0318 12:15:21.000523 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcqm7\" (UniqueName: \"kubernetes.io/projected/803876de-64f6-4347-8ea5-6d2d8f87e828-kube-api-access-tcqm7\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:21 crc kubenswrapper[4843]: I0318 12:15:21.703320 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563932-bzf8p" event={"ID":"803876de-64f6-4347-8ea5-6d2d8f87e828","Type":"ContainerDied","Data":"e3c45275eede20f1b5d8794b8ced35da56466a5763fcd86a7fb97b375ee0e26d"} Mar 18 12:15:21 crc kubenswrapper[4843]: I0318 12:15:21.703700 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3c45275eede20f1b5d8794b8ced35da56466a5763fcd86a7fb97b375ee0e26d" Mar 18 12:15:21 crc kubenswrapper[4843]: I0318 12:15:21.703338 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563932-bzf8p" Mar 18 12:15:21 crc kubenswrapper[4843]: I0318 12:15:21.704947 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563934-9j5hk" event={"ID":"4f32540f-28b9-46d1-a943-f04368d4cae2","Type":"ContainerDied","Data":"38cacacff8e32f34521742741680d20cb23c272fe46917a15ee7fcabca071898"} Mar 18 12:15:21 crc kubenswrapper[4843]: I0318 12:15:21.705048 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38cacacff8e32f34521742741680d20cb23c272fe46917a15ee7fcabca071898" Mar 18 12:15:21 crc kubenswrapper[4843]: I0318 12:15:21.704974 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563934-9j5hk" Mar 18 12:15:21 crc kubenswrapper[4843]: I0318 12:15:21.730490 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j2p2b" podStartSLOduration=7.9506237859999995 podStartE2EDuration="1m36.730467378s" podCreationTimestamp="2026-03-18 12:13:45 +0000 UTC" firstStartedPulling="2026-03-18 12:13:49.404981555 +0000 UTC m=+263.120807089" lastFinishedPulling="2026-03-18 12:15:18.184825157 +0000 UTC m=+351.900650681" observedRunningTime="2026-03-18 12:15:21.728267764 +0000 UTC m=+355.444093288" watchObservedRunningTime="2026-03-18 12:15:21.730467378 +0000 UTC m=+355.446292902" Mar 18 12:15:21 crc kubenswrapper[4843]: I0318 12:15:21.982367 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh"] Mar 18 12:15:21 crc kubenswrapper[4843]: I0318 12:15:21.982896 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" podUID="a71fca9e-8650-446f-a3a8-e8b534f879a9" containerName="controller-manager" 
containerID="cri-o://31eb1b92027dea9ad0cb703095433bf46aa0eb88c2f735f796584c2862d92107" gracePeriod=30 Mar 18 12:15:22 crc kubenswrapper[4843]: I0318 12:15:22.008600 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq"] Mar 18 12:15:22 crc kubenswrapper[4843]: I0318 12:15:22.008844 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" podUID="e643ac6e-2430-4a8e-8fb1-386333d22f87" containerName="route-controller-manager" containerID="cri-o://5aa52cbf23f0e9cca1fd5a1ad267919b9c7625d9a6ddbe84cefbfd225fc18fa3" gracePeriod=30 Mar 18 12:15:22 crc kubenswrapper[4843]: I0318 12:15:22.714210 4843 generic.go:334] "Generic (PLEG): container finished" podID="e643ac6e-2430-4a8e-8fb1-386333d22f87" containerID="5aa52cbf23f0e9cca1fd5a1ad267919b9c7625d9a6ddbe84cefbfd225fc18fa3" exitCode=0 Mar 18 12:15:22 crc kubenswrapper[4843]: I0318 12:15:22.714258 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" event={"ID":"e643ac6e-2430-4a8e-8fb1-386333d22f87","Type":"ContainerDied","Data":"5aa52cbf23f0e9cca1fd5a1ad267919b9c7625d9a6ddbe84cefbfd225fc18fa3"} Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.700287 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.727609 4843 generic.go:334] "Generic (PLEG): container finished" podID="a71fca9e-8650-446f-a3a8-e8b534f879a9" containerID="31eb1b92027dea9ad0cb703095433bf46aa0eb88c2f735f796584c2862d92107" exitCode=0 Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.727704 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" event={"ID":"a71fca9e-8650-446f-a3a8-e8b534f879a9","Type":"ContainerDied","Data":"31eb1b92027dea9ad0cb703095433bf46aa0eb88c2f735f796584c2862d92107"} Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.727769 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" event={"ID":"a71fca9e-8650-446f-a3a8-e8b534f879a9","Type":"ContainerDied","Data":"058de66d80057e91518b5d8b5131ab44b669d984170ed36a587e7da4386102e0"} Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.727806 4843 scope.go:117] "RemoveContainer" containerID="31eb1b92027dea9ad0cb703095433bf46aa0eb88c2f735f796584c2862d92107" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.728264 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.747760 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-79c76b6c7f-j224n"] Mar 18 12:15:23 crc kubenswrapper[4843]: E0318 12:15:23.748086 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a71fca9e-8650-446f-a3a8-e8b534f879a9" containerName="controller-manager" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.748111 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="a71fca9e-8650-446f-a3a8-e8b534f879a9" containerName="controller-manager" Mar 18 12:15:23 crc kubenswrapper[4843]: E0318 12:15:23.748123 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f32540f-28b9-46d1-a943-f04368d4cae2" containerName="oc" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.748132 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f32540f-28b9-46d1-a943-f04368d4cae2" containerName="oc" Mar 18 12:15:23 crc kubenswrapper[4843]: E0318 12:15:23.748143 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803876de-64f6-4347-8ea5-6d2d8f87e828" containerName="oc" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.748151 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="803876de-64f6-4347-8ea5-6d2d8f87e828" containerName="oc" Mar 18 12:15:23 crc kubenswrapper[4843]: E0318 12:15:23.748162 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa" containerName="registry-server" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.748170 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa" containerName="registry-server" Mar 18 12:15:23 crc kubenswrapper[4843]: E0318 12:15:23.748234 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa" 
containerName="extract-content" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.748244 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa" containerName="extract-content" Mar 18 12:15:23 crc kubenswrapper[4843]: E0318 12:15:23.748267 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa" containerName="extract-utilities" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.748275 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa" containerName="extract-utilities" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.748406 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="93ed2d4f-4e0a-4de3-9bc3-b7b5cc93defa" containerName="registry-server" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.748424 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="a71fca9e-8650-446f-a3a8-e8b534f879a9" containerName="controller-manager" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.748458 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="803876de-64f6-4347-8ea5-6d2d8f87e828" containerName="oc" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.748467 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f32540f-28b9-46d1-a943-f04368d4cae2" containerName="oc" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.749098 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.756158 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lcsxl" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.756232 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lcsxl" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.760379 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79c76b6c7f-j224n"] Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.763853 4843 scope.go:117] "RemoveContainer" containerID="31eb1b92027dea9ad0cb703095433bf46aa0eb88c2f735f796584c2862d92107" Mar 18 12:15:23 crc kubenswrapper[4843]: E0318 12:15:23.765957 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31eb1b92027dea9ad0cb703095433bf46aa0eb88c2f735f796584c2862d92107\": container with ID starting with 31eb1b92027dea9ad0cb703095433bf46aa0eb88c2f735f796584c2862d92107 not found: ID does not exist" containerID="31eb1b92027dea9ad0cb703095433bf46aa0eb88c2f735f796584c2862d92107" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.766000 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31eb1b92027dea9ad0cb703095433bf46aa0eb88c2f735f796584c2862d92107"} err="failed to get container status \"31eb1b92027dea9ad0cb703095433bf46aa0eb88c2f735f796584c2862d92107\": rpc error: code = NotFound desc = could not find container \"31eb1b92027dea9ad0cb703095433bf46aa0eb88c2f735f796584c2862d92107\": container with ID starting with 31eb1b92027dea9ad0cb703095433bf46aa0eb88c2f735f796584c2862d92107 not found: ID does not exist" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.809941 4843 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lcsxl" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.812228 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a71fca9e-8650-446f-a3a8-e8b534f879a9-client-ca\") pod \"a71fca9e-8650-446f-a3a8-e8b534f879a9\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.812326 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a71fca9e-8650-446f-a3a8-e8b534f879a9-proxy-ca-bundles\") pod \"a71fca9e-8650-446f-a3a8-e8b534f879a9\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.812419 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qcbj\" (UniqueName: \"kubernetes.io/projected/a71fca9e-8650-446f-a3a8-e8b534f879a9-kube-api-access-4qcbj\") pod \"a71fca9e-8650-446f-a3a8-e8b534f879a9\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.812456 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a71fca9e-8650-446f-a3a8-e8b534f879a9-config\") pod \"a71fca9e-8650-446f-a3a8-e8b534f879a9\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.812484 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a71fca9e-8650-446f-a3a8-e8b534f879a9-serving-cert\") pod \"a71fca9e-8650-446f-a3a8-e8b534f879a9\" (UID: \"a71fca9e-8650-446f-a3a8-e8b534f879a9\") " Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.812699 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-grj55\" (UniqueName: \"kubernetes.io/projected/f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb-kube-api-access-grj55\") pod \"controller-manager-79c76b6c7f-j224n\" (UID: \"f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb\") " pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.812742 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb-serving-cert\") pod \"controller-manager-79c76b6c7f-j224n\" (UID: \"f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb\") " pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.812767 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb-proxy-ca-bundles\") pod \"controller-manager-79c76b6c7f-j224n\" (UID: \"f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb\") " pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.812833 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb-config\") pod \"controller-manager-79c76b6c7f-j224n\" (UID: \"f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb\") " pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.812895 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb-client-ca\") pod \"controller-manager-79c76b6c7f-j224n\" (UID: \"f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb\") " 
pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.813601 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a71fca9e-8650-446f-a3a8-e8b534f879a9-client-ca" (OuterVolumeSpecName: "client-ca") pod "a71fca9e-8650-446f-a3a8-e8b534f879a9" (UID: "a71fca9e-8650-446f-a3a8-e8b534f879a9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.814140 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a71fca9e-8650-446f-a3a8-e8b534f879a9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a71fca9e-8650-446f-a3a8-e8b534f879a9" (UID: "a71fca9e-8650-446f-a3a8-e8b534f879a9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.814896 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a71fca9e-8650-446f-a3a8-e8b534f879a9-config" (OuterVolumeSpecName: "config") pod "a71fca9e-8650-446f-a3a8-e8b534f879a9" (UID: "a71fca9e-8650-446f-a3a8-e8b534f879a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.819244 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a71fca9e-8650-446f-a3a8-e8b534f879a9-kube-api-access-4qcbj" (OuterVolumeSpecName: "kube-api-access-4qcbj") pod "a71fca9e-8650-446f-a3a8-e8b534f879a9" (UID: "a71fca9e-8650-446f-a3a8-e8b534f879a9"). InnerVolumeSpecName "kube-api-access-4qcbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.819771 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a71fca9e-8650-446f-a3a8-e8b534f879a9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a71fca9e-8650-446f-a3a8-e8b534f879a9" (UID: "a71fca9e-8650-446f-a3a8-e8b534f879a9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.914210 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb-config\") pod \"controller-manager-79c76b6c7f-j224n\" (UID: \"f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb\") " pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.914300 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb-client-ca\") pod \"controller-manager-79c76b6c7f-j224n\" (UID: \"f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb\") " pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.914337 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grj55\" (UniqueName: \"kubernetes.io/projected/f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb-kube-api-access-grj55\") pod \"controller-manager-79c76b6c7f-j224n\" (UID: \"f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb\") " pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.914370 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb-serving-cert\") pod 
\"controller-manager-79c76b6c7f-j224n\" (UID: \"f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb\") " pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.914391 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb-proxy-ca-bundles\") pod \"controller-manager-79c76b6c7f-j224n\" (UID: \"f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb\") " pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.914470 4843 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a71fca9e-8650-446f-a3a8-e8b534f879a9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.914485 4843 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a71fca9e-8650-446f-a3a8-e8b534f879a9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.914499 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qcbj\" (UniqueName: \"kubernetes.io/projected/a71fca9e-8650-446f-a3a8-e8b534f879a9-kube-api-access-4qcbj\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.914511 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a71fca9e-8650-446f-a3a8-e8b534f879a9-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.914521 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a71fca9e-8650-446f-a3a8-e8b534f879a9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.915608 4843 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb-proxy-ca-bundles\") pod \"controller-manager-79c76b6c7f-j224n\" (UID: \"f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb\") " pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.915978 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb-client-ca\") pod \"controller-manager-79c76b6c7f-j224n\" (UID: \"f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb\") " pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.916295 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb-config\") pod \"controller-manager-79c76b6c7f-j224n\" (UID: \"f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb\") " pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.923112 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb-serving-cert\") pod \"controller-manager-79c76b6c7f-j224n\" (UID: \"f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb\") " pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n" Mar 18 12:15:23 crc kubenswrapper[4843]: I0318 12:15:23.935356 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grj55\" (UniqueName: \"kubernetes.io/projected/f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb-kube-api-access-grj55\") pod \"controller-manager-79c76b6c7f-j224n\" (UID: \"f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb\") " pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n" Mar 18 
12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.013277 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq"
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.068363 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh"]
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.072587 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-64fbb8d49c-mmnvh"]
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.073439 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n"
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.116899 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwmvg\" (UniqueName: \"kubernetes.io/projected/e643ac6e-2430-4a8e-8fb1-386333d22f87-kube-api-access-dwmvg\") pod \"e643ac6e-2430-4a8e-8fb1-386333d22f87\" (UID: \"e643ac6e-2430-4a8e-8fb1-386333d22f87\") "
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.117052 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e643ac6e-2430-4a8e-8fb1-386333d22f87-config\") pod \"e643ac6e-2430-4a8e-8fb1-386333d22f87\" (UID: \"e643ac6e-2430-4a8e-8fb1-386333d22f87\") "
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.117074 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e643ac6e-2430-4a8e-8fb1-386333d22f87-client-ca\") pod \"e643ac6e-2430-4a8e-8fb1-386333d22f87\" (UID: \"e643ac6e-2430-4a8e-8fb1-386333d22f87\") "
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.117103 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e643ac6e-2430-4a8e-8fb1-386333d22f87-serving-cert\") pod \"e643ac6e-2430-4a8e-8fb1-386333d22f87\" (UID: \"e643ac6e-2430-4a8e-8fb1-386333d22f87\") "
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.118161 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e643ac6e-2430-4a8e-8fb1-386333d22f87-client-ca" (OuterVolumeSpecName: "client-ca") pod "e643ac6e-2430-4a8e-8fb1-386333d22f87" (UID: "e643ac6e-2430-4a8e-8fb1-386333d22f87"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.118249 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e643ac6e-2430-4a8e-8fb1-386333d22f87-config" (OuterVolumeSpecName: "config") pod "e643ac6e-2430-4a8e-8fb1-386333d22f87" (UID: "e643ac6e-2430-4a8e-8fb1-386333d22f87"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.119971 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e643ac6e-2430-4a8e-8fb1-386333d22f87-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e643ac6e-2430-4a8e-8fb1-386333d22f87" (UID: "e643ac6e-2430-4a8e-8fb1-386333d22f87"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.120271 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e643ac6e-2430-4a8e-8fb1-386333d22f87-kube-api-access-dwmvg" (OuterVolumeSpecName: "kube-api-access-dwmvg") pod "e643ac6e-2430-4a8e-8fb1-386333d22f87" (UID: "e643ac6e-2430-4a8e-8fb1-386333d22f87"). InnerVolumeSpecName "kube-api-access-dwmvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.218747 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e643ac6e-2430-4a8e-8fb1-386333d22f87-config\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.218791 4843 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e643ac6e-2430-4a8e-8fb1-386333d22f87-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.218806 4843 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e643ac6e-2430-4a8e-8fb1-386333d22f87-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.218820 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwmvg\" (UniqueName: \"kubernetes.io/projected/e643ac6e-2430-4a8e-8fb1-386333d22f87-kube-api-access-dwmvg\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.487787 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79c76b6c7f-j224n"]
Mar 18 12:15:24 crc kubenswrapper[4843]: W0318 12:15:24.503116 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5d7e2c2_9b45_4bf4_84e9_e6417ca53cdb.slice/crio-3b80a5f6b28c900eb7297904a5effd122744eabc94b637dc1348038a32c9fa1a WatchSource:0}: Error finding container 3b80a5f6b28c900eb7297904a5effd122744eabc94b637dc1348038a32c9fa1a: Status 404 returned error can't find the container with id 3b80a5f6b28c900eb7297904a5effd122744eabc94b637dc1348038a32c9fa1a
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.733732 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n" event={"ID":"f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb","Type":"ContainerStarted","Data":"3b80a5f6b28c900eb7297904a5effd122744eabc94b637dc1348038a32c9fa1a"}
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.735621 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfg7x" event={"ID":"431329ed-c93d-4d44-bb49-ebed3a083f1a","Type":"ContainerStarted","Data":"6bf2ff7dca8731b8759c4bb84b80a1ed2dbc0edcf55da25949f1c59f43095601"}
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.737707 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq"
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.737715 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq" event={"ID":"e643ac6e-2430-4a8e-8fb1-386333d22f87","Type":"ContainerDied","Data":"ab617ee3de9ee1f4280d59fa981e433ef322a9ca4ddedce274c3d6f4526dce52"}
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.737835 4843 scope.go:117] "RemoveContainer" containerID="5aa52cbf23f0e9cca1fd5a1ad267919b9c7625d9a6ddbe84cefbfd225fc18fa3"
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.774739 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq"]
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.779944 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b886f8d5b-plzlq"]
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.786428 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lcsxl"
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.991730 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a71fca9e-8650-446f-a3a8-e8b534f879a9" path="/var/lib/kubelet/pods/a71fca9e-8650-446f-a3a8-e8b534f879a9/volumes"
Mar 18 12:15:24 crc kubenswrapper[4843]: I0318 12:15:24.992538 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e643ac6e-2430-4a8e-8fb1-386333d22f87" path="/var/lib/kubelet/pods/e643ac6e-2430-4a8e-8fb1-386333d22f87/volumes"
Mar 18 12:15:25 crc kubenswrapper[4843]: I0318 12:15:25.636129 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dgr8z"
Mar 18 12:15:25 crc kubenswrapper[4843]: I0318 12:15:25.636509 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dgr8z"
Mar 18 12:15:25 crc kubenswrapper[4843]: I0318 12:15:25.774842 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rfg7x" podStartSLOduration=8.435946251 podStartE2EDuration="1m43.774823445s" podCreationTimestamp="2026-03-18 12:13:42 +0000 UTC" firstStartedPulling="2026-03-18 12:13:47.830841436 +0000 UTC m=+261.546666960" lastFinishedPulling="2026-03-18 12:15:23.16971863 +0000 UTC m=+356.885544154" observedRunningTime="2026-03-18 12:15:25.772713383 +0000 UTC m=+359.488538917" watchObservedRunningTime="2026-03-18 12:15:25.774823445 +0000 UTC m=+359.490648969"
Mar 18 12:15:25 crc kubenswrapper[4843]: I0318 12:15:25.993287 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j2p2b"
Mar 18 12:15:25 crc kubenswrapper[4843]: I0318 12:15:25.993611 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j2p2b"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.045605 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j2p2b"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.363398 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6"]
Mar 18 12:15:26 crc kubenswrapper[4843]: E0318 12:15:26.363670 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e643ac6e-2430-4a8e-8fb1-386333d22f87" containerName="route-controller-manager"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.363686 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="e643ac6e-2430-4a8e-8fb1-386333d22f87" containerName="route-controller-manager"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.363795 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="e643ac6e-2430-4a8e-8fb1-386333d22f87" containerName="route-controller-manager"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.364200 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.374772 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.374827 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.374825 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.374876 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.380380 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.380532 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.385105 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6"]
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.522105 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hllcc"]
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.546632 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e3ee223-0231-40f0-911b-fe651e958b6e-client-ca\") pod \"route-controller-manager-66dcd55869-xczn6\" (UID: \"8e3ee223-0231-40f0-911b-fe651e958b6e\") " pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.546707 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqjzh\" (UniqueName: \"kubernetes.io/projected/8e3ee223-0231-40f0-911b-fe651e958b6e-kube-api-access-dqjzh\") pod \"route-controller-manager-66dcd55869-xczn6\" (UID: \"8e3ee223-0231-40f0-911b-fe651e958b6e\") " pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.546735 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e3ee223-0231-40f0-911b-fe651e958b6e-serving-cert\") pod \"route-controller-manager-66dcd55869-xczn6\" (UID: \"8e3ee223-0231-40f0-911b-fe651e958b6e\") " pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.546756 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e3ee223-0231-40f0-911b-fe651e958b6e-config\") pod \"route-controller-manager-66dcd55869-xczn6\" (UID: \"8e3ee223-0231-40f0-911b-fe651e958b6e\") " pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.647969 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e3ee223-0231-40f0-911b-fe651e958b6e-client-ca\") pod \"route-controller-manager-66dcd55869-xczn6\" (UID: \"8e3ee223-0231-40f0-911b-fe651e958b6e\") " pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.648032 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqjzh\" (UniqueName: \"kubernetes.io/projected/8e3ee223-0231-40f0-911b-fe651e958b6e-kube-api-access-dqjzh\") pod \"route-controller-manager-66dcd55869-xczn6\" (UID: \"8e3ee223-0231-40f0-911b-fe651e958b6e\") " pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.648055 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e3ee223-0231-40f0-911b-fe651e958b6e-serving-cert\") pod \"route-controller-manager-66dcd55869-xczn6\" (UID: \"8e3ee223-0231-40f0-911b-fe651e958b6e\") " pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.648093 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e3ee223-0231-40f0-911b-fe651e958b6e-config\") pod \"route-controller-manager-66dcd55869-xczn6\" (UID: \"8e3ee223-0231-40f0-911b-fe651e958b6e\") " pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.649126 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e3ee223-0231-40f0-911b-fe651e958b6e-client-ca\") pod \"route-controller-manager-66dcd55869-xczn6\" (UID: \"8e3ee223-0231-40f0-911b-fe651e958b6e\") " pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.649244 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e3ee223-0231-40f0-911b-fe651e958b6e-config\") pod \"route-controller-manager-66dcd55869-xczn6\" (UID: \"8e3ee223-0231-40f0-911b-fe651e958b6e\") " pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.655454 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e3ee223-0231-40f0-911b-fe651e958b6e-serving-cert\") pod \"route-controller-manager-66dcd55869-xczn6\" (UID: \"8e3ee223-0231-40f0-911b-fe651e958b6e\") " pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.672343 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqjzh\" (UniqueName: \"kubernetes.io/projected/8e3ee223-0231-40f0-911b-fe651e958b6e-kube-api-access-dqjzh\") pod \"route-controller-manager-66dcd55869-xczn6\" (UID: \"8e3ee223-0231-40f0-911b-fe651e958b6e\") " pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.680144 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.832336 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dgr8z" podUID="feb1dd70-302b-4217-b17c-211aea971073" containerName="registry-server" probeResult="failure" output=<
Mar 18 12:15:26 crc kubenswrapper[4843]: timeout: failed to connect service ":50051" within 1s
Mar 18 12:15:26 crc kubenswrapper[4843]: >
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.893138 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrwd9" event={"ID":"7aa438f5-fa8b-43f7-93b7-67e592ea698c","Type":"ContainerStarted","Data":"394b6781b9a6b3fe002f89e9319bec82cf5b13ffd6a5c92c09102cb1dd5fe733"}
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.921948 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t62l2" event={"ID":"ffad2911-2fae-4030-a3ad-d36a5f95fc07","Type":"ContainerStarted","Data":"35a8d8b8d8b0c2b7c4e0d4c2299a98969e95467b83da197d3a158508b3c0e1e4"}
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.929573 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mrwd9" podStartSLOduration=6.163966548 podStartE2EDuration="1m41.929523042s" podCreationTimestamp="2026-03-18 12:13:45 +0000 UTC" firstStartedPulling="2026-03-18 12:13:50.595045204 +0000 UTC m=+264.310870728" lastFinishedPulling="2026-03-18 12:15:26.360601698 +0000 UTC m=+360.076427222" observedRunningTime="2026-03-18 12:15:26.929241694 +0000 UTC m=+360.645067238" watchObservedRunningTime="2026-03-18 12:15:26.929523042 +0000 UTC m=+360.645348566"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.930122 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n" event={"ID":"f5d7e2c2-9b45-4bf4-84e9-e6417ca53cdb","Type":"ContainerStarted","Data":"5b1c58993959885812a3d46224bf07af24dac8f3da28ffad8b47f2db3736018d"}
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.930502 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n"
Mar 18 12:15:26 crc kubenswrapper[4843]: I0318 12:15:26.935931 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n"
Mar 18 12:15:27 crc kubenswrapper[4843]: I0318 12:15:27.170519 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t62l2" podStartSLOduration=8.69816286 podStartE2EDuration="1m45.170502941s" podCreationTimestamp="2026-03-18 12:13:42 +0000 UTC" firstStartedPulling="2026-03-18 12:13:49.421296365 +0000 UTC m=+263.137121889" lastFinishedPulling="2026-03-18 12:15:25.893636446 +0000 UTC m=+359.609461970" observedRunningTime="2026-03-18 12:15:27.169564344 +0000 UTC m=+360.885389858" watchObservedRunningTime="2026-03-18 12:15:27.170502941 +0000 UTC m=+360.886328465"
Mar 18 12:15:27 crc kubenswrapper[4843]: I0318 12:15:27.219899 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j2p2b"
Mar 18 12:15:27 crc kubenswrapper[4843]: I0318 12:15:27.272141 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-79c76b6c7f-j224n" podStartSLOduration=6.272122883 podStartE2EDuration="6.272122883s" podCreationTimestamp="2026-03-18 12:15:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:15:27.220458712 +0000 UTC m=+360.936284236" watchObservedRunningTime="2026-03-18 12:15:27.272122883 +0000 UTC m=+360.987948407"
Mar 18 12:15:27 crc kubenswrapper[4843]: I0318 12:15:27.603566 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6"]
Mar 18 12:15:27 crc kubenswrapper[4843]: W0318 12:15:27.611141 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e3ee223_0231_40f0_911b_fe651e958b6e.slice/crio-20695e45903cb8d323a2a74a3b46bcdd24f895c5a6b6f79d5f9e1103a2b9852b WatchSource:0}: Error finding container 20695e45903cb8d323a2a74a3b46bcdd24f895c5a6b6f79d5f9e1103a2b9852b: Status 404 returned error can't find the container with id 20695e45903cb8d323a2a74a3b46bcdd24f895c5a6b6f79d5f9e1103a2b9852b
Mar 18 12:15:27 crc kubenswrapper[4843]: I0318 12:15:27.997728 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6" event={"ID":"8e3ee223-0231-40f0-911b-fe651e958b6e","Type":"ContainerStarted","Data":"7337a49f6efc4bf85fa5d706e8a3548e0b363b1a3c39e38aa6e226dbc14915f9"}
Mar 18 12:15:27 crc kubenswrapper[4843]: I0318 12:15:27.997970 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6" event={"ID":"8e3ee223-0231-40f0-911b-fe651e958b6e","Type":"ContainerStarted","Data":"20695e45903cb8d323a2a74a3b46bcdd24f895c5a6b6f79d5f9e1103a2b9852b"}
Mar 18 12:15:28 crc kubenswrapper[4843]: I0318 12:15:27.998403 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6"
Mar 18 12:15:28 crc kubenswrapper[4843]: I0318 12:15:28.000551 4843 patch_prober.go:28] interesting pod/route-controller-manager-66dcd55869-xczn6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" start-of-body=
Mar 18 12:15:28 crc kubenswrapper[4843]: I0318 12:15:28.000601 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6" podUID="8e3ee223-0231-40f0-911b-fe651e958b6e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused"
Mar 18 12:15:28 crc kubenswrapper[4843]: I0318 12:15:28.000855 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52bm4" event={"ID":"95b0d7ed-60bd-467a-a1b4-86f2c32096ab","Type":"ContainerStarted","Data":"a502b43887b50c3bdcb346481dd97e3e84845c8491bcde98c5b5d3d6db0d6777"}
Mar 18 12:15:28 crc kubenswrapper[4843]: I0318 12:15:28.023968 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6" podStartSLOduration=6.023946749 podStartE2EDuration="6.023946749s" podCreationTimestamp="2026-03-18 12:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:15:28.022748054 +0000 UTC m=+361.738573598" watchObservedRunningTime="2026-03-18 12:15:28.023946749 +0000 UTC m=+361.739772273"
Mar 18 12:15:28 crc kubenswrapper[4843]: I0318 12:15:28.042505 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-52bm4" podStartSLOduration=7.036433634 podStartE2EDuration="1m43.042484387s" podCreationTimestamp="2026-03-18 12:13:45 +0000 UTC" firstStartedPulling="2026-03-18 12:13:50.567533821 +0000 UTC m=+264.283359345" lastFinishedPulling="2026-03-18 12:15:26.573584564 +0000 UTC m=+360.289410098" observedRunningTime="2026-03-18 12:15:28.039363487 +0000 UTC m=+361.755189021" watchObservedRunningTime="2026-03-18 12:15:28.042484387 +0000 UTC m=+361.758309921"
Mar 18 12:15:29 crc kubenswrapper[4843]: I0318 12:15:29.011148 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6"
Mar 18 12:15:33 crc kubenswrapper[4843]: I0318 12:15:33.167465 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rfg7x"
Mar 18 12:15:33 crc kubenswrapper[4843]: I0318 12:15:33.167983 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rfg7x"
Mar 18 12:15:33 crc kubenswrapper[4843]: I0318 12:15:33.212079 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rfg7x"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.100018 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rfg7x"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.517059 4843 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.517825 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619" gracePeriod=15
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.517905 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734" gracePeriod=15
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.517953 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e" gracePeriod=15
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.518044 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889" gracePeriod=15
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.518154 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05" gracePeriod=15
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.518763 4843 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 18 12:15:34 crc kubenswrapper[4843]: E0318 12:15:34.519020 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.519033 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 18 12:15:34 crc kubenswrapper[4843]: E0318 12:15:34.519048 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.519056 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 18 12:15:34 crc kubenswrapper[4843]: E0318 12:15:34.519071 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.519079 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 18 12:15:34 crc kubenswrapper[4843]: E0318 12:15:34.519091 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.519098 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 18 12:15:34 crc kubenswrapper[4843]: E0318 12:15:34.519156 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.519164 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 18 12:15:34 crc kubenswrapper[4843]: E0318 12:15:34.519175 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.519183 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 18 12:15:34 crc kubenswrapper[4843]: E0318 12:15:34.519193 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.519200 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 18 12:15:34 crc kubenswrapper[4843]: E0318 12:15:34.519211 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.519217 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 18 12:15:34 crc kubenswrapper[4843]: E0318 12:15:34.519227 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.519233 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.519366 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.519375 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.519384 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.519396 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.519407 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.519417 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.519425 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 18 12:15:34 crc kubenswrapper[4843]: E0318 12:15:34.519531 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.519541 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.519718 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.519731 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.522783 4843 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.524001 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.531085 4843 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.574356 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.615993 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.616068 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.616169 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.616234 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.616268 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.616296 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.616338 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.616421 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.718529 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName:
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.718592 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.718677 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.718686 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.718721 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.718698 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.718772 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.718793 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.718745 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.718828 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.719153 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.718851 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.718846 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.718943 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.718845 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.719321 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.809291 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t62l2" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.809351 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t62l2" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.850447 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t62l2" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.851417 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.851714 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:34 crc kubenswrapper[4843]: I0318 12:15:34.877022 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:34 crc kubenswrapper[4843]: E0318 12:15:34.910471 4843 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.205:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189dee91070ffaeb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:15:34.909999851 +0000 UTC m=+368.625825375,LastTimestamp:2026-03-18 12:15:34.909999851 +0000 UTC m=+368.625825375,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.046348 4843 generic.go:334] "Generic (PLEG): container finished" podID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" containerID="c3a21dbdc1baa16ce652ee6f854ea6b04d4bbb6f0faad7cd2d74e6c15fcff402" exitCode=0 Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.046440 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1ec08b84-0b49-4fa8-932e-2b18dc727e06","Type":"ContainerDied","Data":"c3a21dbdc1baa16ce652ee6f854ea6b04d4bbb6f0faad7cd2d74e6c15fcff402"} Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.047488 4843 status_manager.go:851] "Failed to get status for pod" 
podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.047772 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.048030 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.048560 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"80dd4cd53f34eb5368be287a857039b3b416dabd6a7bd9b5d7afa2666325b0d8"} Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.052276 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.054362 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.055116 4843 generic.go:334] "Generic (PLEG): 
container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734" exitCode=0 Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.055145 4843 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05" exitCode=0 Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.055157 4843 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e" exitCode=0 Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.055166 4843 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889" exitCode=2 Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.055227 4843 scope.go:117] "RemoveContainer" containerID="0ef73f5d18e263c21b67c7403902e0e9e4500e60d9e0b782a83c639ae3dcb9b8" Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.100863 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t62l2" Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.101451 4843 status_manager.go:851] "Failed to get status for pod" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.101823 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.101983 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.691221 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dgr8z" Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.692110 4843 status_manager.go:851] "Failed to get status for pod" podUID="feb1dd70-302b-4217-b17c-211aea971073" pod="openshift-marketplace/redhat-operators-dgr8z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-dgr8z\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.692740 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.693207 4843 status_manager.go:851] "Failed to get status for pod" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:35 crc 
kubenswrapper[4843]: I0318 12:15:35.693772 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.733491 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dgr8z" Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.734157 4843 status_manager.go:851] "Failed to get status for pod" podUID="feb1dd70-302b-4217-b17c-211aea971073" pod="openshift-marketplace/redhat-operators-dgr8z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-dgr8z\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.734460 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.734729 4843 status_manager.go:851] "Failed to get status for pod" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4843]: I0318 12:15:35.734900 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.022989 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mrwd9" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.023055 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mrwd9" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.062363 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"30985177cb4103814d2cd4c61f95ad78901060bcca004ab786c7957c15ba1918"} Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.063329 4843 status_manager.go:851] "Failed to get status for pod" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.063808 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.064153 4843 status_manager.go:851] "Failed to get status for pod" podUID="feb1dd70-302b-4217-b17c-211aea971073" pod="openshift-marketplace/redhat-operators-dgr8z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-dgr8z\": dial tcp 
38.129.56.205:6443: connect: connection refused" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.064606 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.067077 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.067346 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mrwd9" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.067877 4843 status_manager.go:851] "Failed to get status for pod" podUID="feb1dd70-302b-4217-b17c-211aea971073" pod="openshift-marketplace/redhat-operators-dgr8z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-dgr8z\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.068171 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.068491 4843 status_manager.go:851] "Failed to get status for pod" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.069285 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.069839 4843 status_manager.go:851] "Failed to get status for pod" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" pod="openshift-marketplace/redhat-operators-mrwd9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mrwd9\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.111211 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mrwd9" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.112190 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.112886 4843 status_manager.go:851] "Failed to get status for pod" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" pod="openshift-marketplace/redhat-operators-mrwd9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mrwd9\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.113392 4843 
status_manager.go:851] "Failed to get status for pod" podUID="feb1dd70-302b-4217-b17c-211aea971073" pod="openshift-marketplace/redhat-operators-dgr8z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-dgr8z\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.113699 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.114150 4843 status_manager.go:851] "Failed to get status for pod" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.348547 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-52bm4" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.348935 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-52bm4" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.390625 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-52bm4" Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.391914 4843 status_manager.go:851] "Failed to get status for pod" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.392369 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.392696 4843 status_manager.go:851] "Failed to get status for pod" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" pod="openshift-marketplace/redhat-operators-mrwd9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mrwd9\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.392987 4843 status_manager.go:851] "Failed to get status for pod" podUID="feb1dd70-302b-4217-b17c-211aea971073" pod="openshift-marketplace/redhat-operators-dgr8z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-dgr8z\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.393682 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.394529 4843 status_manager.go:851] "Failed to get status for pod" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" pod="openshift-marketplace/redhat-marketplace-52bm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-52bm4\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.420899 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.421630 4843 status_manager.go:851] "Failed to get status for pod" podUID="feb1dd70-302b-4217-b17c-211aea971073" pod="openshift-marketplace/redhat-operators-dgr8z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-dgr8z\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.422058 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.422519 4843 status_manager.go:851] "Failed to get status for pod" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" pod="openshift-marketplace/redhat-marketplace-52bm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-52bm4\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.422971 4843 status_manager.go:851] "Failed to get status for pod" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.423392 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.423906 4843 status_manager.go:851] "Failed to get status for pod" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" pod="openshift-marketplace/redhat-operators-mrwd9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mrwd9\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.543874 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ec08b84-0b49-4fa8-932e-2b18dc727e06-kube-api-access\") pod \"1ec08b84-0b49-4fa8-932e-2b18dc727e06\" (UID: \"1ec08b84-0b49-4fa8-932e-2b18dc727e06\") "
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.543949 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1ec08b84-0b49-4fa8-932e-2b18dc727e06-var-lock\") pod \"1ec08b84-0b49-4fa8-932e-2b18dc727e06\" (UID: \"1ec08b84-0b49-4fa8-932e-2b18dc727e06\") "
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.544066 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ec08b84-0b49-4fa8-932e-2b18dc727e06-kubelet-dir\") pod \"1ec08b84-0b49-4fa8-932e-2b18dc727e06\" (UID: \"1ec08b84-0b49-4fa8-932e-2b18dc727e06\") "
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.544128 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ec08b84-0b49-4fa8-932e-2b18dc727e06-var-lock" (OuterVolumeSpecName: "var-lock") pod "1ec08b84-0b49-4fa8-932e-2b18dc727e06" (UID: "1ec08b84-0b49-4fa8-932e-2b18dc727e06"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.544249 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ec08b84-0b49-4fa8-932e-2b18dc727e06-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1ec08b84-0b49-4fa8-932e-2b18dc727e06" (UID: "1ec08b84-0b49-4fa8-932e-2b18dc727e06"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.544451 4843 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1ec08b84-0b49-4fa8-932e-2b18dc727e06-var-lock\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.544466 4843 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ec08b84-0b49-4fa8-932e-2b18dc727e06-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.551317 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ec08b84-0b49-4fa8-932e-2b18dc727e06-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1ec08b84-0b49-4fa8-932e-2b18dc727e06" (UID: "1ec08b84-0b49-4fa8-932e-2b18dc727e06"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.646225 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ec08b84-0b49-4fa8-932e-2b18dc727e06-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.989192 4843 status_manager.go:851] "Failed to get status for pod" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.989597 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.989818 4843 status_manager.go:851] "Failed to get status for pod" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" pod="openshift-marketplace/redhat-operators-mrwd9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mrwd9\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.990034 4843 status_manager.go:851] "Failed to get status for pod" podUID="feb1dd70-302b-4217-b17c-211aea971073" pod="openshift-marketplace/redhat-operators-dgr8z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-dgr8z\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.990281 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.990538 4843 status_manager.go:851] "Failed to get status for pod" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" pod="openshift-marketplace/redhat-marketplace-52bm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-52bm4\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.995441 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.996305 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.996938 4843 status_manager.go:851] "Failed to get status for pod" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" pod="openshift-marketplace/redhat-marketplace-52bm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-52bm4\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.997288 4843 status_manager.go:851] "Failed to get status for pod" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.997595 4843 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.997889 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.998164 4843 status_manager.go:851] "Failed to get status for pod" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" pod="openshift-marketplace/redhat-operators-mrwd9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mrwd9\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.998426 4843 status_manager.go:851] "Failed to get status for pod" podUID="feb1dd70-302b-4217-b17c-211aea971073" pod="openshift-marketplace/redhat-operators-dgr8z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-dgr8z\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:36 crc kubenswrapper[4843]: I0318 12:15:36.998644 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.051386 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.051488 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.051568 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.051620 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.051634 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.051763 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.051918 4843 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.051941 4843 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.051951 4843 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.080322 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1ec08b84-0b49-4fa8-932e-2b18dc727e06","Type":"ContainerDied","Data":"df650527a6cff843fc27e6913b44cc8af8ba92a7fbeefec81240fe9bd94d61db"}
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.080594 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df650527a6cff843fc27e6913b44cc8af8ba92a7fbeefec81240fe9bd94d61db"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.080360 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.083981 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.084683 4843 status_manager.go:851] "Failed to get status for pod" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" pod="openshift-marketplace/redhat-marketplace-52bm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-52bm4\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.084860 4843 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619" exitCode=0
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.084936 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.085012 4843 scope.go:117] "RemoveContainer" containerID="995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.085208 4843 status_manager.go:851] "Failed to get status for pod" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.085512 4843 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.085711 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.085937 4843 status_manager.go:851] "Failed to get status for pod" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" pod="openshift-marketplace/redhat-operators-mrwd9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mrwd9\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.086106 4843 status_manager.go:851] "Failed to get status for pod" podUID="feb1dd70-302b-4217-b17c-211aea971073" pod="openshift-marketplace/redhat-operators-dgr8z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-dgr8z\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.086427 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.102413 4843 status_manager.go:851] "Failed to get status for pod" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.102916 4843 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.103305 4843 scope.go:117] "RemoveContainer" containerID="ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.103443 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.103830 4843 status_manager.go:851] "Failed to get status for pod" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" pod="openshift-marketplace/redhat-operators-mrwd9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mrwd9\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.104046 4843 status_manager.go:851] "Failed to get status for pod" podUID="feb1dd70-302b-4217-b17c-211aea971073" pod="openshift-marketplace/redhat-operators-dgr8z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-dgr8z\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.104240 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.104468 4843 status_manager.go:851] "Failed to get status for pod" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" pod="openshift-marketplace/redhat-marketplace-52bm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-52bm4\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.119110 4843 scope.go:117] "RemoveContainer" containerID="6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.134678 4843 scope.go:117] "RemoveContainer" containerID="0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.147601 4843 scope.go:117] "RemoveContainer" containerID="f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.164117 4843 scope.go:117] "RemoveContainer" containerID="2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.180087 4843 scope.go:117] "RemoveContainer" containerID="995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734"
Mar 18 12:15:37 crc kubenswrapper[4843]: E0318 12:15:37.180637 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\": container with ID starting with 995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734 not found: ID does not exist" containerID="995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.180705 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734"} err="failed to get container status \"995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\": rpc error: code = NotFound desc = could not find container \"995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734\": container with ID starting with 995451006bc2b5ff05859a61cc63a3924c53b69bcb165ba737d76c24606e2734 not found: ID does not exist"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.180752 4843 scope.go:117] "RemoveContainer" containerID="ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05"
Mar 18 12:15:37 crc kubenswrapper[4843]: E0318 12:15:37.181834 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\": container with ID starting with ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05 not found: ID does not exist" containerID="ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.181900 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05"} err="failed to get container status \"ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\": rpc error: code = NotFound desc = could not find container \"ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05\": container with ID starting with ca494800ba63fb83a4dee122637b05db3a7e5497a829ee9da1bc30df3ad22f05 not found: ID does not exist"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.181932 4843 scope.go:117] "RemoveContainer" containerID="6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e"
Mar 18 12:15:37 crc kubenswrapper[4843]: E0318 12:15:37.182244 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\": container with ID starting with 6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e not found: ID does not exist" containerID="6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.182274 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e"} err="failed to get container status \"6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\": rpc error: code = NotFound desc = could not find container \"6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e\": container with ID starting with 6acd6dabcb88b7f7b5f60d237e4c90a12778edb0e7032742ab08c3156df33a0e not found: ID does not exist"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.182290 4843 scope.go:117] "RemoveContainer" containerID="0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889"
Mar 18 12:15:37 crc kubenswrapper[4843]: E0318 12:15:37.182684 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\": container with ID starting with 0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889 not found: ID does not exist" containerID="0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.182718 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889"} err="failed to get container status \"0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\": rpc error: code = NotFound desc = could not find container \"0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889\": container with ID starting with 0ea1d73a107882bfad3a3df2c5b840429b98d61810f26c6ce95e8712e20e2889 not found: ID does not exist"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.182737 4843 scope.go:117] "RemoveContainer" containerID="f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619"
Mar 18 12:15:37 crc kubenswrapper[4843]: E0318 12:15:37.183028 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\": container with ID starting with f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619 not found: ID does not exist" containerID="f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.183056 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619"} err="failed to get container status \"f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\": rpc error: code = NotFound desc = could not find container \"f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619\": container with ID starting with f02ce5b8c9e76a4657a1b10dfc411124d1ee78650f2f41b32f95794f92d4d619 not found: ID does not exist"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.183073 4843 scope.go:117] "RemoveContainer" containerID="2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8"
Mar 18 12:15:37 crc kubenswrapper[4843]: E0318 12:15:37.184108 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\": container with ID starting with 2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8 not found: ID does not exist" containerID="2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.184178 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8"} err="failed to get container status \"2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\": rpc error: code = NotFound desc = could not find container \"2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8\": container with ID starting with 2237133ebd4da97cc22fe3d945ec85e5113f33d066d1fa81663723ab9ee6c8c8 not found: ID does not exist"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.194181 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-52bm4"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.194841 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.195125 4843 status_manager.go:851] "Failed to get status for pod" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" pod="openshift-marketplace/redhat-marketplace-52bm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-52bm4\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.195400 4843 status_manager.go:851] "Failed to get status for pod" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.195643 4843 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.195928 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.196177 4843 status_manager.go:851] "Failed to get status for pod" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" pod="openshift-marketplace/redhat-operators-mrwd9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mrwd9\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:37 crc kubenswrapper[4843]: I0318 12:15:37.196543 4843 status_manager.go:851] "Failed to get status for pod" podUID="feb1dd70-302b-4217-b17c-211aea971073" pod="openshift-marketplace/redhat-operators-dgr8z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-dgr8z\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:38 crc kubenswrapper[4843]: I0318 12:15:38.996053 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Mar 18 12:15:40 crc kubenswrapper[4843]: E0318 12:15:40.786760 4843 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.205:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189dee91070ffaeb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:15:34.909999851 +0000 UTC m=+368.625825375,LastTimestamp:2026-03-18 12:15:34.909999851 +0000 UTC m=+368.625825375,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 12:15:43 crc kubenswrapper[4843]: E0318 12:15:43.415505 4843 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:43 crc kubenswrapper[4843]: E0318 12:15:43.416028 4843 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:43 crc kubenswrapper[4843]: E0318 12:15:43.416301 4843 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:43 crc kubenswrapper[4843]: E0318 12:15:43.416576 4843 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:43 crc kubenswrapper[4843]: E0318 12:15:43.416874 4843 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:43 crc kubenswrapper[4843]: I0318 12:15:43.416906 4843 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 18 12:15:43 crc kubenswrapper[4843]: E0318 12:15:43.417233 4843 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.205:6443: connect: connection refused" interval="200ms"
Mar 18 12:15:43 crc kubenswrapper[4843]: E0318 12:15:43.618610 4843 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.205:6443: connect: connection refused" interval="400ms"
Mar 18 12:15:44 crc kubenswrapper[4843]: E0318 12:15:44.019926 4843 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.205:6443: connect: connection refused" interval="800ms"
Mar 18 12:15:44 crc kubenswrapper[4843]: E0318 12:15:44.821912 4843 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.205:6443: connect: connection refused" interval="1.6s"
Mar 18 12:15:45 crc kubenswrapper[4843]: I0318 12:15:45.504828 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 12:15:45 crc kubenswrapper[4843]: I0318 12:15:45.506989 4843 status_manager.go:851] "Failed to get status for pod" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.205:6443: connect: connection refused"
Mar 18 12:15:45 crc kubenswrapper[4843]: I0318 12:15:45.507196 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2"
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:45 crc kubenswrapper[4843]: I0318 12:15:45.507455 4843 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:45 crc kubenswrapper[4843]: I0318 12:15:45.507788 4843 status_manager.go:851] "Failed to get status for pod" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" pod="openshift-marketplace/redhat-operators-mrwd9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mrwd9\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:45 crc kubenswrapper[4843]: I0318 12:15:45.508480 4843 status_manager.go:851] "Failed to get status for pod" podUID="feb1dd70-302b-4217-b17c-211aea971073" pod="openshift-marketplace/redhat-operators-dgr8z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-dgr8z\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:45 crc kubenswrapper[4843]: I0318 12:15:45.508724 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:45 crc kubenswrapper[4843]: I0318 12:15:45.509035 4843 status_manager.go:851] "Failed to get status for pod" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" 
pod="openshift-marketplace/redhat-marketplace-52bm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-52bm4\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:46 crc kubenswrapper[4843]: E0318 12:15:46.424068 4843 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.205:6443: connect: connection refused" interval="3.2s" Mar 18 12:15:46 crc kubenswrapper[4843]: I0318 12:15:46.987300 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:46 crc kubenswrapper[4843]: I0318 12:15:46.987844 4843 status_manager.go:851] "Failed to get status for pod" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" pod="openshift-marketplace/redhat-marketplace-52bm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-52bm4\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:46 crc kubenswrapper[4843]: I0318 12:15:46.988275 4843 status_manager.go:851] "Failed to get status for pod" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:46 crc kubenswrapper[4843]: I0318 12:15:46.988763 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:46 crc kubenswrapper[4843]: I0318 12:15:46.989223 4843 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:46 crc kubenswrapper[4843]: I0318 12:15:46.989695 4843 status_manager.go:851] "Failed to get status for pod" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" pod="openshift-marketplace/redhat-operators-mrwd9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mrwd9\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:46 crc kubenswrapper[4843]: I0318 12:15:46.990051 4843 status_manager.go:851] "Failed to get status for pod" podUID="feb1dd70-302b-4217-b17c-211aea971073" pod="openshift-marketplace/redhat-operators-dgr8z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-dgr8z\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:48 crc kubenswrapper[4843]: I0318 12:15:48.983523 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:48 crc kubenswrapper[4843]: I0318 12:15:48.985364 4843 status_manager.go:851] "Failed to get status for pod" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" pod="openshift-marketplace/redhat-operators-mrwd9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mrwd9\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:48 crc kubenswrapper[4843]: I0318 12:15:48.986711 4843 status_manager.go:851] "Failed to get status for pod" podUID="feb1dd70-302b-4217-b17c-211aea971073" pod="openshift-marketplace/redhat-operators-dgr8z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-dgr8z\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:48 crc kubenswrapper[4843]: I0318 12:15:48.987353 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:48 crc kubenswrapper[4843]: I0318 12:15:48.988232 4843 status_manager.go:851] "Failed to get status for pod" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" pod="openshift-marketplace/redhat-marketplace-52bm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-52bm4\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:48 crc kubenswrapper[4843]: I0318 12:15:48.988860 4843 status_manager.go:851] "Failed to get status for pod" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": 
dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:48 crc kubenswrapper[4843]: I0318 12:15:48.989273 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:48 crc kubenswrapper[4843]: I0318 12:15:48.989632 4843 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:48 crc kubenswrapper[4843]: I0318 12:15:48.999479 4843 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f49b56c-dbaf-4a37-9508-d8e894da9149" Mar 18 12:15:48 crc kubenswrapper[4843]: I0318 12:15:48.999519 4843 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f49b56c-dbaf-4a37-9508-d8e894da9149" Mar 18 12:15:49 crc kubenswrapper[4843]: E0318 12:15:49.000616 4843 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:49 crc kubenswrapper[4843]: I0318 12:15:49.001192 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:49 crc kubenswrapper[4843]: I0318 12:15:49.159348 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"220ca399a9552e2fb8f352922f37c9dc1122d57b4ec680df4ee2d8509fea9210"} Mar 18 12:15:49 crc kubenswrapper[4843]: I0318 12:15:49.161887 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 18 12:15:49 crc kubenswrapper[4843]: I0318 12:15:49.163221 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 12:15:49 crc kubenswrapper[4843]: I0318 12:15:49.163286 4843 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="07fbcf3bceb61d275b47d4d252105653eb996aa8158391b964025a10ba90a75e" exitCode=1 Mar 18 12:15:49 crc kubenswrapper[4843]: I0318 12:15:49.163335 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"07fbcf3bceb61d275b47d4d252105653eb996aa8158391b964025a10ba90a75e"} Mar 18 12:15:49 crc kubenswrapper[4843]: I0318 12:15:49.164066 4843 scope.go:117] "RemoveContainer" containerID="07fbcf3bceb61d275b47d4d252105653eb996aa8158391b964025a10ba90a75e" Mar 18 12:15:49 crc kubenswrapper[4843]: I0318 12:15:49.164530 4843 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:49 crc kubenswrapper[4843]: I0318 12:15:49.165027 4843 status_manager.go:851] "Failed to get status for pod" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:49 crc kubenswrapper[4843]: I0318 12:15:49.165563 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:49 crc kubenswrapper[4843]: I0318 12:15:49.165957 4843 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:49 crc kubenswrapper[4843]: I0318 12:15:49.166412 4843 status_manager.go:851] "Failed to get status for pod" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" pod="openshift-marketplace/redhat-operators-mrwd9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mrwd9\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:49 crc kubenswrapper[4843]: I0318 12:15:49.166737 4843 status_manager.go:851] "Failed to get status for pod" podUID="feb1dd70-302b-4217-b17c-211aea971073" pod="openshift-marketplace/redhat-operators-dgr8z" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-dgr8z\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:49 crc kubenswrapper[4843]: I0318 12:15:49.166978 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:49 crc kubenswrapper[4843]: I0318 12:15:49.167213 4843 status_manager.go:851] "Failed to get status for pod" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" pod="openshift-marketplace/redhat-marketplace-52bm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-52bm4\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:49 crc kubenswrapper[4843]: E0318 12:15:49.625448 4843 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.205:6443: connect: connection refused" interval="6.4s" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.172092 4843 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="9829345f0eacd912ed87619f8ec3b38ecdfe6049fe629afb0f7c38388a411f0f" exitCode=0 Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.172174 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"9829345f0eacd912ed87619f8ec3b38ecdfe6049fe629afb0f7c38388a411f0f"} Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.172407 4843 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f49b56c-dbaf-4a37-9508-d8e894da9149" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.172433 4843 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f49b56c-dbaf-4a37-9508-d8e894da9149" Mar 18 12:15:50 crc kubenswrapper[4843]: E0318 12:15:50.172897 4843 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.173109 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.173382 4843 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.173611 4843 status_manager.go:851] "Failed to get status for pod" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" pod="openshift-marketplace/redhat-operators-mrwd9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mrwd9\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.173888 4843 status_manager.go:851] "Failed to get status for pod" 
podUID="feb1dd70-302b-4217-b17c-211aea971073" pod="openshift-marketplace/redhat-operators-dgr8z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-dgr8z\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.174138 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.174448 4843 status_manager.go:851] "Failed to get status for pod" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" pod="openshift-marketplace/redhat-marketplace-52bm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-52bm4\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.174540 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.174969 4843 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.175277 4843 status_manager.go:851] "Failed to get status for pod" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.176001 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.176047 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a87ccf378b547d92535ed7ed5ae77ee1a88669e505741116a4fdd99f3bc192d5"} Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.176603 4843 status_manager.go:851] "Failed to get status for pod" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" pod="openshift-marketplace/redhat-operators-mrwd9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mrwd9\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.177096 4843 status_manager.go:851] "Failed to get status for pod" podUID="feb1dd70-302b-4217-b17c-211aea971073" pod="openshift-marketplace/redhat-operators-dgr8z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-dgr8z\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.177485 4843 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 
12:15:50.177800 4843 status_manager.go:851] "Failed to get status for pod" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" pod="openshift-marketplace/redhat-marketplace-52bm4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-52bm4\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.178171 4843 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.178683 4843 status_manager.go:851] "Failed to get status for pod" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.179271 4843 status_manager.go:851] "Failed to get status for pod" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" pod="openshift-marketplace/certified-operators-t62l2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-t62l2\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:50 crc kubenswrapper[4843]: I0318 12:15:50.179880 4843 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.129.56.205:6443: connect: connection refused" Mar 18 12:15:51 crc 
kubenswrapper[4843]: I0318 12:15:51.186953 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0d9288215d659f3b37ea5532878638504a32346da694b32451948628408c8542"} Mar 18 12:15:51 crc kubenswrapper[4843]: I0318 12:15:51.187243 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"29277fa22cc678dfdaa5bf0642da00f8e4c33b14cd6f8180134298e37ebab02f"} Mar 18 12:15:51 crc kubenswrapper[4843]: I0318 12:15:51.187253 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"45aebd7bf107a19b2bbdd7c7eb9dc99d4fa7eb67295704ff66278fafab7f112b"} Mar 18 12:15:51 crc kubenswrapper[4843]: I0318 12:15:51.561359 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" podUID="d23cfc00-6762-41fb-bf10-e8aa0eda250b" containerName="oauth-openshift" containerID="cri-o://247bc2a7727d0ad2ea7fc84497909b28d97d02d823d34e3a7070a26ad920e270" gracePeriod=15 Mar 18 12:15:51 crc kubenswrapper[4843]: I0318 12:15:51.859258 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.050349 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hllcc"
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.154238 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d23cfc00-6762-41fb-bf10-e8aa0eda250b-audit-dir\") pod \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") "
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.154338 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d23cfc00-6762-41fb-bf10-e8aa0eda250b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d23cfc00-6762-41fb-bf10-e8aa0eda250b" (UID: "d23cfc00-6762-41fb-bf10-e8aa0eda250b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.154345 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-router-certs\") pod \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") "
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.154376 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-ocp-branding-template\") pod \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") "
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.154421 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-template-login\") pod \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") "
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.154450 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-service-ca\") pod \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") "
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.154468 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-trusted-ca-bundle\") pod \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") "
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.154505 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-template-error\") pod \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") "
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.154553 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-idp-0-file-data\") pod \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") "
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.154571 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmfwp\" (UniqueName: \"kubernetes.io/projected/d23cfc00-6762-41fb-bf10-e8aa0eda250b-kube-api-access-hmfwp\") pod \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") "
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.154596 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-cliconfig\") pod \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") "
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.154613 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-template-provider-selection\") pod \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") "
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.154635 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-session\") pod \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") "
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.154674 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-serving-cert\") pod \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") "
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.154694 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-audit-policies\") pod \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\" (UID: \"d23cfc00-6762-41fb-bf10-e8aa0eda250b\") "
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.155008 4843 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d23cfc00-6762-41fb-bf10-e8aa0eda250b-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.155175 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d23cfc00-6762-41fb-bf10-e8aa0eda250b" (UID: "d23cfc00-6762-41fb-bf10-e8aa0eda250b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.155364 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "d23cfc00-6762-41fb-bf10-e8aa0eda250b" (UID: "d23cfc00-6762-41fb-bf10-e8aa0eda250b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.155597 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d23cfc00-6762-41fb-bf10-e8aa0eda250b" (UID: "d23cfc00-6762-41fb-bf10-e8aa0eda250b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.155709 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "d23cfc00-6762-41fb-bf10-e8aa0eda250b" (UID: "d23cfc00-6762-41fb-bf10-e8aa0eda250b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.163320 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d23cfc00-6762-41fb-bf10-e8aa0eda250b" (UID: "d23cfc00-6762-41fb-bf10-e8aa0eda250b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.166414 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d23cfc00-6762-41fb-bf10-e8aa0eda250b-kube-api-access-hmfwp" (OuterVolumeSpecName: "kube-api-access-hmfwp") pod "d23cfc00-6762-41fb-bf10-e8aa0eda250b" (UID: "d23cfc00-6762-41fb-bf10-e8aa0eda250b"). InnerVolumeSpecName "kube-api-access-hmfwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.166467 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d23cfc00-6762-41fb-bf10-e8aa0eda250b" (UID: "d23cfc00-6762-41fb-bf10-e8aa0eda250b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.185635 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d23cfc00-6762-41fb-bf10-e8aa0eda250b" (UID: "d23cfc00-6762-41fb-bf10-e8aa0eda250b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.185671 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d23cfc00-6762-41fb-bf10-e8aa0eda250b" (UID: "d23cfc00-6762-41fb-bf10-e8aa0eda250b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.185831 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "d23cfc00-6762-41fb-bf10-e8aa0eda250b" (UID: "d23cfc00-6762-41fb-bf10-e8aa0eda250b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.186315 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d23cfc00-6762-41fb-bf10-e8aa0eda250b" (UID: "d23cfc00-6762-41fb-bf10-e8aa0eda250b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.186823 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d23cfc00-6762-41fb-bf10-e8aa0eda250b" (UID: "d23cfc00-6762-41fb-bf10-e8aa0eda250b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.186914 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d23cfc00-6762-41fb-bf10-e8aa0eda250b" (UID: "d23cfc00-6762-41fb-bf10-e8aa0eda250b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.196636 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c73300a1046f964a9e4138173e9bba6cc46c626be29950b3ad6125b8a00ed38f"}
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.197544 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9a4c91701913aadfa1e0acac6e41be62567810f705880fdbe5accd8a3df07da3"}
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.196925 4843 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f49b56c-dbaf-4a37-9508-d8e894da9149"
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.197628 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.197726 4843 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f49b56c-dbaf-4a37-9508-d8e894da9149"
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.198234 4843 generic.go:334] "Generic (PLEG): container finished" podID="d23cfc00-6762-41fb-bf10-e8aa0eda250b" containerID="247bc2a7727d0ad2ea7fc84497909b28d97d02d823d34e3a7070a26ad920e270" exitCode=0
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.198267 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" event={"ID":"d23cfc00-6762-41fb-bf10-e8aa0eda250b","Type":"ContainerDied","Data":"247bc2a7727d0ad2ea7fc84497909b28d97d02d823d34e3a7070a26ad920e270"}
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.198312 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hllcc"
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.198334 4843 scope.go:117] "RemoveContainer" containerID="247bc2a7727d0ad2ea7fc84497909b28d97d02d823d34e3a7070a26ad920e270"
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.198312 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hllcc" event={"ID":"d23cfc00-6762-41fb-bf10-e8aa0eda250b","Type":"ContainerDied","Data":"ccbe2ab738871631d1d9e6345d31623a613815921ac57258333ae6dff7fd4812"}
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.226845 4843 scope.go:117] "RemoveContainer" containerID="247bc2a7727d0ad2ea7fc84497909b28d97d02d823d34e3a7070a26ad920e270"
Mar 18 12:15:52 crc kubenswrapper[4843]: E0318 12:15:52.230793 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"247bc2a7727d0ad2ea7fc84497909b28d97d02d823d34e3a7070a26ad920e270\": container with ID starting with 247bc2a7727d0ad2ea7fc84497909b28d97d02d823d34e3a7070a26ad920e270 not found: ID does not exist" containerID="247bc2a7727d0ad2ea7fc84497909b28d97d02d823d34e3a7070a26ad920e270"
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.230842 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"247bc2a7727d0ad2ea7fc84497909b28d97d02d823d34e3a7070a26ad920e270"} err="failed to get container status \"247bc2a7727d0ad2ea7fc84497909b28d97d02d823d34e3a7070a26ad920e270\": rpc error: code = NotFound desc = could not find container \"247bc2a7727d0ad2ea7fc84497909b28d97d02d823d34e3a7070a26ad920e270\": container with ID starting with 247bc2a7727d0ad2ea7fc84497909b28d97d02d823d34e3a7070a26ad920e270 not found: ID does not exist"
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.256058 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.256092 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.256105 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.256116 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.256125 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.256134 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.256143 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.256153 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmfwp\" (UniqueName: \"kubernetes.io/projected/d23cfc00-6762-41fb-bf10-e8aa0eda250b-kube-api-access-hmfwp\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.256162 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.256172 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.256183 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.256192 4843 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d23cfc00-6762-41fb-bf10-e8aa0eda250b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:52 crc kubenswrapper[4843]: I0318 12:15:52.256202 4843 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d23cfc00-6762-41fb-bf10-e8aa0eda250b-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:54 crc kubenswrapper[4843]: I0318 12:15:54.001766 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:15:54 crc kubenswrapper[4843]: I0318 12:15:54.002130 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:15:54 crc kubenswrapper[4843]: I0318 12:15:54.008052 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:15:56 crc kubenswrapper[4843]: I0318 12:15:56.185354 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 12:15:56 crc kubenswrapper[4843]: I0318 12:15:56.191033 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 12:15:57 crc kubenswrapper[4843]: I0318 12:15:57.324321 4843 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:15:57 crc kubenswrapper[4843]: I0318 12:15:57.957076 4843 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="845f5b1a-8af3-442d-80ab-a1ca5d2c6d29"
Mar 18 12:15:58 crc kubenswrapper[4843]: I0318 12:15:58.238797 4843 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f49b56c-dbaf-4a37-9508-d8e894da9149"
Mar 18 12:15:58 crc kubenswrapper[4843]: I0318 12:15:58.238833 4843 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f49b56c-dbaf-4a37-9508-d8e894da9149"
Mar 18 12:15:58 crc kubenswrapper[4843]: I0318 12:15:58.243191 4843 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="845f5b1a-8af3-442d-80ab-a1ca5d2c6d29"
Mar 18 12:15:58 crc kubenswrapper[4843]: I0318 12:15:58.243439 4843 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://45aebd7bf107a19b2bbdd7c7eb9dc99d4fa7eb67295704ff66278fafab7f112b"
Mar 18 12:15:58 crc kubenswrapper[4843]: I0318 12:15:58.243469 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:15:59 crc kubenswrapper[4843]: I0318 12:15:59.258524 4843 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f49b56c-dbaf-4a37-9508-d8e894da9149"
Mar 18 12:15:59 crc kubenswrapper[4843]: I0318 12:15:59.258573 4843 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f49b56c-dbaf-4a37-9508-d8e894da9149"
Mar 18 12:15:59 crc kubenswrapper[4843]: I0318 12:15:59.263888 4843 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="845f5b1a-8af3-442d-80ab-a1ca5d2c6d29"
Mar 18 12:16:01 crc kubenswrapper[4843]: I0318 12:16:01.861830 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 12:16:08 crc kubenswrapper[4843]: I0318 12:16:08.004464 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 18 12:16:08 crc kubenswrapper[4843]: I0318 12:16:08.115770 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 18 12:16:08 crc kubenswrapper[4843]: I0318 12:16:08.255027 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 18 12:16:08 crc kubenswrapper[4843]: I0318 12:16:08.643186 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 18 12:16:08 crc kubenswrapper[4843]: I0318 12:16:08.672855 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 18 12:16:08 crc kubenswrapper[4843]: I0318 12:16:08.834342 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 18 12:16:08 crc kubenswrapper[4843]: I0318 12:16:08.898173 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 18 12:16:09 crc kubenswrapper[4843]: I0318 12:16:09.060441 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 18 12:16:09 crc kubenswrapper[4843]: I0318 12:16:09.254046 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 18 12:16:09 crc kubenswrapper[4843]: I0318 12:16:09.330520 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 18 12:16:09 crc kubenswrapper[4843]: I0318 12:16:09.351881 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 18 12:16:09 crc kubenswrapper[4843]: I0318 12:16:09.483312 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 18 12:16:09 crc kubenswrapper[4843]: I0318 12:16:09.647812 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 18 12:16:09 crc kubenswrapper[4843]: I0318 12:16:09.683770 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 18 12:16:09 crc kubenswrapper[4843]: I0318 12:16:09.718623 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 18 12:16:10 crc kubenswrapper[4843]: I0318 12:16:10.163291 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 18 12:16:10 crc kubenswrapper[4843]: I0318 12:16:10.228036 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 18 12:16:10 crc kubenswrapper[4843]: I0318 12:16:10.316843 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 18 12:16:10 crc kubenswrapper[4843]: I0318 12:16:10.359517 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 18 12:16:10 crc kubenswrapper[4843]: I0318 12:16:10.528528 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 18 12:16:10 crc kubenswrapper[4843]: I0318 12:16:10.581644 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 18 12:16:10 crc kubenswrapper[4843]: I0318 12:16:10.672024 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 18 12:16:10 crc kubenswrapper[4843]: I0318 12:16:10.690425 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 18 12:16:10 crc kubenswrapper[4843]: I0318 12:16:10.945962 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 18 12:16:11 crc kubenswrapper[4843]: I0318 12:16:11.007137 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 18 12:16:11 crc kubenswrapper[4843]: I0318 12:16:11.033707 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 12:16:11 crc kubenswrapper[4843]: I0318 12:16:11.137447 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 18 12:16:11 crc kubenswrapper[4843]: I0318 12:16:11.207968 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 18 12:16:11 crc kubenswrapper[4843]: I0318 12:16:11.225751 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 18 12:16:11 crc kubenswrapper[4843]: I0318 12:16:11.346324 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 18 12:16:11 crc kubenswrapper[4843]: I0318 12:16:11.388982 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 18 12:16:11 crc kubenswrapper[4843]: I0318 12:16:11.398521 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 18 12:16:11 crc kubenswrapper[4843]: I0318 12:16:11.485499 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 18 12:16:11 crc kubenswrapper[4843]: I0318 12:16:11.588997 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 18 12:16:11 crc kubenswrapper[4843]: I0318 12:16:11.811006 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 18 12:16:11 crc kubenswrapper[4843]: I0318 12:16:11.880007 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 18 12:16:11 crc kubenswrapper[4843]: I0318 12:16:11.912697 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 18 12:16:11 crc kubenswrapper[4843]: I0318 12:16:11.988971 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 18 12:16:12 crc kubenswrapper[4843]: I0318 12:16:12.165016 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 18 12:16:12 crc kubenswrapper[4843]: I0318 12:16:12.264745 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 18 12:16:12 crc kubenswrapper[4843]: I0318 12:16:12.591468 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 18 12:16:12 crc kubenswrapper[4843]: I0318 12:16:12.599282 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 18 12:16:12 crc kubenswrapper[4843]: I0318 12:16:12.688538 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 18 12:16:12 crc kubenswrapper[4843]: I0318 12:16:12.694595 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 18 12:16:12 crc kubenswrapper[4843]: I0318 12:16:12.707283 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 18 12:16:12 crc kubenswrapper[4843]: I0318 12:16:12.786900 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 18 12:16:12 crc kubenswrapper[4843]: I0318 12:16:12.843672 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 18 12:16:12 crc kubenswrapper[4843]: I0318 12:16:12.865469 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 18 12:16:12 crc kubenswrapper[4843]: I0318 12:16:12.932915 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.039090 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.144106 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.187601 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.211414 4843 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.234502 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.346175 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.376422 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.381613 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.466487 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.497446 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.517914 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.632479 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.639858 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.697644 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.706711 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.743705 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.760732 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.827404 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.932109 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.948330 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.948767 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 18 12:16:13 crc kubenswrapper[4843]: I0318 12:16:13.975915 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.069111 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.186531 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.215868 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.230821 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.247982 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.257052 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.350199 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.376303 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.426096 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.515169 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.527336 4843 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.529317 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=40.529296646 podStartE2EDuration="40.529296646s" podCreationTimestamp="2026-03-18 12:15:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:15:57.50325496 +0000 UTC m=+391.219080484" watchObservedRunningTime="2026-03-18 12:16:14.529296646 +0000 UTC m=+408.245122190"
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.535584 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-hllcc"]
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.535981 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563936-xkdrn","openshift-kube-apiserver/kube-apiserver-crc"]
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.536510 4843 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f49b56c-dbaf-4a37-9508-d8e894da9149"
Mar 18 12:16:14 crc kubenswrapper[4843]: E0318 12:16:14.536744 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" containerName="installer"
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.536789 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" containerName="installer"
Mar 18 12:16:14 crc kubenswrapper[4843]: E0318 12:16:14.536811 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d23cfc00-6762-41fb-bf10-e8aa0eda250b" containerName="oauth-openshift"
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.536822 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d23cfc00-6762-41fb-bf10-e8aa0eda250b" containerName="oauth-openshift"
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.536969 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ec08b84-0b49-4fa8-932e-2b18dc727e06" containerName="installer"
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.536984 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d23cfc00-6762-41fb-bf10-e8aa0eda250b" containerName="oauth-openshift"
Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.537484 4843 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563936-xkdrn" Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.536754 4843 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3f49b56c-dbaf-4a37-9508-d8e894da9149" Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.539365 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.540039 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.542699 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.545240 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.557954 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.559372 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=17.559355252 podStartE2EDuration="17.559355252s" podCreationTimestamp="2026-03-18 12:15:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:16:14.556938583 +0000 UTC m=+408.272764107" watchObservedRunningTime="2026-03-18 12:16:14.559355252 +0000 UTC m=+408.275180776" Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.572596 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzbzd\" (UniqueName: 
\"kubernetes.io/projected/1c22c026-c039-414d-aa42-cfbcd1799c74-kube-api-access-zzbzd\") pod \"auto-csr-approver-29563936-xkdrn\" (UID: \"1c22c026-c039-414d-aa42-cfbcd1799c74\") " pod="openshift-infra/auto-csr-approver-29563936-xkdrn" Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.595642 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.626242 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.656810 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.674909 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzbzd\" (UniqueName: \"kubernetes.io/projected/1c22c026-c039-414d-aa42-cfbcd1799c74-kube-api-access-zzbzd\") pod \"auto-csr-approver-29563936-xkdrn\" (UID: \"1c22c026-c039-414d-aa42-cfbcd1799c74\") " pod="openshift-infra/auto-csr-approver-29563936-xkdrn" Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.695298 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzbzd\" (UniqueName: \"kubernetes.io/projected/1c22c026-c039-414d-aa42-cfbcd1799c74-kube-api-access-zzbzd\") pod \"auto-csr-approver-29563936-xkdrn\" (UID: \"1c22c026-c039-414d-aa42-cfbcd1799c74\") " pod="openshift-infra/auto-csr-approver-29563936-xkdrn" Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.748029 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.775732 4843 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.857844 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563936-xkdrn" Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.864943 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 12:16:14 crc kubenswrapper[4843]: I0318 12:16:14.997233 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d23cfc00-6762-41fb-bf10-e8aa0eda250b" path="/var/lib/kubelet/pods/d23cfc00-6762-41fb-bf10-e8aa0eda250b/volumes" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.005119 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.007092 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.109305 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.159950 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.193165 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.220372 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.247356 4843 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.285282 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563936-xkdrn"] Mar 18 12:16:15 crc kubenswrapper[4843]: W0318 12:16:15.289393 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c22c026_c039_414d_aa42_cfbcd1799c74.slice/crio-c1ca1c49dd8a4fb6af01d57656dc94a0038c4ab187e7cc0d9d8e714f2de565bc WatchSource:0}: Error finding container c1ca1c49dd8a4fb6af01d57656dc94a0038c4ab187e7cc0d9d8e714f2de565bc: Status 404 returned error can't find the container with id c1ca1c49dd8a4fb6af01d57656dc94a0038c4ab187e7cc0d9d8e714f2de565bc Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.293060 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.300903 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.326069 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.353568 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563936-xkdrn" event={"ID":"1c22c026-c039-414d-aa42-cfbcd1799c74","Type":"ContainerStarted","Data":"c1ca1c49dd8a4fb6af01d57656dc94a0038c4ab187e7cc0d9d8e714f2de565bc"} Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.371628 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.401440 4843 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.406054 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.417804 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.462445 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.475160 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.565930 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.589714 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.694385 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.718719 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.723936 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.826608 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.835958 4843 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 12:16:15 crc kubenswrapper[4843]: I0318 12:16:15.915232 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.030038 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.037622 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.086791 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.105198 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.119840 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.352022 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.401073 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.407061 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.488450 4843 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 
12:16:16.491470 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.504377 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.518241 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.669109 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"] Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.669810 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.676199 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.676442 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.676512 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.676597 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.676643 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.677095 4843 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.677382 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.677579 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.677978 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.678070 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.678402 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.678522 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.684122 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.686797 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.686996 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"] Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.688297 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 12:16:16 crc 
kubenswrapper[4843]: I0318 12:16:16.694282 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.736568 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.811585 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-user-template-error\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.811638 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.811685 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-service-ca\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.811714 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.811735 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-user-template-login\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.811758 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.811815 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.811859 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4bb5625-fb97-4fb1-aa84-98731c796ce4-audit-dir\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: 
\"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.811884 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-router-certs\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.811990 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.812016 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mv5n\" (UniqueName: \"kubernetes.io/projected/f4bb5625-fb97-4fb1-aa84-98731c796ce4-kube-api-access-5mv5n\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.812038 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f4bb5625-fb97-4fb1-aa84-98731c796ce4-audit-policies\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.812165 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.812234 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-session\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.819760 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.831338 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.833890 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.890914 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.913297 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4bb5625-fb97-4fb1-aa84-98731c796ce4-audit-dir\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 
12:16:16.913357 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-router-certs\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.913386 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.913405 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mv5n\" (UniqueName: \"kubernetes.io/projected/f4bb5625-fb97-4fb1-aa84-98731c796ce4-kube-api-access-5mv5n\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.913427 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f4bb5625-fb97-4fb1-aa84-98731c796ce4-audit-policies\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.913450 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.913469 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-session\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.913494 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-user-template-error\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.913510 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.913531 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-service-ca\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.913554 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.913572 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-user-template-login\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.913594 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.913609 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.914098 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4bb5625-fb97-4fb1-aa84-98731c796ce4-audit-dir\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.915802 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f4bb5625-fb97-4fb1-aa84-98731c796ce4-audit-policies\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.916287 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-service-ca\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.916381 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.917376 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.920513 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.921101 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-user-template-login\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.921886 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.922920 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-user-template-error\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.923298 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.930441 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-session\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.930927 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.931036 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-system-router-certs\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.933273 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.952406 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f4bb5625-fb97-4fb1-aa84-98731c796ce4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.952483 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mv5n\" (UniqueName: \"kubernetes.io/projected/f4bb5625-fb97-4fb1-aa84-98731c796ce4-kube-api-access-5mv5n\") pod \"oauth-openshift-f6658f7c8-d8sx7\" (UID: \"f4bb5625-fb97-4fb1-aa84-98731c796ce4\") " pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:16 crc kubenswrapper[4843]: I0318 12:16:16.972217 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.000725 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.007549 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.046760 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.087480 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.097208 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.213511 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.258078 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"]
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.349285 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.352349 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.369783 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" event={"ID":"f4bb5625-fb97-4fb1-aa84-98731c796ce4","Type":"ContainerStarted","Data":"34321f864548a701d38ab9f001bc7796b28ae54a37e6bdc9a28e72cdcc8ea0b0"}
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.373852 4843 generic.go:334] "Generic (PLEG): container finished" podID="1c22c026-c039-414d-aa42-cfbcd1799c74" containerID="36866618c1f9501a88cf88087a428b34a0727f9f12b83f17d7be3e7af7d7ca01" exitCode=0
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.373903 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563936-xkdrn" event={"ID":"1c22c026-c039-414d-aa42-cfbcd1799c74","Type":"ContainerDied","Data":"36866618c1f9501a88cf88087a428b34a0727f9f12b83f17d7be3e7af7d7ca01"}
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.439315 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.453410 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.620634 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.644320 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.752852 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.805044 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.898308 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 18 12:16:17 crc kubenswrapper[4843]: I0318 12:16:17.991567 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.024207 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.059111 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.125269 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.140328 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.142898 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.252988 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.262791 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.311439 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.521057 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.525533 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.525818 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.525977 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.529933 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.534419 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" event={"ID":"f4bb5625-fb97-4fb1-aa84-98731c796ce4","Type":"ContainerStarted","Data":"3d23f130944022f6ab8a1c6a763b05ae624ff2a8bae1131852cd6152dc8128c1"}
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.534791 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.535333 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.544761 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.564067 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-f6658f7c8-d8sx7" podStartSLOduration=52.5640413 podStartE2EDuration="52.5640413s" podCreationTimestamp="2026-03-18 12:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:16:18.562247031 +0000 UTC m=+412.278072555" watchObservedRunningTime="2026-03-18 12:16:18.5640413 +0000 UTC m=+412.279866844"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.670360 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.680272 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.764510 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.782329 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.785237 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.804025 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563936-xkdrn"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.842113 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzbzd\" (UniqueName: \"kubernetes.io/projected/1c22c026-c039-414d-aa42-cfbcd1799c74-kube-api-access-zzbzd\") pod \"1c22c026-c039-414d-aa42-cfbcd1799c74\" (UID: \"1c22c026-c039-414d-aa42-cfbcd1799c74\") "
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.847269 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c22c026-c039-414d-aa42-cfbcd1799c74-kube-api-access-zzbzd" (OuterVolumeSpecName: "kube-api-access-zzbzd") pod "1c22c026-c039-414d-aa42-cfbcd1799c74" (UID: "1c22c026-c039-414d-aa42-cfbcd1799c74"). InnerVolumeSpecName "kube-api-access-zzbzd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.871067 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.878605 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.927217 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.939003 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.944056 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzbzd\" (UniqueName: \"kubernetes.io/projected/1c22c026-c039-414d-aa42-cfbcd1799c74-kube-api-access-zzbzd\") on node \"crc\" DevicePath \"\""
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.953256 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 18 12:16:18 crc kubenswrapper[4843]: I0318 12:16:18.996115 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.097696 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.159207 4843 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.233729 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.340887 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.380402 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.407761 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.411856 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.441343 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.539913 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563936-xkdrn"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.539902 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563936-xkdrn" event={"ID":"1c22c026-c039-414d-aa42-cfbcd1799c74","Type":"ContainerDied","Data":"c1ca1c49dd8a4fb6af01d57656dc94a0038c4ab187e7cc0d9d8e714f2de565bc"}
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.540142 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1ca1c49dd8a4fb6af01d57656dc94a0038c4ab187e7cc0d9d8e714f2de565bc"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.554625 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.597781 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.641343 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.652871 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.667260 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.732303 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.733891 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.932398 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.932772 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 18 12:16:19 crc kubenswrapper[4843]: I0318 12:16:19.933068 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 18 12:16:20 crc kubenswrapper[4843]: I0318 12:16:20.010042 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 18 12:16:20 crc kubenswrapper[4843]: I0318 12:16:20.035803 4843 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 18 12:16:20 crc kubenswrapper[4843]: I0318 12:16:20.036045 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://30985177cb4103814d2cd4c61f95ad78901060bcca004ab786c7957c15ba1918" gracePeriod=5
Mar 18 12:16:20 crc kubenswrapper[4843]: I0318 12:16:20.104925 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 18 12:16:20 crc kubenswrapper[4843]: I0318 12:16:20.148924 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 18 12:16:20 crc kubenswrapper[4843]: I0318 12:16:20.152754 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 18 12:16:20 crc kubenswrapper[4843]: I0318 12:16:20.179869 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 18 12:16:20 crc kubenswrapper[4843]: I0318 12:16:20.253834 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 18 12:16:20 crc kubenswrapper[4843]: I0318 12:16:20.374426 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 18 12:16:20 crc kubenswrapper[4843]: I0318 12:16:20.508071 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 18 12:16:20 crc kubenswrapper[4843]: I0318 12:16:20.513156 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 18 12:16:20 crc kubenswrapper[4843]: I0318 12:16:20.588485 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 18 12:16:20 crc kubenswrapper[4843]: I0318 12:16:20.601547 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 18 12:16:20 crc kubenswrapper[4843]: I0318 12:16:20.822832 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 18 12:16:20 crc kubenswrapper[4843]: I0318 12:16:20.927767 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 18 12:16:21 crc kubenswrapper[4843]: I0318 12:16:21.130748 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 18 12:16:21 crc kubenswrapper[4843]: I0318 12:16:21.259965 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 18 12:16:21 crc kubenswrapper[4843]: I0318 12:16:21.539998 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 18 12:16:21 crc kubenswrapper[4843]: I0318 12:16:21.574042 4843 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 18 12:16:21 crc kubenswrapper[4843]: I0318 12:16:21.580278 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 18 12:16:21 crc kubenswrapper[4843]: I0318 12:16:21.634072 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 18 12:16:21 crc kubenswrapper[4843]: I0318 12:16:21.653805 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 18 12:16:21 crc kubenswrapper[4843]: I0318 12:16:21.678254 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 18 12:16:21 crc kubenswrapper[4843]: I0318 12:16:21.699250 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 18 12:16:21 crc kubenswrapper[4843]: I0318 12:16:21.846232 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 18 12:16:21 crc kubenswrapper[4843]: I0318 12:16:21.899542 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 18 12:16:22 crc kubenswrapper[4843]: I0318 12:16:22.154053 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 18 12:16:22 crc kubenswrapper[4843]: I0318 12:16:22.176428 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 18 12:16:22 crc kubenswrapper[4843]: I0318 12:16:22.243475 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 18 12:16:22 crc kubenswrapper[4843]: I0318 12:16:22.253923 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 18 12:16:22 crc kubenswrapper[4843]: I0318 12:16:22.275888 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 18 12:16:22 crc kubenswrapper[4843]: I0318 12:16:22.378487 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 18 12:16:22 crc kubenswrapper[4843]: I0318 12:16:22.401025 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 18 12:16:22 crc kubenswrapper[4843]: I0318 12:16:22.498406 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 18 12:16:22 crc kubenswrapper[4843]: I0318 12:16:22.500761 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 18 12:16:22 crc kubenswrapper[4843]: I0318 12:16:22.582570 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 18 12:16:22 crc kubenswrapper[4843]: I0318 12:16:22.598429 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 18 12:16:22 crc kubenswrapper[4843]: I0318 12:16:22.639021 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 18 12:16:22 crc kubenswrapper[4843]: I0318 12:16:22.703951 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 12:16:22 crc kubenswrapper[4843]: I0318 12:16:22.736274 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 18 12:16:23 crc kubenswrapper[4843]: I0318 12:16:23.554938 4843 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.203754 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.204013 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.334108 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.334191 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.334230 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.334267 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.334293 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.334366 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.334428 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.334458 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.334558 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.334639 4843 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.334673 4843 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.334685 4843 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.334696 4843 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.342216 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.435718 4843 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.592219 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.592300 4843 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="30985177cb4103814d2cd4c61f95ad78901060bcca004ab786c7957c15ba1918" exitCode=137
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.592351 4843 scope.go:117] "RemoveContainer" containerID="30985177cb4103814d2cd4c61f95ad78901060bcca004ab786c7957c15ba1918"
Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.592414 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.609982 4843 scope.go:117] "RemoveContainer" containerID="30985177cb4103814d2cd4c61f95ad78901060bcca004ab786c7957c15ba1918" Mar 18 12:16:25 crc kubenswrapper[4843]: E0318 12:16:25.610524 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30985177cb4103814d2cd4c61f95ad78901060bcca004ab786c7957c15ba1918\": container with ID starting with 30985177cb4103814d2cd4c61f95ad78901060bcca004ab786c7957c15ba1918 not found: ID does not exist" containerID="30985177cb4103814d2cd4c61f95ad78901060bcca004ab786c7957c15ba1918" Mar 18 12:16:25 crc kubenswrapper[4843]: I0318 12:16:25.610565 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30985177cb4103814d2cd4c61f95ad78901060bcca004ab786c7957c15ba1918"} err="failed to get container status \"30985177cb4103814d2cd4c61f95ad78901060bcca004ab786c7957c15ba1918\": rpc error: code = NotFound desc = could not find container \"30985177cb4103814d2cd4c61f95ad78901060bcca004ab786c7957c15ba1918\": container with ID starting with 30985177cb4103814d2cd4c61f95ad78901060bcca004ab786c7957c15ba1918 not found: ID does not exist" Mar 18 12:16:27 crc kubenswrapper[4843]: I0318 12:16:27.006245 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 18 12:16:27 crc kubenswrapper[4843]: I0318 12:16:27.006666 4843 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 18 12:16:27 crc kubenswrapper[4843]: I0318 12:16:27.017890 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 12:16:27 crc kubenswrapper[4843]: I0318 
12:16:27.017930 4843 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="26d115c2-4f44-4ee6-9134-89c986ec985c" Mar 18 12:16:27 crc kubenswrapper[4843]: I0318 12:16:27.022897 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 12:16:27 crc kubenswrapper[4843]: I0318 12:16:27.022934 4843 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="26d115c2-4f44-4ee6-9134-89c986ec985c" Mar 18 12:16:41 crc kubenswrapper[4843]: I0318 12:16:41.479288 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 12:16:42 crc kubenswrapper[4843]: I0318 12:16:42.114176 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 12:16:48 crc kubenswrapper[4843]: I0318 12:16:48.073856 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 12:16:52 crc kubenswrapper[4843]: I0318 12:16:52.243591 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 12:16:55 crc kubenswrapper[4843]: I0318 12:16:55.579164 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 12:16:57 crc kubenswrapper[4843]: I0318 12:16:57.211941 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 12:17:20 crc kubenswrapper[4843]: I0318 12:17:20.035765 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:17:20 crc kubenswrapper[4843]: I0318 12:17:20.036859 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:17:31 crc kubenswrapper[4843]: I0318 12:17:31.547036 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t62l2"] Mar 18 12:17:31 crc kubenswrapper[4843]: I0318 12:17:31.547736 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t62l2" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" containerName="registry-server" containerID="cri-o://35a8d8b8d8b0c2b7c4e0d4c2299a98969e95467b83da197d3a158508b3c0e1e4" gracePeriod=2 Mar 18 12:17:31 crc kubenswrapper[4843]: I0318 12:17:31.983397 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t62l2" Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.033506 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffad2911-2fae-4030-a3ad-d36a5f95fc07-utilities\") pod \"ffad2911-2fae-4030-a3ad-d36a5f95fc07\" (UID: \"ffad2911-2fae-4030-a3ad-d36a5f95fc07\") " Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.033569 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42kbw\" (UniqueName: \"kubernetes.io/projected/ffad2911-2fae-4030-a3ad-d36a5f95fc07-kube-api-access-42kbw\") pod \"ffad2911-2fae-4030-a3ad-d36a5f95fc07\" (UID: \"ffad2911-2fae-4030-a3ad-d36a5f95fc07\") " Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.033731 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffad2911-2fae-4030-a3ad-d36a5f95fc07-catalog-content\") pod \"ffad2911-2fae-4030-a3ad-d36a5f95fc07\" (UID: \"ffad2911-2fae-4030-a3ad-d36a5f95fc07\") " Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.034596 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffad2911-2fae-4030-a3ad-d36a5f95fc07-utilities" (OuterVolumeSpecName: "utilities") pod "ffad2911-2fae-4030-a3ad-d36a5f95fc07" (UID: "ffad2911-2fae-4030-a3ad-d36a5f95fc07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.047794 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffad2911-2fae-4030-a3ad-d36a5f95fc07-kube-api-access-42kbw" (OuterVolumeSpecName: "kube-api-access-42kbw") pod "ffad2911-2fae-4030-a3ad-d36a5f95fc07" (UID: "ffad2911-2fae-4030-a3ad-d36a5f95fc07"). InnerVolumeSpecName "kube-api-access-42kbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.094411 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffad2911-2fae-4030-a3ad-d36a5f95fc07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ffad2911-2fae-4030-a3ad-d36a5f95fc07" (UID: "ffad2911-2fae-4030-a3ad-d36a5f95fc07"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.135142 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffad2911-2fae-4030-a3ad-d36a5f95fc07-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.135173 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42kbw\" (UniqueName: \"kubernetes.io/projected/ffad2911-2fae-4030-a3ad-d36a5f95fc07-kube-api-access-42kbw\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.135182 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffad2911-2fae-4030-a3ad-d36a5f95fc07-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.522672 4843 generic.go:334] "Generic (PLEG): container finished" podID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" containerID="35a8d8b8d8b0c2b7c4e0d4c2299a98969e95467b83da197d3a158508b3c0e1e4" exitCode=0 Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.522727 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t62l2" event={"ID":"ffad2911-2fae-4030-a3ad-d36a5f95fc07","Type":"ContainerDied","Data":"35a8d8b8d8b0c2b7c4e0d4c2299a98969e95467b83da197d3a158508b3c0e1e4"} Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.522801 4843 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-t62l2" event={"ID":"ffad2911-2fae-4030-a3ad-d36a5f95fc07","Type":"ContainerDied","Data":"c810d230efbe57b5b85506c28282397425602a35cbf86d7794d21d0df1aac1c0"} Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.522799 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t62l2" Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.522822 4843 scope.go:117] "RemoveContainer" containerID="35a8d8b8d8b0c2b7c4e0d4c2299a98969e95467b83da197d3a158508b3c0e1e4" Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.548263 4843 scope.go:117] "RemoveContainer" containerID="9b0012ca8ef7feaaaec9d8495679246c1c5c9aeebf13472565bec94af511ea85" Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.562456 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t62l2"] Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.575620 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t62l2"] Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.580628 4843 scope.go:117] "RemoveContainer" containerID="71e453afbd09fc8b0f42a5ebd24e7bfe39d37bb2f7182a38d72b253b91b3c210" Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.596908 4843 scope.go:117] "RemoveContainer" containerID="35a8d8b8d8b0c2b7c4e0d4c2299a98969e95467b83da197d3a158508b3c0e1e4" Mar 18 12:17:32 crc kubenswrapper[4843]: E0318 12:17:32.597400 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35a8d8b8d8b0c2b7c4e0d4c2299a98969e95467b83da197d3a158508b3c0e1e4\": container with ID starting with 35a8d8b8d8b0c2b7c4e0d4c2299a98969e95467b83da197d3a158508b3c0e1e4 not found: ID does not exist" containerID="35a8d8b8d8b0c2b7c4e0d4c2299a98969e95467b83da197d3a158508b3c0e1e4" Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 
12:17:32.597448 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35a8d8b8d8b0c2b7c4e0d4c2299a98969e95467b83da197d3a158508b3c0e1e4"} err="failed to get container status \"35a8d8b8d8b0c2b7c4e0d4c2299a98969e95467b83da197d3a158508b3c0e1e4\": rpc error: code = NotFound desc = could not find container \"35a8d8b8d8b0c2b7c4e0d4c2299a98969e95467b83da197d3a158508b3c0e1e4\": container with ID starting with 35a8d8b8d8b0c2b7c4e0d4c2299a98969e95467b83da197d3a158508b3c0e1e4 not found: ID does not exist" Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.597485 4843 scope.go:117] "RemoveContainer" containerID="9b0012ca8ef7feaaaec9d8495679246c1c5c9aeebf13472565bec94af511ea85" Mar 18 12:17:32 crc kubenswrapper[4843]: E0318 12:17:32.598227 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b0012ca8ef7feaaaec9d8495679246c1c5c9aeebf13472565bec94af511ea85\": container with ID starting with 9b0012ca8ef7feaaaec9d8495679246c1c5c9aeebf13472565bec94af511ea85 not found: ID does not exist" containerID="9b0012ca8ef7feaaaec9d8495679246c1c5c9aeebf13472565bec94af511ea85" Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.598260 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b0012ca8ef7feaaaec9d8495679246c1c5c9aeebf13472565bec94af511ea85"} err="failed to get container status \"9b0012ca8ef7feaaaec9d8495679246c1c5c9aeebf13472565bec94af511ea85\": rpc error: code = NotFound desc = could not find container \"9b0012ca8ef7feaaaec9d8495679246c1c5c9aeebf13472565bec94af511ea85\": container with ID starting with 9b0012ca8ef7feaaaec9d8495679246c1c5c9aeebf13472565bec94af511ea85 not found: ID does not exist" Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.598286 4843 scope.go:117] "RemoveContainer" containerID="71e453afbd09fc8b0f42a5ebd24e7bfe39d37bb2f7182a38d72b253b91b3c210" Mar 18 12:17:32 crc 
kubenswrapper[4843]: E0318 12:17:32.598604 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71e453afbd09fc8b0f42a5ebd24e7bfe39d37bb2f7182a38d72b253b91b3c210\": container with ID starting with 71e453afbd09fc8b0f42a5ebd24e7bfe39d37bb2f7182a38d72b253b91b3c210 not found: ID does not exist" containerID="71e453afbd09fc8b0f42a5ebd24e7bfe39d37bb2f7182a38d72b253b91b3c210" Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.598630 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71e453afbd09fc8b0f42a5ebd24e7bfe39d37bb2f7182a38d72b253b91b3c210"} err="failed to get container status \"71e453afbd09fc8b0f42a5ebd24e7bfe39d37bb2f7182a38d72b253b91b3c210\": rpc error: code = NotFound desc = could not find container \"71e453afbd09fc8b0f42a5ebd24e7bfe39d37bb2f7182a38d72b253b91b3c210\": container with ID starting with 71e453afbd09fc8b0f42a5ebd24e7bfe39d37bb2f7182a38d72b253b91b3c210 not found: ID does not exist" Mar 18 12:17:32 crc kubenswrapper[4843]: I0318 12:17:32.992125 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" path="/var/lib/kubelet/pods/ffad2911-2fae-4030-a3ad-d36a5f95fc07/volumes" Mar 18 12:17:33 crc kubenswrapper[4843]: I0318 12:17:33.343284 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-52bm4"] Mar 18 12:17:33 crc kubenswrapper[4843]: I0318 12:17:33.343620 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-52bm4" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" containerName="registry-server" containerID="cri-o://a502b43887b50c3bdcb346481dd97e3e84845c8491bcde98c5b5d3d6db0d6777" gracePeriod=2 Mar 18 12:17:33 crc kubenswrapper[4843]: I0318 12:17:33.529204 4843 generic.go:334] "Generic (PLEG): container finished" podID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" 
containerID="a502b43887b50c3bdcb346481dd97e3e84845c8491bcde98c5b5d3d6db0d6777" exitCode=0 Mar 18 12:17:33 crc kubenswrapper[4843]: I0318 12:17:33.529304 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52bm4" event={"ID":"95b0d7ed-60bd-467a-a1b4-86f2c32096ab","Type":"ContainerDied","Data":"a502b43887b50c3bdcb346481dd97e3e84845c8491bcde98c5b5d3d6db0d6777"} Mar 18 12:17:33 crc kubenswrapper[4843]: I0318 12:17:33.709202 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-52bm4" Mar 18 12:17:33 crc kubenswrapper[4843]: I0318 12:17:33.856465 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95b0d7ed-60bd-467a-a1b4-86f2c32096ab-catalog-content\") pod \"95b0d7ed-60bd-467a-a1b4-86f2c32096ab\" (UID: \"95b0d7ed-60bd-467a-a1b4-86f2c32096ab\") " Mar 18 12:17:33 crc kubenswrapper[4843]: I0318 12:17:33.856526 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc4ck\" (UniqueName: \"kubernetes.io/projected/95b0d7ed-60bd-467a-a1b4-86f2c32096ab-kube-api-access-zc4ck\") pod \"95b0d7ed-60bd-467a-a1b4-86f2c32096ab\" (UID: \"95b0d7ed-60bd-467a-a1b4-86f2c32096ab\") " Mar 18 12:17:33 crc kubenswrapper[4843]: I0318 12:17:33.856592 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95b0d7ed-60bd-467a-a1b4-86f2c32096ab-utilities\") pod \"95b0d7ed-60bd-467a-a1b4-86f2c32096ab\" (UID: \"95b0d7ed-60bd-467a-a1b4-86f2c32096ab\") " Mar 18 12:17:33 crc kubenswrapper[4843]: I0318 12:17:33.859027 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95b0d7ed-60bd-467a-a1b4-86f2c32096ab-utilities" (OuterVolumeSpecName: "utilities") pod "95b0d7ed-60bd-467a-a1b4-86f2c32096ab" (UID: 
"95b0d7ed-60bd-467a-a1b4-86f2c32096ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:17:33 crc kubenswrapper[4843]: I0318 12:17:33.867558 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95b0d7ed-60bd-467a-a1b4-86f2c32096ab-kube-api-access-zc4ck" (OuterVolumeSpecName: "kube-api-access-zc4ck") pod "95b0d7ed-60bd-467a-a1b4-86f2c32096ab" (UID: "95b0d7ed-60bd-467a-a1b4-86f2c32096ab"). InnerVolumeSpecName "kube-api-access-zc4ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:17:33 crc kubenswrapper[4843]: I0318 12:17:33.893941 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95b0d7ed-60bd-467a-a1b4-86f2c32096ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95b0d7ed-60bd-467a-a1b4-86f2c32096ab" (UID: "95b0d7ed-60bd-467a-a1b4-86f2c32096ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:17:33 crc kubenswrapper[4843]: I0318 12:17:33.944874 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mrwd9"] Mar 18 12:17:33 crc kubenswrapper[4843]: I0318 12:17:33.945231 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mrwd9" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" containerName="registry-server" containerID="cri-o://394b6781b9a6b3fe002f89e9319bec82cf5b13ffd6a5c92c09102cb1dd5fe733" gracePeriod=2 Mar 18 12:17:33 crc kubenswrapper[4843]: I0318 12:17:33.958293 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95b0d7ed-60bd-467a-a1b4-86f2c32096ab-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:33 crc kubenswrapper[4843]: I0318 12:17:33.958335 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc4ck\" 
(UniqueName: \"kubernetes.io/projected/95b0d7ed-60bd-467a-a1b4-86f2c32096ab-kube-api-access-zc4ck\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:33 crc kubenswrapper[4843]: I0318 12:17:33.958352 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95b0d7ed-60bd-467a-a1b4-86f2c32096ab-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:34 crc kubenswrapper[4843]: I0318 12:17:34.553561 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-52bm4" Mar 18 12:17:34 crc kubenswrapper[4843]: I0318 12:17:34.553561 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52bm4" event={"ID":"95b0d7ed-60bd-467a-a1b4-86f2c32096ab","Type":"ContainerDied","Data":"7836da3f8aca48fd4cca7333f3b44430842c9618822020ea23eb7bf226156589"} Mar 18 12:17:34 crc kubenswrapper[4843]: I0318 12:17:34.553763 4843 scope.go:117] "RemoveContainer" containerID="a502b43887b50c3bdcb346481dd97e3e84845c8491bcde98c5b5d3d6db0d6777" Mar 18 12:17:34 crc kubenswrapper[4843]: I0318 12:17:34.559942 4843 generic.go:334] "Generic (PLEG): container finished" podID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" containerID="394b6781b9a6b3fe002f89e9319bec82cf5b13ffd6a5c92c09102cb1dd5fe733" exitCode=0 Mar 18 12:17:34 crc kubenswrapper[4843]: I0318 12:17:34.560010 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrwd9" event={"ID":"7aa438f5-fa8b-43f7-93b7-67e592ea698c","Type":"ContainerDied","Data":"394b6781b9a6b3fe002f89e9319bec82cf5b13ffd6a5c92c09102cb1dd5fe733"} Mar 18 12:17:34 crc kubenswrapper[4843]: I0318 12:17:34.580881 4843 scope.go:117] "RemoveContainer" containerID="ea2c6613dfdd829a4cd38d4f7015b18c4bc8c3ddbc488d4a7238901fbeac7989" Mar 18 12:17:34 crc kubenswrapper[4843]: I0318 12:17:34.615805 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-52bm4"] Mar 18 12:17:34 crc kubenswrapper[4843]: I0318 12:17:34.622150 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-52bm4"] Mar 18 12:17:34 crc kubenswrapper[4843]: I0318 12:17:34.656263 4843 scope.go:117] "RemoveContainer" containerID="7e43f0dd207ab0a15dbc5c3e5b7d992a7fcce8194846826e81d2935102b4b930" Mar 18 12:17:34 crc kubenswrapper[4843]: I0318 12:17:34.725924 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mrwd9" Mar 18 12:17:34 crc kubenswrapper[4843]: I0318 12:17:34.870940 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa438f5-fa8b-43f7-93b7-67e592ea698c-catalog-content\") pod \"7aa438f5-fa8b-43f7-93b7-67e592ea698c\" (UID: \"7aa438f5-fa8b-43f7-93b7-67e592ea698c\") " Mar 18 12:17:34 crc kubenswrapper[4843]: I0318 12:17:34.871011 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv5gs\" (UniqueName: \"kubernetes.io/projected/7aa438f5-fa8b-43f7-93b7-67e592ea698c-kube-api-access-vv5gs\") pod \"7aa438f5-fa8b-43f7-93b7-67e592ea698c\" (UID: \"7aa438f5-fa8b-43f7-93b7-67e592ea698c\") " Mar 18 12:17:34 crc kubenswrapper[4843]: I0318 12:17:34.871075 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa438f5-fa8b-43f7-93b7-67e592ea698c-utilities\") pod \"7aa438f5-fa8b-43f7-93b7-67e592ea698c\" (UID: \"7aa438f5-fa8b-43f7-93b7-67e592ea698c\") " Mar 18 12:17:34 crc kubenswrapper[4843]: I0318 12:17:34.872198 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aa438f5-fa8b-43f7-93b7-67e592ea698c-utilities" (OuterVolumeSpecName: "utilities") pod "7aa438f5-fa8b-43f7-93b7-67e592ea698c" (UID: 
"7aa438f5-fa8b-43f7-93b7-67e592ea698c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:17:34 crc kubenswrapper[4843]: I0318 12:17:34.877411 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aa438f5-fa8b-43f7-93b7-67e592ea698c-kube-api-access-vv5gs" (OuterVolumeSpecName: "kube-api-access-vv5gs") pod "7aa438f5-fa8b-43f7-93b7-67e592ea698c" (UID: "7aa438f5-fa8b-43f7-93b7-67e592ea698c"). InnerVolumeSpecName "kube-api-access-vv5gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:17:34 crc kubenswrapper[4843]: I0318 12:17:34.972603 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa438f5-fa8b-43f7-93b7-67e592ea698c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:34 crc kubenswrapper[4843]: I0318 12:17:34.972674 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv5gs\" (UniqueName: \"kubernetes.io/projected/7aa438f5-fa8b-43f7-93b7-67e592ea698c-kube-api-access-vv5gs\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:34 crc kubenswrapper[4843]: I0318 12:17:34.992225 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" path="/var/lib/kubelet/pods/95b0d7ed-60bd-467a-a1b4-86f2c32096ab/volumes" Mar 18 12:17:35 crc kubenswrapper[4843]: I0318 12:17:35.008068 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aa438f5-fa8b-43f7-93b7-67e592ea698c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7aa438f5-fa8b-43f7-93b7-67e592ea698c" (UID: "7aa438f5-fa8b-43f7-93b7-67e592ea698c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:17:35 crc kubenswrapper[4843]: I0318 12:17:35.073692 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa438f5-fa8b-43f7-93b7-67e592ea698c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:35 crc kubenswrapper[4843]: I0318 12:17:35.571828 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrwd9" event={"ID":"7aa438f5-fa8b-43f7-93b7-67e592ea698c","Type":"ContainerDied","Data":"be65cb05c3eed2feb6571f468eb09586652a93c779f4894d07f32893f50bb666"} Mar 18 12:17:35 crc kubenswrapper[4843]: I0318 12:17:35.571944 4843 scope.go:117] "RemoveContainer" containerID="394b6781b9a6b3fe002f89e9319bec82cf5b13ffd6a5c92c09102cb1dd5fe733" Mar 18 12:17:35 crc kubenswrapper[4843]: I0318 12:17:35.571954 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mrwd9" Mar 18 12:17:35 crc kubenswrapper[4843]: I0318 12:17:35.587171 4843 scope.go:117] "RemoveContainer" containerID="b802b2e548d9e3766d47ac21897325244d223dd2e4920b666b8542b21c723f89" Mar 18 12:17:35 crc kubenswrapper[4843]: I0318 12:17:35.603790 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mrwd9"] Mar 18 12:17:35 crc kubenswrapper[4843]: I0318 12:17:35.608385 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mrwd9"] Mar 18 12:17:35 crc kubenswrapper[4843]: I0318 12:17:35.629506 4843 scope.go:117] "RemoveContainer" containerID="297bf8b562fbb8821a53dd8001ec896cd26fdf2dae429dfa2d511194f3361d8d" Mar 18 12:17:36 crc kubenswrapper[4843]: I0318 12:17:36.991456 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" path="/var/lib/kubelet/pods/7aa438f5-fa8b-43f7-93b7-67e592ea698c/volumes" Mar 18 12:17:38 crc 
kubenswrapper[4843]: I0318 12:17:38.815810 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rfg7x"] Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.816629 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rfg7x" podUID="431329ed-c93d-4d44-bb49-ebed3a083f1a" containerName="registry-server" containerID="cri-o://6bf2ff7dca8731b8759c4bb84b80a1ed2dbc0edcf55da25949f1c59f43095601" gracePeriod=30 Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.830110 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lcsxl"] Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.830469 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lcsxl" podUID="3cef8c60-1bee-4798-9147-4d7f360ae4b2" containerName="registry-server" containerID="cri-o://83c7b78c077140dfa9f6d8fa459d5803c06cce60e24cf550f8839531bcd9ec8f" gracePeriod=30 Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.837258 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4fqc5"] Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.837589 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5" podUID="016cbd62-23a6-413f-82b5-b806746e2b01" containerName="marketplace-operator" containerID="cri-o://dcb1aacff14f14eda524bbbce7c735928a7013c6a883c2246818cf8b8b346e99" gracePeriod=30 Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.855551 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2p2b"] Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.855810 4843 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-j2p2b" podUID="0dc0843b-8f5c-434e-986e-3aab182caad3" containerName="registry-server" containerID="cri-o://fdb4a1b706c6bd18833808d3da2c60aa762b0b897e35392d8784f873901612b2" gracePeriod=30
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.859941 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dgr8z"]
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.860293 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dgr8z" podUID="feb1dd70-302b-4217-b17c-211aea971073" containerName="registry-server" containerID="cri-o://450609aa37892fe84c5d85575e46da946e0bd025d8a1067bcff4d2d7d873ca3d" gracePeriod=30
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.870199 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-x8vmv"]
Mar 18 12:17:38 crc kubenswrapper[4843]: E0318 12:17:38.870501 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" containerName="extract-content"
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.870517 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" containerName="extract-content"
Mar 18 12:17:38 crc kubenswrapper[4843]: E0318 12:17:38.870536 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" containerName="extract-content"
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.870543 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" containerName="extract-content"
Mar 18 12:17:38 crc kubenswrapper[4843]: E0318 12:17:38.870558 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" containerName="extract-content"
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.870565 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" containerName="extract-content"
Mar 18 12:17:38 crc kubenswrapper[4843]: E0318 12:17:38.870576 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c22c026-c039-414d-aa42-cfbcd1799c74" containerName="oc"
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.870583 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c22c026-c039-414d-aa42-cfbcd1799c74" containerName="oc"
Mar 18 12:17:38 crc kubenswrapper[4843]: E0318 12:17:38.870594 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.870602 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 18 12:17:38 crc kubenswrapper[4843]: E0318 12:17:38.870618 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" containerName="registry-server"
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.870624 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" containerName="registry-server"
Mar 18 12:17:38 crc kubenswrapper[4843]: E0318 12:17:38.870636 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" containerName="registry-server"
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.870643 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" containerName="registry-server"
Mar 18 12:17:38 crc kubenswrapper[4843]: E0318 12:17:38.870667 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" containerName="extract-utilities"
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.870676 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" containerName="extract-utilities"
Mar 18 12:17:38 crc kubenswrapper[4843]: E0318 12:17:38.870687 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" containerName="registry-server"
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.870695 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" containerName="registry-server"
Mar 18 12:17:38 crc kubenswrapper[4843]: E0318 12:17:38.870704 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" containerName="extract-utilities"
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.870712 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" containerName="extract-utilities"
Mar 18 12:17:38 crc kubenswrapper[4843]: E0318 12:17:38.870725 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" containerName="extract-utilities"
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.870732 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" containerName="extract-utilities"
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.870859 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="95b0d7ed-60bd-467a-a1b4-86f2c32096ab" containerName="registry-server"
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.870874 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c22c026-c039-414d-aa42-cfbcd1799c74" containerName="oc"
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.870888 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffad2911-2fae-4030-a3ad-d36a5f95fc07" containerName="registry-server"
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.870900 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.870911 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa438f5-fa8b-43f7-93b7-67e592ea698c" containerName="registry-server"
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.871449 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-x8vmv"
Mar 18 12:17:38 crc kubenswrapper[4843]: I0318 12:17:38.881188 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-x8vmv"]
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.596416 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05781490-ae8b-484f-9c0f-93409e2d5850-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-x8vmv\" (UID: \"05781490-ae8b-484f-9c0f-93409e2d5850\") " pod="openshift-marketplace/marketplace-operator-79b997595-x8vmv"
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.596787 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/05781490-ae8b-484f-9c0f-93409e2d5850-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-x8vmv\" (UID: \"05781490-ae8b-484f-9c0f-93409e2d5850\") " pod="openshift-marketplace/marketplace-operator-79b997595-x8vmv"
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.596973 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbqlq\" (UniqueName: \"kubernetes.io/projected/05781490-ae8b-484f-9c0f-93409e2d5850-kube-api-access-xbqlq\") pod \"marketplace-operator-79b997595-x8vmv\" (UID: \"05781490-ae8b-484f-9c0f-93409e2d5850\") " pod="openshift-marketplace/marketplace-operator-79b997595-x8vmv"
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.626116 4843 generic.go:334] "Generic (PLEG): container finished" podID="3cef8c60-1bee-4798-9147-4d7f360ae4b2" containerID="83c7b78c077140dfa9f6d8fa459d5803c06cce60e24cf550f8839531bcd9ec8f" exitCode=0
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.626189 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcsxl" event={"ID":"3cef8c60-1bee-4798-9147-4d7f360ae4b2","Type":"ContainerDied","Data":"83c7b78c077140dfa9f6d8fa459d5803c06cce60e24cf550f8839531bcd9ec8f"}
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.647303 4843 generic.go:334] "Generic (PLEG): container finished" podID="0dc0843b-8f5c-434e-986e-3aab182caad3" containerID="fdb4a1b706c6bd18833808d3da2c60aa762b0b897e35392d8784f873901612b2" exitCode=0
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.647374 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2p2b" event={"ID":"0dc0843b-8f5c-434e-986e-3aab182caad3","Type":"ContainerDied","Data":"fdb4a1b706c6bd18833808d3da2c60aa762b0b897e35392d8784f873901612b2"}
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.662959 4843 generic.go:334] "Generic (PLEG): container finished" podID="feb1dd70-302b-4217-b17c-211aea971073" containerID="450609aa37892fe84c5d85575e46da946e0bd025d8a1067bcff4d2d7d873ca3d" exitCode=0
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.663020 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dgr8z" event={"ID":"feb1dd70-302b-4217-b17c-211aea971073","Type":"ContainerDied","Data":"450609aa37892fe84c5d85575e46da946e0bd025d8a1067bcff4d2d7d873ca3d"}
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.664387 4843 generic.go:334] "Generic (PLEG): container finished" podID="016cbd62-23a6-413f-82b5-b806746e2b01" containerID="dcb1aacff14f14eda524bbbce7c735928a7013c6a883c2246818cf8b8b346e99" exitCode=0
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.664437 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5" event={"ID":"016cbd62-23a6-413f-82b5-b806746e2b01","Type":"ContainerDied","Data":"dcb1aacff14f14eda524bbbce7c735928a7013c6a883c2246818cf8b8b346e99"}
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.666486 4843 generic.go:334] "Generic (PLEG): container finished" podID="431329ed-c93d-4d44-bb49-ebed3a083f1a" containerID="6bf2ff7dca8731b8759c4bb84b80a1ed2dbc0edcf55da25949f1c59f43095601" exitCode=0
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.666540 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfg7x" event={"ID":"431329ed-c93d-4d44-bb49-ebed3a083f1a","Type":"ContainerDied","Data":"6bf2ff7dca8731b8759c4bb84b80a1ed2dbc0edcf55da25949f1c59f43095601"}
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.698669 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05781490-ae8b-484f-9c0f-93409e2d5850-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-x8vmv\" (UID: \"05781490-ae8b-484f-9c0f-93409e2d5850\") " pod="openshift-marketplace/marketplace-operator-79b997595-x8vmv"
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.698713 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/05781490-ae8b-484f-9c0f-93409e2d5850-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-x8vmv\" (UID: \"05781490-ae8b-484f-9c0f-93409e2d5850\") " pod="openshift-marketplace/marketplace-operator-79b997595-x8vmv"
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.698768 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbqlq\" (UniqueName: \"kubernetes.io/projected/05781490-ae8b-484f-9c0f-93409e2d5850-kube-api-access-xbqlq\") pod \"marketplace-operator-79b997595-x8vmv\" (UID: \"05781490-ae8b-484f-9c0f-93409e2d5850\") " pod="openshift-marketplace/marketplace-operator-79b997595-x8vmv"
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.700508 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05781490-ae8b-484f-9c0f-93409e2d5850-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-x8vmv\" (UID: \"05781490-ae8b-484f-9c0f-93409e2d5850\") " pod="openshift-marketplace/marketplace-operator-79b997595-x8vmv"
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.707488 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/05781490-ae8b-484f-9c0f-93409e2d5850-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-x8vmv\" (UID: \"05781490-ae8b-484f-9c0f-93409e2d5850\") " pod="openshift-marketplace/marketplace-operator-79b997595-x8vmv"
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.717556 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbqlq\" (UniqueName: \"kubernetes.io/projected/05781490-ae8b-484f-9c0f-93409e2d5850-kube-api-access-xbqlq\") pod \"marketplace-operator-79b997595-x8vmv\" (UID: \"05781490-ae8b-484f-9c0f-93409e2d5850\") " pod="openshift-marketplace/marketplace-operator-79b997595-x8vmv"
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.912987 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-x8vmv"
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.916213 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rfg7x"
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.921736 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lcsxl"
Mar 18 12:17:39 crc kubenswrapper[4843]: I0318 12:17:39.927822 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5"
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.104930 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/431329ed-c93d-4d44-bb49-ebed3a083f1a-catalog-content\") pod \"431329ed-c93d-4d44-bb49-ebed3a083f1a\" (UID: \"431329ed-c93d-4d44-bb49-ebed3a083f1a\") "
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.104977 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/016cbd62-23a6-413f-82b5-b806746e2b01-marketplace-operator-metrics\") pod \"016cbd62-23a6-413f-82b5-b806746e2b01\" (UID: \"016cbd62-23a6-413f-82b5-b806746e2b01\") "
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.105067 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh8qh\" (UniqueName: \"kubernetes.io/projected/431329ed-c93d-4d44-bb49-ebed3a083f1a-kube-api-access-hh8qh\") pod \"431329ed-c93d-4d44-bb49-ebed3a083f1a\" (UID: \"431329ed-c93d-4d44-bb49-ebed3a083f1a\") "
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.105089 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/431329ed-c93d-4d44-bb49-ebed3a083f1a-utilities\") pod \"431329ed-c93d-4d44-bb49-ebed3a083f1a\" (UID: \"431329ed-c93d-4d44-bb49-ebed3a083f1a\") "
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.105129 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dp4b\" (UniqueName: \"kubernetes.io/projected/3cef8c60-1bee-4798-9147-4d7f360ae4b2-kube-api-access-9dp4b\") pod \"3cef8c60-1bee-4798-9147-4d7f360ae4b2\" (UID: \"3cef8c60-1bee-4798-9147-4d7f360ae4b2\") "
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.105143 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cef8c60-1bee-4798-9147-4d7f360ae4b2-catalog-content\") pod \"3cef8c60-1bee-4798-9147-4d7f360ae4b2\" (UID: \"3cef8c60-1bee-4798-9147-4d7f360ae4b2\") "
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.105177 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/016cbd62-23a6-413f-82b5-b806746e2b01-marketplace-trusted-ca\") pod \"016cbd62-23a6-413f-82b5-b806746e2b01\" (UID: \"016cbd62-23a6-413f-82b5-b806746e2b01\") "
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.105242 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv422\" (UniqueName: \"kubernetes.io/projected/016cbd62-23a6-413f-82b5-b806746e2b01-kube-api-access-zv422\") pod \"016cbd62-23a6-413f-82b5-b806746e2b01\" (UID: \"016cbd62-23a6-413f-82b5-b806746e2b01\") "
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.105257 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cef8c60-1bee-4798-9147-4d7f360ae4b2-utilities\") pod \"3cef8c60-1bee-4798-9147-4d7f360ae4b2\" (UID: \"3cef8c60-1bee-4798-9147-4d7f360ae4b2\") "
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.107134 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/431329ed-c93d-4d44-bb49-ebed3a083f1a-utilities" (OuterVolumeSpecName: "utilities") pod "431329ed-c93d-4d44-bb49-ebed3a083f1a" (UID: "431329ed-c93d-4d44-bb49-ebed3a083f1a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.107579 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cef8c60-1bee-4798-9147-4d7f360ae4b2-utilities" (OuterVolumeSpecName: "utilities") pod "3cef8c60-1bee-4798-9147-4d7f360ae4b2" (UID: "3cef8c60-1bee-4798-9147-4d7f360ae4b2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.110328 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/431329ed-c93d-4d44-bb49-ebed3a083f1a-kube-api-access-hh8qh" (OuterVolumeSpecName: "kube-api-access-hh8qh") pod "431329ed-c93d-4d44-bb49-ebed3a083f1a" (UID: "431329ed-c93d-4d44-bb49-ebed3a083f1a"). InnerVolumeSpecName "kube-api-access-hh8qh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.110420 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/016cbd62-23a6-413f-82b5-b806746e2b01-kube-api-access-zv422" (OuterVolumeSpecName: "kube-api-access-zv422") pod "016cbd62-23a6-413f-82b5-b806746e2b01" (UID: "016cbd62-23a6-413f-82b5-b806746e2b01"). InnerVolumeSpecName "kube-api-access-zv422". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.110725 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cef8c60-1bee-4798-9147-4d7f360ae4b2-kube-api-access-9dp4b" (OuterVolumeSpecName: "kube-api-access-9dp4b") pod "3cef8c60-1bee-4798-9147-4d7f360ae4b2" (UID: "3cef8c60-1bee-4798-9147-4d7f360ae4b2"). InnerVolumeSpecName "kube-api-access-9dp4b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.115169 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/016cbd62-23a6-413f-82b5-b806746e2b01-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "016cbd62-23a6-413f-82b5-b806746e2b01" (UID: "016cbd62-23a6-413f-82b5-b806746e2b01"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.143509 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/016cbd62-23a6-413f-82b5-b806746e2b01-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "016cbd62-23a6-413f-82b5-b806746e2b01" (UID: "016cbd62-23a6-413f-82b5-b806746e2b01"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.159702 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-x8vmv"]
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.202083 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cef8c60-1bee-4798-9147-4d7f360ae4b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3cef8c60-1bee-4798-9147-4d7f360ae4b2" (UID: "3cef8c60-1bee-4798-9147-4d7f360ae4b2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.208124 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv422\" (UniqueName: \"kubernetes.io/projected/016cbd62-23a6-413f-82b5-b806746e2b01-kube-api-access-zv422\") on node \"crc\" DevicePath \"\""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.208166 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cef8c60-1bee-4798-9147-4d7f360ae4b2-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.208182 4843 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/016cbd62-23a6-413f-82b5-b806746e2b01-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.208197 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh8qh\" (UniqueName: \"kubernetes.io/projected/431329ed-c93d-4d44-bb49-ebed3a083f1a-kube-api-access-hh8qh\") on node \"crc\" DevicePath \"\""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.208208 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/431329ed-c93d-4d44-bb49-ebed3a083f1a-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.208221 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dp4b\" (UniqueName: \"kubernetes.io/projected/3cef8c60-1bee-4798-9147-4d7f360ae4b2-kube-api-access-9dp4b\") on node \"crc\" DevicePath \"\""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.208233 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cef8c60-1bee-4798-9147-4d7f360ae4b2-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.208245 4843 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/016cbd62-23a6-413f-82b5-b806746e2b01-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.234680 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/431329ed-c93d-4d44-bb49-ebed3a083f1a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "431329ed-c93d-4d44-bb49-ebed3a083f1a" (UID: "431329ed-c93d-4d44-bb49-ebed3a083f1a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.306372 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2p2b"
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.310585 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/431329ed-c93d-4d44-bb49-ebed3a083f1a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.334666 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dgr8z"
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.412130 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dc0843b-8f5c-434e-986e-3aab182caad3-catalog-content\") pod \"0dc0843b-8f5c-434e-986e-3aab182caad3\" (UID: \"0dc0843b-8f5c-434e-986e-3aab182caad3\") "
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.412235 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc0843b-8f5c-434e-986e-3aab182caad3-utilities\") pod \"0dc0843b-8f5c-434e-986e-3aab182caad3\" (UID: \"0dc0843b-8f5c-434e-986e-3aab182caad3\") "
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.412396 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt446\" (UniqueName: \"kubernetes.io/projected/0dc0843b-8f5c-434e-986e-3aab182caad3-kube-api-access-bt446\") pod \"0dc0843b-8f5c-434e-986e-3aab182caad3\" (UID: \"0dc0843b-8f5c-434e-986e-3aab182caad3\") "
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.412897 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dc0843b-8f5c-434e-986e-3aab182caad3-utilities" (OuterVolumeSpecName: "utilities") pod "0dc0843b-8f5c-434e-986e-3aab182caad3" (UID: "0dc0843b-8f5c-434e-986e-3aab182caad3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.416949 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dc0843b-8f5c-434e-986e-3aab182caad3-kube-api-access-bt446" (OuterVolumeSpecName: "kube-api-access-bt446") pod "0dc0843b-8f5c-434e-986e-3aab182caad3" (UID: "0dc0843b-8f5c-434e-986e-3aab182caad3"). InnerVolumeSpecName "kube-api-access-bt446". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.453976 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dc0843b-8f5c-434e-986e-3aab182caad3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0dc0843b-8f5c-434e-986e-3aab182caad3" (UID: "0dc0843b-8f5c-434e-986e-3aab182caad3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.514226 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb1dd70-302b-4217-b17c-211aea971073-utilities\") pod \"feb1dd70-302b-4217-b17c-211aea971073\" (UID: \"feb1dd70-302b-4217-b17c-211aea971073\") "
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.514340 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lwgq\" (UniqueName: \"kubernetes.io/projected/feb1dd70-302b-4217-b17c-211aea971073-kube-api-access-4lwgq\") pod \"feb1dd70-302b-4217-b17c-211aea971073\" (UID: \"feb1dd70-302b-4217-b17c-211aea971073\") "
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.514434 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feb1dd70-302b-4217-b17c-211aea971073-catalog-content\") pod \"feb1dd70-302b-4217-b17c-211aea971073\" (UID: \"feb1dd70-302b-4217-b17c-211aea971073\") "
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.514635 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dc0843b-8f5c-434e-986e-3aab182caad3-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.514668 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc0843b-8f5c-434e-986e-3aab182caad3-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.514689 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt446\" (UniqueName: \"kubernetes.io/projected/0dc0843b-8f5c-434e-986e-3aab182caad3-kube-api-access-bt446\") on node \"crc\" DevicePath \"\""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.515424 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feb1dd70-302b-4217-b17c-211aea971073-utilities" (OuterVolumeSpecName: "utilities") pod "feb1dd70-302b-4217-b17c-211aea971073" (UID: "feb1dd70-302b-4217-b17c-211aea971073"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.517736 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb1dd70-302b-4217-b17c-211aea971073-kube-api-access-4lwgq" (OuterVolumeSpecName: "kube-api-access-4lwgq") pod "feb1dd70-302b-4217-b17c-211aea971073" (UID: "feb1dd70-302b-4217-b17c-211aea971073"). InnerVolumeSpecName "kube-api-access-4lwgq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.615917 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb1dd70-302b-4217-b17c-211aea971073-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.615955 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lwgq\" (UniqueName: \"kubernetes.io/projected/feb1dd70-302b-4217-b17c-211aea971073-kube-api-access-4lwgq\") on node \"crc\" DevicePath \"\""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.640552 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feb1dd70-302b-4217-b17c-211aea971073-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "feb1dd70-302b-4217-b17c-211aea971073" (UID: "feb1dd70-302b-4217-b17c-211aea971073"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.673306 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5"
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.673316 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4fqc5" event={"ID":"016cbd62-23a6-413f-82b5-b806746e2b01","Type":"ContainerDied","Data":"f0517b9df50f797f9f3a825136f056837b06342aff2b135302e168d4190d6a72"}
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.673364 4843 scope.go:117] "RemoveContainer" containerID="dcb1aacff14f14eda524bbbce7c735928a7013c6a883c2246818cf8b8b346e99"
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.675435 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfg7x" event={"ID":"431329ed-c93d-4d44-bb49-ebed3a083f1a","Type":"ContainerDied","Data":"f4345fc1eacef1bfb0aa089076a77a76a1704e5cccd4b32880640c99eeab4386"}
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.675482 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rfg7x"
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.678868 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcsxl" event={"ID":"3cef8c60-1bee-4798-9147-4d7f360ae4b2","Type":"ContainerDied","Data":"d956f0fa72ebbed74079ef69b2db5051d1194160057bf61685b5a073ed89b3c9"}
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.678968 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lcsxl"
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.681557 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2p2b" event={"ID":"0dc0843b-8f5c-434e-986e-3aab182caad3","Type":"ContainerDied","Data":"271a34b99b3b195fe59f39c3cc71821b24ccfa1d415a7271277350604398fe6e"}
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.681641 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2p2b"
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.686041 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-x8vmv" event={"ID":"05781490-ae8b-484f-9c0f-93409e2d5850","Type":"ContainerStarted","Data":"09c65708891976611eff85ce57335e45c0439a530413451a1680b8ae794884b3"}
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.686072 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-x8vmv" event={"ID":"05781490-ae8b-484f-9c0f-93409e2d5850","Type":"ContainerStarted","Data":"d260f47688493aefcec13082b49dba613b39ff56757e2e0dbcef4308034640bf"}
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.686859 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-x8vmv"
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.690187 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dgr8z" event={"ID":"feb1dd70-302b-4217-b17c-211aea971073","Type":"ContainerDied","Data":"bf0d10651cf85304026ad33c5deebf391ff15abde3e28ab8bcf5071834b865e7"}
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.690246 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dgr8z"
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.694201 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-x8vmv"
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.695014 4843 scope.go:117] "RemoveContainer" containerID="6bf2ff7dca8731b8759c4bb84b80a1ed2dbc0edcf55da25949f1c59f43095601"
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.712912 4843 scope.go:117] "RemoveContainer" containerID="cf38075d7eeb7d44d64e6cc39d5e4b16a981cc72ae0f7f7ef7ce7875313983e0"
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.717111 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feb1dd70-302b-4217-b17c-211aea971073-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.745320 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-x8vmv" podStartSLOduration=2.74529345 podStartE2EDuration="2.74529345s" podCreationTimestamp="2026-03-18 12:17:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:17:40.720905744 +0000 UTC m=+494.436731378" watchObservedRunningTime="2026-03-18 12:17:40.74529345 +0000 UTC m=+494.461118974"
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.746612 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rfg7x"]
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.756151 4843 scope.go:117] "RemoveContainer" containerID="36322762d4c9361b7defc234d8f0c2ef1e90e49e143b2c1c30da89b0c1937d0b"
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.757634 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rfg7x"]
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.763042 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dgr8z"]
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.768624 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dgr8z"]
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.772961 4843 scope.go:117] "RemoveContainer" containerID="83c7b78c077140dfa9f6d8fa459d5803c06cce60e24cf550f8839531bcd9ec8f"
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.813250 4843 scope.go:117] "RemoveContainer" containerID="7e9620ecd6c684ecdf70955bdeecff16f2b85347b482a7bb2d5abf72cb010d21"
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.813366 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lcsxl"]
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.816780 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lcsxl"]
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.823987 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2p2b"]
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.832737 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2p2b"]
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.833375 4843 scope.go:117] "RemoveContainer" containerID="be74ec7f4cdc3f4dbb14e146f7c29360a43a82bcf7f7ff5f935c417a497c53b2"
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.837105 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4fqc5"]
Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.840294 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4fqc5"]
Mar 18
12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.855997 4843 scope.go:117] "RemoveContainer" containerID="fdb4a1b706c6bd18833808d3da2c60aa762b0b897e35392d8784f873901612b2" Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.878156 4843 scope.go:117] "RemoveContainer" containerID="74926d0a5517c6b81e126a3a3709ef77fac93f750b64a7d3905e48ef93d423f6" Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.904844 4843 scope.go:117] "RemoveContainer" containerID="4697ff225f5b17311de3b67b15466aa056abc008617715defe9df7c22e7fc06c" Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.919744 4843 scope.go:117] "RemoveContainer" containerID="450609aa37892fe84c5d85575e46da946e0bd025d8a1067bcff4d2d7d873ca3d" Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.933507 4843 scope.go:117] "RemoveContainer" containerID="5a541bdeaa5d99cd556607bba9b15250166a71590ce65a087628f274f3f2e576" Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.949958 4843 scope.go:117] "RemoveContainer" containerID="3e36dbb19f5b18461a56a84bb106487e92925feaafbe6b7c685b98a88b1c3d5b" Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.991500 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="016cbd62-23a6-413f-82b5-b806746e2b01" path="/var/lib/kubelet/pods/016cbd62-23a6-413f-82b5-b806746e2b01/volumes" Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.992079 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dc0843b-8f5c-434e-986e-3aab182caad3" path="/var/lib/kubelet/pods/0dc0843b-8f5c-434e-986e-3aab182caad3/volumes" Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.992640 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cef8c60-1bee-4798-9147-4d7f360ae4b2" path="/var/lib/kubelet/pods/3cef8c60-1bee-4798-9147-4d7f360ae4b2/volumes" Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.993783 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="431329ed-c93d-4d44-bb49-ebed3a083f1a" 
path="/var/lib/kubelet/pods/431329ed-c93d-4d44-bb49-ebed3a083f1a/volumes" Mar 18 12:17:40 crc kubenswrapper[4843]: I0318 12:17:40.994392 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feb1dd70-302b-4217-b17c-211aea971073" path="/var/lib/kubelet/pods/feb1dd70-302b-4217-b17c-211aea971073/volumes" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.551605 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xrxxz"] Mar 18 12:17:41 crc kubenswrapper[4843]: E0318 12:17:41.551914 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cef8c60-1bee-4798-9147-4d7f360ae4b2" containerName="extract-utilities" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.551938 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cef8c60-1bee-4798-9147-4d7f360ae4b2" containerName="extract-utilities" Mar 18 12:17:41 crc kubenswrapper[4843]: E0318 12:17:41.551954 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="431329ed-c93d-4d44-bb49-ebed3a083f1a" containerName="extract-content" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.551965 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="431329ed-c93d-4d44-bb49-ebed3a083f1a" containerName="extract-content" Mar 18 12:17:41 crc kubenswrapper[4843]: E0318 12:17:41.551977 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="431329ed-c93d-4d44-bb49-ebed3a083f1a" containerName="registry-server" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.551985 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="431329ed-c93d-4d44-bb49-ebed3a083f1a" containerName="registry-server" Mar 18 12:17:41 crc kubenswrapper[4843]: E0318 12:17:41.551998 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb1dd70-302b-4217-b17c-211aea971073" containerName="extract-content" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.552006 4843 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="feb1dd70-302b-4217-b17c-211aea971073" containerName="extract-content" Mar 18 12:17:41 crc kubenswrapper[4843]: E0318 12:17:41.552015 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc0843b-8f5c-434e-986e-3aab182caad3" containerName="extract-utilities" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.552022 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc0843b-8f5c-434e-986e-3aab182caad3" containerName="extract-utilities" Mar 18 12:17:41 crc kubenswrapper[4843]: E0318 12:17:41.552033 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cef8c60-1bee-4798-9147-4d7f360ae4b2" containerName="registry-server" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.552040 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cef8c60-1bee-4798-9147-4d7f360ae4b2" containerName="registry-server" Mar 18 12:17:41 crc kubenswrapper[4843]: E0318 12:17:41.552047 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="016cbd62-23a6-413f-82b5-b806746e2b01" containerName="marketplace-operator" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.552054 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="016cbd62-23a6-413f-82b5-b806746e2b01" containerName="marketplace-operator" Mar 18 12:17:41 crc kubenswrapper[4843]: E0318 12:17:41.552064 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc0843b-8f5c-434e-986e-3aab182caad3" containerName="extract-content" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.552072 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc0843b-8f5c-434e-986e-3aab182caad3" containerName="extract-content" Mar 18 12:17:41 crc kubenswrapper[4843]: E0318 12:17:41.552083 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb1dd70-302b-4217-b17c-211aea971073" containerName="extract-utilities" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.552090 4843 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="feb1dd70-302b-4217-b17c-211aea971073" containerName="extract-utilities" Mar 18 12:17:41 crc kubenswrapper[4843]: E0318 12:17:41.552101 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb1dd70-302b-4217-b17c-211aea971073" containerName="registry-server" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.552107 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb1dd70-302b-4217-b17c-211aea971073" containerName="registry-server" Mar 18 12:17:41 crc kubenswrapper[4843]: E0318 12:17:41.552116 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="431329ed-c93d-4d44-bb49-ebed3a083f1a" containerName="extract-utilities" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.552123 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="431329ed-c93d-4d44-bb49-ebed3a083f1a" containerName="extract-utilities" Mar 18 12:17:41 crc kubenswrapper[4843]: E0318 12:17:41.552133 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cef8c60-1bee-4798-9147-4d7f360ae4b2" containerName="extract-content" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.552140 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cef8c60-1bee-4798-9147-4d7f360ae4b2" containerName="extract-content" Mar 18 12:17:41 crc kubenswrapper[4843]: E0318 12:17:41.552151 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc0843b-8f5c-434e-986e-3aab182caad3" containerName="registry-server" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.552159 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc0843b-8f5c-434e-986e-3aab182caad3" containerName="registry-server" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.552267 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dc0843b-8f5c-434e-986e-3aab182caad3" containerName="registry-server" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.552285 4843 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="016cbd62-23a6-413f-82b5-b806746e2b01" containerName="marketplace-operator" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.552296 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="431329ed-c93d-4d44-bb49-ebed3a083f1a" containerName="registry-server" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.552308 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cef8c60-1bee-4798-9147-4d7f360ae4b2" containerName="registry-server" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.552319 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb1dd70-302b-4217-b17c-211aea971073" containerName="registry-server" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.553153 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xrxxz" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.555686 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 12:17:41 crc kubenswrapper[4843]: I0318 12:17:41.562986 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xrxxz"] Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.082520 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bded68f0-358f-4215-a313-6f28ef9b506c-utilities\") pod \"community-operators-xrxxz\" (UID: \"bded68f0-358f-4215-a313-6f28ef9b506c\") " pod="openshift-marketplace/community-operators-xrxxz" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.082567 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bded68f0-358f-4215-a313-6f28ef9b506c-catalog-content\") pod \"community-operators-xrxxz\" 
(UID: \"bded68f0-358f-4215-a313-6f28ef9b506c\") " pod="openshift-marketplace/community-operators-xrxxz" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.082637 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drlr4\" (UniqueName: \"kubernetes.io/projected/bded68f0-358f-4215-a313-6f28ef9b506c-kube-api-access-drlr4\") pod \"community-operators-xrxxz\" (UID: \"bded68f0-358f-4215-a313-6f28ef9b506c\") " pod="openshift-marketplace/community-operators-xrxxz" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.183596 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bded68f0-358f-4215-a313-6f28ef9b506c-utilities\") pod \"community-operators-xrxxz\" (UID: \"bded68f0-358f-4215-a313-6f28ef9b506c\") " pod="openshift-marketplace/community-operators-xrxxz" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.183674 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bded68f0-358f-4215-a313-6f28ef9b506c-catalog-content\") pod \"community-operators-xrxxz\" (UID: \"bded68f0-358f-4215-a313-6f28ef9b506c\") " pod="openshift-marketplace/community-operators-xrxxz" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.183952 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drlr4\" (UniqueName: \"kubernetes.io/projected/bded68f0-358f-4215-a313-6f28ef9b506c-kube-api-access-drlr4\") pod \"community-operators-xrxxz\" (UID: \"bded68f0-358f-4215-a313-6f28ef9b506c\") " pod="openshift-marketplace/community-operators-xrxxz" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.184519 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bded68f0-358f-4215-a313-6f28ef9b506c-catalog-content\") pod 
\"community-operators-xrxxz\" (UID: \"bded68f0-358f-4215-a313-6f28ef9b506c\") " pod="openshift-marketplace/community-operators-xrxxz" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.184762 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bded68f0-358f-4215-a313-6f28ef9b506c-utilities\") pod \"community-operators-xrxxz\" (UID: \"bded68f0-358f-4215-a313-6f28ef9b506c\") " pod="openshift-marketplace/community-operators-xrxxz" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.205045 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drlr4\" (UniqueName: \"kubernetes.io/projected/bded68f0-358f-4215-a313-6f28ef9b506c-kube-api-access-drlr4\") pod \"community-operators-xrxxz\" (UID: \"bded68f0-358f-4215-a313-6f28ef9b506c\") " pod="openshift-marketplace/community-operators-xrxxz" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.406350 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xrxxz" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.632421 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xrxxz"] Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.751812 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zmr7t"] Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.753290 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zmr7t" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.757093 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.769838 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmr7t"] Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.790521 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514ca135-154e-44f9-b4e6-de6b600085b7-catalog-content\") pod \"redhat-marketplace-zmr7t\" (UID: \"514ca135-154e-44f9-b4e6-de6b600085b7\") " pod="openshift-marketplace/redhat-marketplace-zmr7t" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.790582 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514ca135-154e-44f9-b4e6-de6b600085b7-utilities\") pod \"redhat-marketplace-zmr7t\" (UID: \"514ca135-154e-44f9-b4e6-de6b600085b7\") " pod="openshift-marketplace/redhat-marketplace-zmr7t" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.790626 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhvpl\" (UniqueName: \"kubernetes.io/projected/514ca135-154e-44f9-b4e6-de6b600085b7-kube-api-access-mhvpl\") pod \"redhat-marketplace-zmr7t\" (UID: \"514ca135-154e-44f9-b4e6-de6b600085b7\") " pod="openshift-marketplace/redhat-marketplace-zmr7t" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.891981 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514ca135-154e-44f9-b4e6-de6b600085b7-catalog-content\") pod \"redhat-marketplace-zmr7t\" (UID: 
\"514ca135-154e-44f9-b4e6-de6b600085b7\") " pod="openshift-marketplace/redhat-marketplace-zmr7t" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.892380 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514ca135-154e-44f9-b4e6-de6b600085b7-utilities\") pod \"redhat-marketplace-zmr7t\" (UID: \"514ca135-154e-44f9-b4e6-de6b600085b7\") " pod="openshift-marketplace/redhat-marketplace-zmr7t" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.892430 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhvpl\" (UniqueName: \"kubernetes.io/projected/514ca135-154e-44f9-b4e6-de6b600085b7-kube-api-access-mhvpl\") pod \"redhat-marketplace-zmr7t\" (UID: \"514ca135-154e-44f9-b4e6-de6b600085b7\") " pod="openshift-marketplace/redhat-marketplace-zmr7t" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.892809 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514ca135-154e-44f9-b4e6-de6b600085b7-catalog-content\") pod \"redhat-marketplace-zmr7t\" (UID: \"514ca135-154e-44f9-b4e6-de6b600085b7\") " pod="openshift-marketplace/redhat-marketplace-zmr7t" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.893339 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514ca135-154e-44f9-b4e6-de6b600085b7-utilities\") pod \"redhat-marketplace-zmr7t\" (UID: \"514ca135-154e-44f9-b4e6-de6b600085b7\") " pod="openshift-marketplace/redhat-marketplace-zmr7t" Mar 18 12:17:42 crc kubenswrapper[4843]: I0318 12:17:42.917587 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhvpl\" (UniqueName: \"kubernetes.io/projected/514ca135-154e-44f9-b4e6-de6b600085b7-kube-api-access-mhvpl\") pod \"redhat-marketplace-zmr7t\" (UID: \"514ca135-154e-44f9-b4e6-de6b600085b7\") " 
pod="openshift-marketplace/redhat-marketplace-zmr7t" Mar 18 12:17:43 crc kubenswrapper[4843]: I0318 12:17:43.078210 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zmr7t" Mar 18 12:17:43 crc kubenswrapper[4843]: I0318 12:17:43.128962 4843 generic.go:334] "Generic (PLEG): container finished" podID="bded68f0-358f-4215-a313-6f28ef9b506c" containerID="15db3baff555d99810b2e083d3d67d9963cb98dac018d09fb6d59859de005680" exitCode=0 Mar 18 12:17:43 crc kubenswrapper[4843]: I0318 12:17:43.129082 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrxxz" event={"ID":"bded68f0-358f-4215-a313-6f28ef9b506c","Type":"ContainerDied","Data":"15db3baff555d99810b2e083d3d67d9963cb98dac018d09fb6d59859de005680"} Mar 18 12:17:43 crc kubenswrapper[4843]: I0318 12:17:43.129123 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrxxz" event={"ID":"bded68f0-358f-4215-a313-6f28ef9b506c","Type":"ContainerStarted","Data":"f1e0d6974295d5267d4410ca622338dd3f91df6bd4314b47673939336cc63a56"} Mar 18 12:17:43 crc kubenswrapper[4843]: I0318 12:17:43.542644 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmr7t"] Mar 18 12:17:43 crc kubenswrapper[4843]: W0318 12:17:43.546109 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod514ca135_154e_44f9_b4e6_de6b600085b7.slice/crio-8c033f8071cb88df25a57aa40990f78dc2551b184d63305f817cb81e6bd2bae0 WatchSource:0}: Error finding container 8c033f8071cb88df25a57aa40990f78dc2551b184d63305f817cb81e6bd2bae0: Status 404 returned error can't find the container with id 8c033f8071cb88df25a57aa40990f78dc2551b184d63305f817cb81e6bd2bae0 Mar 18 12:17:43 crc kubenswrapper[4843]: I0318 12:17:43.761804 4843 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-l4ncn"] Mar 18 12:17:43 crc kubenswrapper[4843]: I0318 12:17:43.763126 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l4ncn" Mar 18 12:17:43 crc kubenswrapper[4843]: I0318 12:17:43.765548 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 12:17:43 crc kubenswrapper[4843]: I0318 12:17:43.783727 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l4ncn"] Mar 18 12:17:43 crc kubenswrapper[4843]: I0318 12:17:43.903603 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00cc77a8-dc0e-44dc-905e-8ed09c5646a3-utilities\") pod \"redhat-operators-l4ncn\" (UID: \"00cc77a8-dc0e-44dc-905e-8ed09c5646a3\") " pod="openshift-marketplace/redhat-operators-l4ncn" Mar 18 12:17:43 crc kubenswrapper[4843]: I0318 12:17:43.903721 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgrbc\" (UniqueName: \"kubernetes.io/projected/00cc77a8-dc0e-44dc-905e-8ed09c5646a3-kube-api-access-lgrbc\") pod \"redhat-operators-l4ncn\" (UID: \"00cc77a8-dc0e-44dc-905e-8ed09c5646a3\") " pod="openshift-marketplace/redhat-operators-l4ncn" Mar 18 12:17:43 crc kubenswrapper[4843]: I0318 12:17:43.903752 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00cc77a8-dc0e-44dc-905e-8ed09c5646a3-catalog-content\") pod \"redhat-operators-l4ncn\" (UID: \"00cc77a8-dc0e-44dc-905e-8ed09c5646a3\") " pod="openshift-marketplace/redhat-operators-l4ncn" Mar 18 12:17:44 crc kubenswrapper[4843]: I0318 12:17:44.004733 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/00cc77a8-dc0e-44dc-905e-8ed09c5646a3-utilities\") pod \"redhat-operators-l4ncn\" (UID: \"00cc77a8-dc0e-44dc-905e-8ed09c5646a3\") " pod="openshift-marketplace/redhat-operators-l4ncn" Mar 18 12:17:44 crc kubenswrapper[4843]: I0318 12:17:44.004817 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgrbc\" (UniqueName: \"kubernetes.io/projected/00cc77a8-dc0e-44dc-905e-8ed09c5646a3-kube-api-access-lgrbc\") pod \"redhat-operators-l4ncn\" (UID: \"00cc77a8-dc0e-44dc-905e-8ed09c5646a3\") " pod="openshift-marketplace/redhat-operators-l4ncn" Mar 18 12:17:44 crc kubenswrapper[4843]: I0318 12:17:44.004848 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00cc77a8-dc0e-44dc-905e-8ed09c5646a3-catalog-content\") pod \"redhat-operators-l4ncn\" (UID: \"00cc77a8-dc0e-44dc-905e-8ed09c5646a3\") " pod="openshift-marketplace/redhat-operators-l4ncn" Mar 18 12:17:44 crc kubenswrapper[4843]: I0318 12:17:44.005443 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00cc77a8-dc0e-44dc-905e-8ed09c5646a3-utilities\") pod \"redhat-operators-l4ncn\" (UID: \"00cc77a8-dc0e-44dc-905e-8ed09c5646a3\") " pod="openshift-marketplace/redhat-operators-l4ncn" Mar 18 12:17:44 crc kubenswrapper[4843]: I0318 12:17:44.005466 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00cc77a8-dc0e-44dc-905e-8ed09c5646a3-catalog-content\") pod \"redhat-operators-l4ncn\" (UID: \"00cc77a8-dc0e-44dc-905e-8ed09c5646a3\") " pod="openshift-marketplace/redhat-operators-l4ncn" Mar 18 12:17:44 crc kubenswrapper[4843]: I0318 12:17:44.028491 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgrbc\" (UniqueName: 
\"kubernetes.io/projected/00cc77a8-dc0e-44dc-905e-8ed09c5646a3-kube-api-access-lgrbc\") pod \"redhat-operators-l4ncn\" (UID: \"00cc77a8-dc0e-44dc-905e-8ed09c5646a3\") " pod="openshift-marketplace/redhat-operators-l4ncn" Mar 18 12:17:44 crc kubenswrapper[4843]: I0318 12:17:44.086720 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l4ncn" Mar 18 12:17:44 crc kubenswrapper[4843]: I0318 12:17:44.149081 4843 generic.go:334] "Generic (PLEG): container finished" podID="514ca135-154e-44f9-b4e6-de6b600085b7" containerID="560fa04d7d522642d1f4ce3df2696a20fb7e26b91431a7d4fcfaa3dfb193aa90" exitCode=0 Mar 18 12:17:44 crc kubenswrapper[4843]: I0318 12:17:44.149130 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmr7t" event={"ID":"514ca135-154e-44f9-b4e6-de6b600085b7","Type":"ContainerDied","Data":"560fa04d7d522642d1f4ce3df2696a20fb7e26b91431a7d4fcfaa3dfb193aa90"} Mar 18 12:17:44 crc kubenswrapper[4843]: I0318 12:17:44.149155 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmr7t" event={"ID":"514ca135-154e-44f9-b4e6-de6b600085b7","Type":"ContainerStarted","Data":"8c033f8071cb88df25a57aa40990f78dc2551b184d63305f817cb81e6bd2bae0"} Mar 18 12:17:44 crc kubenswrapper[4843]: I0318 12:17:44.297100 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l4ncn"] Mar 18 12:17:44 crc kubenswrapper[4843]: W0318 12:17:44.316504 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00cc77a8_dc0e_44dc_905e_8ed09c5646a3.slice/crio-05808c670c966adc9b73be2b1a931811cd9a21fb97bc1b4ead3fa927573af786 WatchSource:0}: Error finding container 05808c670c966adc9b73be2b1a931811cd9a21fb97bc1b4ead3fa927573af786: Status 404 returned error can't find the container with id 
05808c670c966adc9b73be2b1a931811cd9a21fb97bc1b4ead3fa927573af786 Mar 18 12:17:44 crc kubenswrapper[4843]: I0318 12:17:44.961200 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-927pm"] Mar 18 12:17:44 crc kubenswrapper[4843]: I0318 12:17:44.963244 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-927pm" Mar 18 12:17:44 crc kubenswrapper[4843]: I0318 12:17:44.967047 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 12:17:44 crc kubenswrapper[4843]: I0318 12:17:44.976293 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-927pm"] Mar 18 12:17:45 crc kubenswrapper[4843]: I0318 12:17:45.151222 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65efad8c-4b82-445e-b2a6-25c6035182da-catalog-content\") pod \"certified-operators-927pm\" (UID: \"65efad8c-4b82-445e-b2a6-25c6035182da\") " pod="openshift-marketplace/certified-operators-927pm" Mar 18 12:17:45 crc kubenswrapper[4843]: I0318 12:17:45.151309 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mnpq\" (UniqueName: \"kubernetes.io/projected/65efad8c-4b82-445e-b2a6-25c6035182da-kube-api-access-8mnpq\") pod \"certified-operators-927pm\" (UID: \"65efad8c-4b82-445e-b2a6-25c6035182da\") " pod="openshift-marketplace/certified-operators-927pm" Mar 18 12:17:45 crc kubenswrapper[4843]: I0318 12:17:45.151349 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65efad8c-4b82-445e-b2a6-25c6035182da-utilities\") pod \"certified-operators-927pm\" (UID: \"65efad8c-4b82-445e-b2a6-25c6035182da\") " 
pod="openshift-marketplace/certified-operators-927pm" Mar 18 12:17:45 crc kubenswrapper[4843]: I0318 12:17:45.160915 4843 generic.go:334] "Generic (PLEG): container finished" podID="00cc77a8-dc0e-44dc-905e-8ed09c5646a3" containerID="776b6f71b69cb5648a97dece4f5637de8c4481692d09361354583ace9954e0e7" exitCode=0 Mar 18 12:17:45 crc kubenswrapper[4843]: I0318 12:17:45.161123 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4ncn" event={"ID":"00cc77a8-dc0e-44dc-905e-8ed09c5646a3","Type":"ContainerDied","Data":"776b6f71b69cb5648a97dece4f5637de8c4481692d09361354583ace9954e0e7"} Mar 18 12:17:45 crc kubenswrapper[4843]: I0318 12:17:45.161171 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4ncn" event={"ID":"00cc77a8-dc0e-44dc-905e-8ed09c5646a3","Type":"ContainerStarted","Data":"05808c670c966adc9b73be2b1a931811cd9a21fb97bc1b4ead3fa927573af786"} Mar 18 12:17:45 crc kubenswrapper[4843]: I0318 12:17:45.164370 4843 generic.go:334] "Generic (PLEG): container finished" podID="bded68f0-358f-4215-a313-6f28ef9b506c" containerID="4b0e3659c121ec30e07af4656a9f3f0ca7fd0845faf08b1d3239f856c1a22cff" exitCode=0 Mar 18 12:17:45 crc kubenswrapper[4843]: I0318 12:17:45.164395 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrxxz" event={"ID":"bded68f0-358f-4215-a313-6f28ef9b506c","Type":"ContainerDied","Data":"4b0e3659c121ec30e07af4656a9f3f0ca7fd0845faf08b1d3239f856c1a22cff"} Mar 18 12:17:45 crc kubenswrapper[4843]: I0318 12:17:45.252984 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65efad8c-4b82-445e-b2a6-25c6035182da-catalog-content\") pod \"certified-operators-927pm\" (UID: \"65efad8c-4b82-445e-b2a6-25c6035182da\") " pod="openshift-marketplace/certified-operators-927pm" Mar 18 12:17:45 crc kubenswrapper[4843]: I0318 12:17:45.253072 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mnpq\" (UniqueName: \"kubernetes.io/projected/65efad8c-4b82-445e-b2a6-25c6035182da-kube-api-access-8mnpq\") pod \"certified-operators-927pm\" (UID: \"65efad8c-4b82-445e-b2a6-25c6035182da\") " pod="openshift-marketplace/certified-operators-927pm" Mar 18 12:17:45 crc kubenswrapper[4843]: I0318 12:17:45.253090 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65efad8c-4b82-445e-b2a6-25c6035182da-utilities\") pod \"certified-operators-927pm\" (UID: \"65efad8c-4b82-445e-b2a6-25c6035182da\") " pod="openshift-marketplace/certified-operators-927pm" Mar 18 12:17:45 crc kubenswrapper[4843]: I0318 12:17:45.253941 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65efad8c-4b82-445e-b2a6-25c6035182da-catalog-content\") pod \"certified-operators-927pm\" (UID: \"65efad8c-4b82-445e-b2a6-25c6035182da\") " pod="openshift-marketplace/certified-operators-927pm" Mar 18 12:17:45 crc kubenswrapper[4843]: I0318 12:17:45.254752 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65efad8c-4b82-445e-b2a6-25c6035182da-utilities\") pod \"certified-operators-927pm\" (UID: \"65efad8c-4b82-445e-b2a6-25c6035182da\") " pod="openshift-marketplace/certified-operators-927pm" Mar 18 12:17:45 crc kubenswrapper[4843]: I0318 12:17:45.281238 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mnpq\" (UniqueName: \"kubernetes.io/projected/65efad8c-4b82-445e-b2a6-25c6035182da-kube-api-access-8mnpq\") pod \"certified-operators-927pm\" (UID: \"65efad8c-4b82-445e-b2a6-25c6035182da\") " pod="openshift-marketplace/certified-operators-927pm" Mar 18 12:17:45 crc kubenswrapper[4843]: I0318 12:17:45.333610 4843 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-927pm" Mar 18 12:17:45 crc kubenswrapper[4843]: I0318 12:17:45.537204 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-927pm"] Mar 18 12:17:45 crc kubenswrapper[4843]: I0318 12:17:45.900219 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6w9n2"] Mar 18 12:17:45 crc kubenswrapper[4843]: I0318 12:17:45.901597 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:45 crc kubenswrapper[4843]: I0318 12:17:45.913951 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6w9n2"] Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.063207 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6698fd5-08f6-4f44-b075-313bbc44ea38-trusted-ca\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.063553 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bth6d\" (UniqueName: \"kubernetes.io/projected/f6698fd5-08f6-4f44-b075-313bbc44ea38-kube-api-access-bth6d\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.063745 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6698fd5-08f6-4f44-b075-313bbc44ea38-registry-tls\") pod \"image-registry-66df7c8f76-6w9n2\" 
(UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.063779 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6698fd5-08f6-4f44-b075-313bbc44ea38-bound-sa-token\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.063923 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.063971 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f6698fd5-08f6-4f44-b075-313bbc44ea38-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.064025 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f6698fd5-08f6-4f44-b075-313bbc44ea38-registry-certificates\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.064101 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f6698fd5-08f6-4f44-b075-313bbc44ea38-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.090943 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.164931 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6698fd5-08f6-4f44-b075-313bbc44ea38-bound-sa-token\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.165008 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6698fd5-08f6-4f44-b075-313bbc44ea38-registry-tls\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.165050 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f6698fd5-08f6-4f44-b075-313bbc44ea38-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.165089 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f6698fd5-08f6-4f44-b075-313bbc44ea38-registry-certificates\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.165112 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f6698fd5-08f6-4f44-b075-313bbc44ea38-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.165163 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6698fd5-08f6-4f44-b075-313bbc44ea38-trusted-ca\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.165177 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bth6d\" (UniqueName: \"kubernetes.io/projected/f6698fd5-08f6-4f44-b075-313bbc44ea38-kube-api-access-bth6d\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.165534 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f6698fd5-08f6-4f44-b075-313bbc44ea38-ca-trust-extracted\") 
pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.166927 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6698fd5-08f6-4f44-b075-313bbc44ea38-trusted-ca\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.167284 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f6698fd5-08f6-4f44-b075-313bbc44ea38-registry-certificates\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.172587 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6698fd5-08f6-4f44-b075-313bbc44ea38-registry-tls\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.173683 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f6698fd5-08f6-4f44-b075-313bbc44ea38-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.182142 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrxxz" 
event={"ID":"bded68f0-358f-4215-a313-6f28ef9b506c","Type":"ContainerStarted","Data":"3f4156208cbbdd4f80f6f5e8d85d16a7f10ade4a70d6de5382c4815c7f2cb7d2"} Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.196770 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bth6d\" (UniqueName: \"kubernetes.io/projected/f6698fd5-08f6-4f44-b075-313bbc44ea38-kube-api-access-bth6d\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.197410 4843 generic.go:334] "Generic (PLEG): container finished" podID="514ca135-154e-44f9-b4e6-de6b600085b7" containerID="e499738888f51944747ae48e3ee25d6bd95baaf3e0bbdc7ce876692b3aea5a71" exitCode=0 Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.197496 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmr7t" event={"ID":"514ca135-154e-44f9-b4e6-de6b600085b7","Type":"ContainerDied","Data":"e499738888f51944747ae48e3ee25d6bd95baaf3e0bbdc7ce876692b3aea5a71"} Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.198001 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6698fd5-08f6-4f44-b075-313bbc44ea38-bound-sa-token\") pod \"image-registry-66df7c8f76-6w9n2\" (UID: \"f6698fd5-08f6-4f44-b075-313bbc44ea38\") " pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.205063 4843 generic.go:334] "Generic (PLEG): container finished" podID="65efad8c-4b82-445e-b2a6-25c6035182da" containerID="68aa0510d56a8acf7a29d61794ab6d3dfdd9e9591cf91ce090f75da771590b58" exitCode=0 Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.205100 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-927pm" 
event={"ID":"65efad8c-4b82-445e-b2a6-25c6035182da","Type":"ContainerDied","Data":"68aa0510d56a8acf7a29d61794ab6d3dfdd9e9591cf91ce090f75da771590b58"} Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.205126 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-927pm" event={"ID":"65efad8c-4b82-445e-b2a6-25c6035182da","Type":"ContainerStarted","Data":"2af87dd973cf7463e3680a3e292d847871d5c6cea05c978b1a491d51dba7fefe"} Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.209975 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xrxxz" podStartSLOduration=2.7365913600000003 podStartE2EDuration="5.209950538s" podCreationTimestamp="2026-03-18 12:17:41 +0000 UTC" firstStartedPulling="2026-03-18 12:17:43.131204486 +0000 UTC m=+496.847030030" lastFinishedPulling="2026-03-18 12:17:45.604563694 +0000 UTC m=+499.320389208" observedRunningTime="2026-03-18 12:17:46.204746542 +0000 UTC m=+499.920572066" watchObservedRunningTime="2026-03-18 12:17:46.209950538 +0000 UTC m=+499.925776072" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.224571 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:46 crc kubenswrapper[4843]: I0318 12:17:46.475056 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6w9n2"] Mar 18 12:17:47 crc kubenswrapper[4843]: I0318 12:17:47.225235 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmr7t" event={"ID":"514ca135-154e-44f9-b4e6-de6b600085b7","Type":"ContainerStarted","Data":"2a299d48619968f450191ef837a33a5ae2157f47f8aff0f89a786086ba7504f6"} Mar 18 12:17:47 crc kubenswrapper[4843]: I0318 12:17:47.227070 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" event={"ID":"f6698fd5-08f6-4f44-b075-313bbc44ea38","Type":"ContainerStarted","Data":"466ab50e1d0657b4f893d6198a3faf67ab1079cde65276afc8736e5bf82cc00e"} Mar 18 12:17:47 crc kubenswrapper[4843]: I0318 12:17:47.227165 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" Mar 18 12:17:47 crc kubenswrapper[4843]: I0318 12:17:47.227181 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" event={"ID":"f6698fd5-08f6-4f44-b075-313bbc44ea38","Type":"ContainerStarted","Data":"0dd9ec02862379aa45aa71bc456162d3be7c346d06be4d4d5fe3ee7c8e888cad"} Mar 18 12:17:47 crc kubenswrapper[4843]: I0318 12:17:47.228750 4843 generic.go:334] "Generic (PLEG): container finished" podID="00cc77a8-dc0e-44dc-905e-8ed09c5646a3" containerID="a5958df25953f06975f46d6cc3ac0d76cebe2f462dce987266ab60f3547a14d3" exitCode=0 Mar 18 12:17:47 crc kubenswrapper[4843]: I0318 12:17:47.228776 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4ncn" 
event={"ID":"00cc77a8-dc0e-44dc-905e-8ed09c5646a3","Type":"ContainerDied","Data":"a5958df25953f06975f46d6cc3ac0d76cebe2f462dce987266ab60f3547a14d3"} Mar 18 12:17:47 crc kubenswrapper[4843]: I0318 12:17:47.273307 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zmr7t" podStartSLOduration=2.44407774 podStartE2EDuration="5.27327665s" podCreationTimestamp="2026-03-18 12:17:42 +0000 UTC" firstStartedPulling="2026-03-18 12:17:44.182499999 +0000 UTC m=+497.898325523" lastFinishedPulling="2026-03-18 12:17:47.011698909 +0000 UTC m=+500.727524433" observedRunningTime="2026-03-18 12:17:47.248916864 +0000 UTC m=+500.964742408" watchObservedRunningTime="2026-03-18 12:17:47.27327665 +0000 UTC m=+500.989102194" Mar 18 12:17:47 crc kubenswrapper[4843]: I0318 12:17:47.276116 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2" podStartSLOduration=2.276100669 podStartE2EDuration="2.276100669s" podCreationTimestamp="2026-03-18 12:17:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:17:47.268227877 +0000 UTC m=+500.984053411" watchObservedRunningTime="2026-03-18 12:17:47.276100669 +0000 UTC m=+500.991926213" Mar 18 12:17:48 crc kubenswrapper[4843]: I0318 12:17:48.237882 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4ncn" event={"ID":"00cc77a8-dc0e-44dc-905e-8ed09c5646a3","Type":"ContainerStarted","Data":"27fb07d6a114acdb9078658619d9fa63a6cf3d137cde90fe191bc65098abc12a"} Mar 18 12:17:48 crc kubenswrapper[4843]: I0318 12:17:48.241425 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-927pm" 
event={"ID":"65efad8c-4b82-445e-b2a6-25c6035182da","Type":"ContainerStarted","Data":"8943ceb8dcab0753a3f4240df04e44becbf978380da1bc0cb3779a17d65cd7fd"} Mar 18 12:17:48 crc kubenswrapper[4843]: I0318 12:17:48.263034 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l4ncn" podStartSLOduration=2.6741212819999998 podStartE2EDuration="5.26300339s" podCreationTimestamp="2026-03-18 12:17:43 +0000 UTC" firstStartedPulling="2026-03-18 12:17:45.164585704 +0000 UTC m=+498.880411228" lastFinishedPulling="2026-03-18 12:17:47.753467812 +0000 UTC m=+501.469293336" observedRunningTime="2026-03-18 12:17:48.25699506 +0000 UTC m=+501.972820584" watchObservedRunningTime="2026-03-18 12:17:48.26300339 +0000 UTC m=+501.978828914" Mar 18 12:17:49 crc kubenswrapper[4843]: I0318 12:17:49.251811 4843 generic.go:334] "Generic (PLEG): container finished" podID="65efad8c-4b82-445e-b2a6-25c6035182da" containerID="8943ceb8dcab0753a3f4240df04e44becbf978380da1bc0cb3779a17d65cd7fd" exitCode=0 Mar 18 12:17:49 crc kubenswrapper[4843]: I0318 12:17:49.253021 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-927pm" event={"ID":"65efad8c-4b82-445e-b2a6-25c6035182da","Type":"ContainerDied","Data":"8943ceb8dcab0753a3f4240df04e44becbf978380da1bc0cb3779a17d65cd7fd"} Mar 18 12:17:50 crc kubenswrapper[4843]: I0318 12:17:50.035595 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:17:50 crc kubenswrapper[4843]: I0318 12:17:50.035733 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:17:50 crc kubenswrapper[4843]: I0318 12:17:50.261401 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-927pm" event={"ID":"65efad8c-4b82-445e-b2a6-25c6035182da","Type":"ContainerStarted","Data":"fada6418dab5f17fa6ed513d0ad933833c5127c4f120481ee924cf83ccbdf9d4"} Mar 18 12:17:50 crc kubenswrapper[4843]: I0318 12:17:50.284085 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-927pm" podStartSLOduration=2.433074938 podStartE2EDuration="6.28406916s" podCreationTimestamp="2026-03-18 12:17:44 +0000 UTC" firstStartedPulling="2026-03-18 12:17:46.208436656 +0000 UTC m=+499.924262190" lastFinishedPulling="2026-03-18 12:17:50.059430858 +0000 UTC m=+503.775256412" observedRunningTime="2026-03-18 12:17:50.280278813 +0000 UTC m=+503.996104337" watchObservedRunningTime="2026-03-18 12:17:50.28406916 +0000 UTC m=+503.999894684" Mar 18 12:17:52 crc kubenswrapper[4843]: I0318 12:17:52.407341 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xrxxz" Mar 18 12:17:52 crc kubenswrapper[4843]: I0318 12:17:52.408146 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xrxxz" Mar 18 12:17:52 crc kubenswrapper[4843]: I0318 12:17:52.447042 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xrxxz" Mar 18 12:17:53 crc kubenswrapper[4843]: I0318 12:17:53.079354 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zmr7t" Mar 18 12:17:53 crc kubenswrapper[4843]: I0318 12:17:53.079723 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-zmr7t" Mar 18 12:17:53 crc kubenswrapper[4843]: I0318 12:17:53.121306 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zmr7t" Mar 18 12:17:53 crc kubenswrapper[4843]: I0318 12:17:53.313537 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zmr7t" Mar 18 12:17:53 crc kubenswrapper[4843]: I0318 12:17:53.313596 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xrxxz" Mar 18 12:17:54 crc kubenswrapper[4843]: I0318 12:17:54.087081 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l4ncn" Mar 18 12:17:54 crc kubenswrapper[4843]: I0318 12:17:54.087303 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l4ncn" Mar 18 12:17:54 crc kubenswrapper[4843]: I0318 12:17:54.123091 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l4ncn" Mar 18 12:17:54 crc kubenswrapper[4843]: I0318 12:17:54.317903 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l4ncn" Mar 18 12:17:55 crc kubenswrapper[4843]: I0318 12:17:55.335118 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-927pm" Mar 18 12:17:55 crc kubenswrapper[4843]: I0318 12:17:55.335800 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-927pm" Mar 18 12:17:55 crc kubenswrapper[4843]: I0318 12:17:55.377779 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-927pm" Mar 18 12:17:56 crc kubenswrapper[4843]: I0318 
12:17:56.362326 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-927pm" Mar 18 12:18:00 crc kubenswrapper[4843]: I0318 12:18:00.176532 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563938-xd75z"] Mar 18 12:18:00 crc kubenswrapper[4843]: I0318 12:18:00.179353 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563938-xd75z" Mar 18 12:18:00 crc kubenswrapper[4843]: I0318 12:18:00.183266 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 12:18:00 crc kubenswrapper[4843]: I0318 12:18:00.183874 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:18:00 crc kubenswrapper[4843]: I0318 12:18:00.189694 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563938-xd75z"] Mar 18 12:18:00 crc kubenswrapper[4843]: I0318 12:18:00.190501 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:18:00 crc kubenswrapper[4843]: I0318 12:18:00.277960 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkr8f\" (UniqueName: \"kubernetes.io/projected/f1fc99fd-7d70-4257-a31c-ed61653da64e-kube-api-access-zkr8f\") pod \"auto-csr-approver-29563938-xd75z\" (UID: \"f1fc99fd-7d70-4257-a31c-ed61653da64e\") " pod="openshift-infra/auto-csr-approver-29563938-xd75z" Mar 18 12:18:00 crc kubenswrapper[4843]: I0318 12:18:00.379365 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkr8f\" (UniqueName: \"kubernetes.io/projected/f1fc99fd-7d70-4257-a31c-ed61653da64e-kube-api-access-zkr8f\") pod \"auto-csr-approver-29563938-xd75z\" (UID: 
\"f1fc99fd-7d70-4257-a31c-ed61653da64e\") " pod="openshift-infra/auto-csr-approver-29563938-xd75z" Mar 18 12:18:00 crc kubenswrapper[4843]: I0318 12:18:00.409127 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkr8f\" (UniqueName: \"kubernetes.io/projected/f1fc99fd-7d70-4257-a31c-ed61653da64e-kube-api-access-zkr8f\") pod \"auto-csr-approver-29563938-xd75z\" (UID: \"f1fc99fd-7d70-4257-a31c-ed61653da64e\") " pod="openshift-infra/auto-csr-approver-29563938-xd75z" Mar 18 12:18:00 crc kubenswrapper[4843]: I0318 12:18:00.495644 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563938-xd75z" Mar 18 12:18:00 crc kubenswrapper[4843]: I0318 12:18:00.903933 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563938-xd75z"] Mar 18 12:18:00 crc kubenswrapper[4843]: W0318 12:18:00.913046 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1fc99fd_7d70_4257_a31c_ed61653da64e.slice/crio-5caa8efc862a64e1d91ba8c32e38af8a723c1961aa95a2f19d942489d061f337 WatchSource:0}: Error finding container 5caa8efc862a64e1d91ba8c32e38af8a723c1961aa95a2f19d942489d061f337: Status 404 returned error can't find the container with id 5caa8efc862a64e1d91ba8c32e38af8a723c1961aa95a2f19d942489d061f337 Mar 18 12:18:01 crc kubenswrapper[4843]: I0318 12:18:01.388326 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563938-xd75z" event={"ID":"f1fc99fd-7d70-4257-a31c-ed61653da64e","Type":"ContainerStarted","Data":"5caa8efc862a64e1d91ba8c32e38af8a723c1961aa95a2f19d942489d061f337"} Mar 18 12:18:03 crc kubenswrapper[4843]: I0318 12:18:03.401040 4843 generic.go:334] "Generic (PLEG): container finished" podID="f1fc99fd-7d70-4257-a31c-ed61653da64e" containerID="58597e6eb6a7a09eb41b37c2812d53fe1f1358c6858cdb2f6cc70f038ff2d61f" exitCode=0 
Mar 18 12:18:03 crc kubenswrapper[4843]: I0318 12:18:03.401099 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563938-xd75z" event={"ID":"f1fc99fd-7d70-4257-a31c-ed61653da64e","Type":"ContainerDied","Data":"58597e6eb6a7a09eb41b37c2812d53fe1f1358c6858cdb2f6cc70f038ff2d61f"}
Mar 18 12:18:04 crc kubenswrapper[4843]: I0318 12:18:04.620943 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563938-xd75z"
Mar 18 12:18:04 crc kubenswrapper[4843]: I0318 12:18:04.745859 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkr8f\" (UniqueName: \"kubernetes.io/projected/f1fc99fd-7d70-4257-a31c-ed61653da64e-kube-api-access-zkr8f\") pod \"f1fc99fd-7d70-4257-a31c-ed61653da64e\" (UID: \"f1fc99fd-7d70-4257-a31c-ed61653da64e\") "
Mar 18 12:18:04 crc kubenswrapper[4843]: I0318 12:18:04.752009 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1fc99fd-7d70-4257-a31c-ed61653da64e-kube-api-access-zkr8f" (OuterVolumeSpecName: "kube-api-access-zkr8f") pod "f1fc99fd-7d70-4257-a31c-ed61653da64e" (UID: "f1fc99fd-7d70-4257-a31c-ed61653da64e"). InnerVolumeSpecName "kube-api-access-zkr8f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:18:04 crc kubenswrapper[4843]: I0318 12:18:04.848112 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkr8f\" (UniqueName: \"kubernetes.io/projected/f1fc99fd-7d70-4257-a31c-ed61653da64e-kube-api-access-zkr8f\") on node \"crc\" DevicePath \"\""
Mar 18 12:18:05 crc kubenswrapper[4843]: I0318 12:18:05.413325 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563938-xd75z" event={"ID":"f1fc99fd-7d70-4257-a31c-ed61653da64e","Type":"ContainerDied","Data":"5caa8efc862a64e1d91ba8c32e38af8a723c1961aa95a2f19d942489d061f337"}
Mar 18 12:18:05 crc kubenswrapper[4843]: I0318 12:18:05.413369 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5caa8efc862a64e1d91ba8c32e38af8a723c1961aa95a2f19d942489d061f337"
Mar 18 12:18:05 crc kubenswrapper[4843]: I0318 12:18:05.413411 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563938-xd75z"
Mar 18 12:18:05 crc kubenswrapper[4843]: I0318 12:18:05.690638 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563932-bzf8p"]
Mar 18 12:18:05 crc kubenswrapper[4843]: I0318 12:18:05.693952 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563932-bzf8p"]
Mar 18 12:18:06 crc kubenswrapper[4843]: I0318 12:18:06.231788 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-6w9n2"
Mar 18 12:18:06 crc kubenswrapper[4843]: I0318 12:18:06.293959 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9g67x"]
Mar 18 12:18:07 crc kubenswrapper[4843]: I0318 12:18:07.074331 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="803876de-64f6-4347-8ea5-6d2d8f87e828" path="/var/lib/kubelet/pods/803876de-64f6-4347-8ea5-6d2d8f87e828/volumes"
Mar 18 12:18:20 crc kubenswrapper[4843]: I0318 12:18:20.035729 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:18:20 crc kubenswrapper[4843]: I0318 12:18:20.036729 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:18:20 crc kubenswrapper[4843]: I0318 12:18:20.036821 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq"
Mar 18 12:18:20 crc kubenswrapper[4843]: I0318 12:18:20.037840 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"935c5c8bac02b5ebb6013b7620e469aeb3c950d7626ae59a60e0b050f1d51353"} pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 12:18:20 crc kubenswrapper[4843]: I0318 12:18:20.037963 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" containerID="cri-o://935c5c8bac02b5ebb6013b7620e469aeb3c950d7626ae59a60e0b050f1d51353" gracePeriod=600
Mar 18 12:18:20 crc kubenswrapper[4843]: I0318 12:18:20.231755 4843 generic.go:334] "Generic (PLEG): container finished" podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="935c5c8bac02b5ebb6013b7620e469aeb3c950d7626ae59a60e0b050f1d51353" exitCode=0
Mar 18 12:18:20 crc kubenswrapper[4843]: I0318 12:18:20.231870 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"935c5c8bac02b5ebb6013b7620e469aeb3c950d7626ae59a60e0b050f1d51353"}
Mar 18 12:18:20 crc kubenswrapper[4843]: I0318 12:18:20.232070 4843 scope.go:117] "RemoveContainer" containerID="6c3ea9d631549bb3436c1ed7e1cf4059b5a1c3df70830e0f112a5a3efc115651"
Mar 18 12:18:21 crc kubenswrapper[4843]: I0318 12:18:21.239521 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"5c3caf7d8c6845049235506b69806f5942daca0107c049b31b6ec76e46fc65dc"}
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.339142 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" podUID="13295157-4a57-4e7f-9ff4-13c2a4381c27" containerName="registry" containerID="cri-o://40553dc05de5155b32f1542c3e39c5a553f8008987939eabcf80cbf72ad9c002" gracePeriod=30
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.689334 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.849832 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"13295157-4a57-4e7f-9ff4-13c2a4381c27\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") "
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.849893 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxbf7\" (UniqueName: \"kubernetes.io/projected/13295157-4a57-4e7f-9ff4-13c2a4381c27-kube-api-access-mxbf7\") pod \"13295157-4a57-4e7f-9ff4-13c2a4381c27\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") "
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.849929 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/13295157-4a57-4e7f-9ff4-13c2a4381c27-registry-certificates\") pod \"13295157-4a57-4e7f-9ff4-13c2a4381c27\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") "
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.849977 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13295157-4a57-4e7f-9ff4-13c2a4381c27-trusted-ca\") pod \"13295157-4a57-4e7f-9ff4-13c2a4381c27\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") "
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.850012 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/13295157-4a57-4e7f-9ff4-13c2a4381c27-bound-sa-token\") pod \"13295157-4a57-4e7f-9ff4-13c2a4381c27\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") "
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.850098 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/13295157-4a57-4e7f-9ff4-13c2a4381c27-registry-tls\") pod \"13295157-4a57-4e7f-9ff4-13c2a4381c27\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") "
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.850131 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/13295157-4a57-4e7f-9ff4-13c2a4381c27-installation-pull-secrets\") pod \"13295157-4a57-4e7f-9ff4-13c2a4381c27\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") "
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.850158 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/13295157-4a57-4e7f-9ff4-13c2a4381c27-ca-trust-extracted\") pod \"13295157-4a57-4e7f-9ff4-13c2a4381c27\" (UID: \"13295157-4a57-4e7f-9ff4-13c2a4381c27\") "
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.851895 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13295157-4a57-4e7f-9ff4-13c2a4381c27-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "13295157-4a57-4e7f-9ff4-13c2a4381c27" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.852135 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13295157-4a57-4e7f-9ff4-13c2a4381c27-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "13295157-4a57-4e7f-9ff4-13c2a4381c27" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.857287 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13295157-4a57-4e7f-9ff4-13c2a4381c27-kube-api-access-mxbf7" (OuterVolumeSpecName: "kube-api-access-mxbf7") pod "13295157-4a57-4e7f-9ff4-13c2a4381c27" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27"). InnerVolumeSpecName "kube-api-access-mxbf7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.857633 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13295157-4a57-4e7f-9ff4-13c2a4381c27-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "13295157-4a57-4e7f-9ff4-13c2a4381c27" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.858091 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13295157-4a57-4e7f-9ff4-13c2a4381c27-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "13295157-4a57-4e7f-9ff4-13c2a4381c27" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.860257 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13295157-4a57-4e7f-9ff4-13c2a4381c27-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "13295157-4a57-4e7f-9ff4-13c2a4381c27" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.866223 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "13295157-4a57-4e7f-9ff4-13c2a4381c27" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.876688 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13295157-4a57-4e7f-9ff4-13c2a4381c27-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "13295157-4a57-4e7f-9ff4-13c2a4381c27" (UID: "13295157-4a57-4e7f-9ff4-13c2a4381c27"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.952726 4843 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/13295157-4a57-4e7f-9ff4-13c2a4381c27-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.952797 4843 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/13295157-4a57-4e7f-9ff4-13c2a4381c27-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.952826 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxbf7\" (UniqueName: \"kubernetes.io/projected/13295157-4a57-4e7f-9ff4-13c2a4381c27-kube-api-access-mxbf7\") on node \"crc\" DevicePath \"\""
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.952848 4843 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/13295157-4a57-4e7f-9ff4-13c2a4381c27-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.952874 4843 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13295157-4a57-4e7f-9ff4-13c2a4381c27-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.952900 4843 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/13295157-4a57-4e7f-9ff4-13c2a4381c27-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 18 12:18:31 crc kubenswrapper[4843]: I0318 12:18:31.952922 4843 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/13295157-4a57-4e7f-9ff4-13c2a4381c27-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 18 12:18:32 crc kubenswrapper[4843]: I0318 12:18:32.297408 4843 generic.go:334] "Generic (PLEG): container finished" podID="13295157-4a57-4e7f-9ff4-13c2a4381c27" containerID="40553dc05de5155b32f1542c3e39c5a553f8008987939eabcf80cbf72ad9c002" exitCode=0
Mar 18 12:18:32 crc kubenswrapper[4843]: I0318 12:18:32.297457 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" event={"ID":"13295157-4a57-4e7f-9ff4-13c2a4381c27","Type":"ContainerDied","Data":"40553dc05de5155b32f1542c3e39c5a553f8008987939eabcf80cbf72ad9c002"}
Mar 18 12:18:32 crc kubenswrapper[4843]: I0318 12:18:32.297467 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9g67x"
Mar 18 12:18:32 crc kubenswrapper[4843]: I0318 12:18:32.297485 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9g67x" event={"ID":"13295157-4a57-4e7f-9ff4-13c2a4381c27","Type":"ContainerDied","Data":"941d3851100d62a06548ff236679ceefd6db270e5856456a99912e60241302c1"}
Mar 18 12:18:32 crc kubenswrapper[4843]: I0318 12:18:32.297505 4843 scope.go:117] "RemoveContainer" containerID="40553dc05de5155b32f1542c3e39c5a553f8008987939eabcf80cbf72ad9c002"
Mar 18 12:18:32 crc kubenswrapper[4843]: I0318 12:18:32.326371 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9g67x"]
Mar 18 12:18:32 crc kubenswrapper[4843]: I0318 12:18:32.329313 4843 scope.go:117] "RemoveContainer" containerID="40553dc05de5155b32f1542c3e39c5a553f8008987939eabcf80cbf72ad9c002"
Mar 18 12:18:32 crc kubenswrapper[4843]: E0318 12:18:32.330133 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40553dc05de5155b32f1542c3e39c5a553f8008987939eabcf80cbf72ad9c002\": container with ID starting with 40553dc05de5155b32f1542c3e39c5a553f8008987939eabcf80cbf72ad9c002 not found: ID does not exist" containerID="40553dc05de5155b32f1542c3e39c5a553f8008987939eabcf80cbf72ad9c002"
Mar 18 12:18:32 crc kubenswrapper[4843]: I0318 12:18:32.330266 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40553dc05de5155b32f1542c3e39c5a553f8008987939eabcf80cbf72ad9c002"} err="failed to get container status \"40553dc05de5155b32f1542c3e39c5a553f8008987939eabcf80cbf72ad9c002\": rpc error: code = NotFound desc = could not find container \"40553dc05de5155b32f1542c3e39c5a553f8008987939eabcf80cbf72ad9c002\": container with ID starting with 40553dc05de5155b32f1542c3e39c5a553f8008987939eabcf80cbf72ad9c002 not found: ID does not exist"
Mar 18 12:18:32 crc kubenswrapper[4843]: I0318 12:18:32.330780 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9g67x"]
Mar 18 12:18:32 crc kubenswrapper[4843]: I0318 12:18:32.990596 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13295157-4a57-4e7f-9ff4-13c2a4381c27" path="/var/lib/kubelet/pods/13295157-4a57-4e7f-9ff4-13c2a4381c27/volumes"
Mar 18 12:20:00 crc kubenswrapper[4843]: I0318 12:20:00.158915 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563940-q5psz"]
Mar 18 12:20:00 crc kubenswrapper[4843]: E0318 12:20:00.160466 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13295157-4a57-4e7f-9ff4-13c2a4381c27" containerName="registry"
Mar 18 12:20:00 crc kubenswrapper[4843]: I0318 12:20:00.160486 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="13295157-4a57-4e7f-9ff4-13c2a4381c27" containerName="registry"
Mar 18 12:20:00 crc kubenswrapper[4843]: E0318 12:20:00.160499 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1fc99fd-7d70-4257-a31c-ed61653da64e" containerName="oc"
Mar 18 12:20:00 crc kubenswrapper[4843]: I0318 12:20:00.160505 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1fc99fd-7d70-4257-a31c-ed61653da64e" containerName="oc"
Mar 18 12:20:00 crc kubenswrapper[4843]: I0318 12:20:00.160680 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="13295157-4a57-4e7f-9ff4-13c2a4381c27" containerName="registry"
Mar 18 12:20:00 crc kubenswrapper[4843]: I0318 12:20:00.163112 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1fc99fd-7d70-4257-a31c-ed61653da64e" containerName="oc"
Mar 18 12:20:00 crc kubenswrapper[4843]: I0318 12:20:00.164054 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563940-q5psz"
Mar 18 12:20:00 crc kubenswrapper[4843]: I0318 12:20:00.169046 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563940-q5psz"]
Mar 18 12:20:00 crc kubenswrapper[4843]: I0318 12:20:00.170366 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 12:20:00 crc kubenswrapper[4843]: I0318 12:20:00.170437 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 12:20:00 crc kubenswrapper[4843]: I0318 12:20:00.170489 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk"
Mar 18 12:20:00 crc kubenswrapper[4843]: I0318 12:20:00.366450 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wcwg\" (UniqueName: \"kubernetes.io/projected/27f627d5-05cf-46a6-a1f9-65ecf9b3c206-kube-api-access-6wcwg\") pod \"auto-csr-approver-29563940-q5psz\" (UID: \"27f627d5-05cf-46a6-a1f9-65ecf9b3c206\") " pod="openshift-infra/auto-csr-approver-29563940-q5psz"
Mar 18 12:20:00 crc kubenswrapper[4843]: I0318 12:20:00.467877 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wcwg\" (UniqueName: \"kubernetes.io/projected/27f627d5-05cf-46a6-a1f9-65ecf9b3c206-kube-api-access-6wcwg\") pod \"auto-csr-approver-29563940-q5psz\" (UID: \"27f627d5-05cf-46a6-a1f9-65ecf9b3c206\") " pod="openshift-infra/auto-csr-approver-29563940-q5psz"
Mar 18 12:20:00 crc kubenswrapper[4843]: I0318 12:20:00.499970 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wcwg\" (UniqueName: \"kubernetes.io/projected/27f627d5-05cf-46a6-a1f9-65ecf9b3c206-kube-api-access-6wcwg\") pod \"auto-csr-approver-29563940-q5psz\" (UID: \"27f627d5-05cf-46a6-a1f9-65ecf9b3c206\") " pod="openshift-infra/auto-csr-approver-29563940-q5psz"
Mar 18 12:20:00 crc kubenswrapper[4843]: I0318 12:20:00.786256 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563940-q5psz"
Mar 18 12:20:00 crc kubenswrapper[4843]: I0318 12:20:00.998212 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563940-q5psz"]
Mar 18 12:20:01 crc kubenswrapper[4843]: I0318 12:20:01.003303 4843 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 12:20:01 crc kubenswrapper[4843]: I0318 12:20:01.173500 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563940-q5psz" event={"ID":"27f627d5-05cf-46a6-a1f9-65ecf9b3c206","Type":"ContainerStarted","Data":"5613f69602d8472e30f3c06dedceb7bfe8bc2d0c4fa2a60992734e8d87549b73"}
Mar 18 12:20:03 crc kubenswrapper[4843]: I0318 12:20:03.186952 4843 generic.go:334] "Generic (PLEG): container finished" podID="27f627d5-05cf-46a6-a1f9-65ecf9b3c206" containerID="26c6ce1823f1c66b746ee92acfb9fc33322775aebb2130c80065909fd8f65dd5" exitCode=0
Mar 18 12:20:03 crc kubenswrapper[4843]: I0318 12:20:03.187039 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563940-q5psz" event={"ID":"27f627d5-05cf-46a6-a1f9-65ecf9b3c206","Type":"ContainerDied","Data":"26c6ce1823f1c66b746ee92acfb9fc33322775aebb2130c80065909fd8f65dd5"}
Mar 18 12:20:04 crc kubenswrapper[4843]: I0318 12:20:04.486228 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563940-q5psz"
Mar 18 12:20:04 crc kubenswrapper[4843]: I0318 12:20:04.625329 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wcwg\" (UniqueName: \"kubernetes.io/projected/27f627d5-05cf-46a6-a1f9-65ecf9b3c206-kube-api-access-6wcwg\") pod \"27f627d5-05cf-46a6-a1f9-65ecf9b3c206\" (UID: \"27f627d5-05cf-46a6-a1f9-65ecf9b3c206\") "
Mar 18 12:20:04 crc kubenswrapper[4843]: I0318 12:20:04.632012 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f627d5-05cf-46a6-a1f9-65ecf9b3c206-kube-api-access-6wcwg" (OuterVolumeSpecName: "kube-api-access-6wcwg") pod "27f627d5-05cf-46a6-a1f9-65ecf9b3c206" (UID: "27f627d5-05cf-46a6-a1f9-65ecf9b3c206"). InnerVolumeSpecName "kube-api-access-6wcwg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:20:04 crc kubenswrapper[4843]: I0318 12:20:04.727303 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wcwg\" (UniqueName: \"kubernetes.io/projected/27f627d5-05cf-46a6-a1f9-65ecf9b3c206-kube-api-access-6wcwg\") on node \"crc\" DevicePath \"\""
Mar 18 12:20:05 crc kubenswrapper[4843]: I0318 12:20:05.205456 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563940-q5psz" event={"ID":"27f627d5-05cf-46a6-a1f9-65ecf9b3c206","Type":"ContainerDied","Data":"5613f69602d8472e30f3c06dedceb7bfe8bc2d0c4fa2a60992734e8d87549b73"}
Mar 18 12:20:05 crc kubenswrapper[4843]: I0318 12:20:05.205523 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5613f69602d8472e30f3c06dedceb7bfe8bc2d0c4fa2a60992734e8d87549b73"
Mar 18 12:20:05 crc kubenswrapper[4843]: I0318 12:20:05.205531 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563940-q5psz"
Mar 18 12:20:05 crc kubenswrapper[4843]: I0318 12:20:05.571238 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563934-9j5hk"]
Mar 18 12:20:05 crc kubenswrapper[4843]: I0318 12:20:05.575327 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563934-9j5hk"]
Mar 18 12:20:06 crc kubenswrapper[4843]: I0318 12:20:06.994702 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f32540f-28b9-46d1-a943-f04368d4cae2" path="/var/lib/kubelet/pods/4f32540f-28b9-46d1-a943-f04368d4cae2/volumes"
Mar 18 12:20:20 crc kubenswrapper[4843]: I0318 12:20:20.035292 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:20:20 crc kubenswrapper[4843]: I0318 12:20:20.036271 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:20:50 crc kubenswrapper[4843]: I0318 12:20:50.035249 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:20:50 crc kubenswrapper[4843]: I0318 12:20:50.035915 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:21:20 crc kubenswrapper[4843]: I0318 12:21:20.035719 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:21:20 crc kubenswrapper[4843]: I0318 12:21:20.036807 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:21:20 crc kubenswrapper[4843]: I0318 12:21:20.036901 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq"
Mar 18 12:21:20 crc kubenswrapper[4843]: I0318 12:21:20.038149 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c3caf7d8c6845049235506b69806f5942daca0107c049b31b6ec76e46fc65dc"} pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 12:21:20 crc kubenswrapper[4843]: I0318 12:21:20.038239 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" containerID="cri-o://5c3caf7d8c6845049235506b69806f5942daca0107c049b31b6ec76e46fc65dc" gracePeriod=600
Mar 18 12:21:20 crc kubenswrapper[4843]: I0318 12:21:20.896430 4843 generic.go:334] "Generic (PLEG): container finished" podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="5c3caf7d8c6845049235506b69806f5942daca0107c049b31b6ec76e46fc65dc" exitCode=0
Mar 18 12:21:20 crc kubenswrapper[4843]: I0318 12:21:20.897031 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"5c3caf7d8c6845049235506b69806f5942daca0107c049b31b6ec76e46fc65dc"}
Mar 18 12:21:20 crc kubenswrapper[4843]: I0318 12:21:20.897077 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"900a6973f4fc33d51f048ffe75a20de2a300ad414b4d778db1160e26d9d43452"}
Mar 18 12:21:20 crc kubenswrapper[4843]: I0318 12:21:20.897103 4843 scope.go:117] "RemoveContainer" containerID="935c5c8bac02b5ebb6013b7620e469aeb3c950d7626ae59a60e0b050f1d51353"
Mar 18 12:21:56 crc kubenswrapper[4843]: I0318 12:21:56.359132 4843 scope.go:117] "RemoveContainer" containerID="817b68362f28f87c80accada443d0e386ec43d01922fa240f9be74fcf5abe4aa"
Mar 18 12:21:56 crc kubenswrapper[4843]: I0318 12:21:56.403329 4843 scope.go:117] "RemoveContainer" containerID="41a8bab979722aa364412d9f395252731eeafd17df8f23745e560c3fd7f9d30e"
Mar 18 12:22:00 crc kubenswrapper[4843]: I0318 12:22:00.144527 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563942-sgbqf"]
Mar 18 12:22:00 crc kubenswrapper[4843]: E0318 12:22:00.145281 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f627d5-05cf-46a6-a1f9-65ecf9b3c206" containerName="oc"
Mar 18 12:22:00 crc kubenswrapper[4843]: I0318 12:22:00.145305 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f627d5-05cf-46a6-a1f9-65ecf9b3c206" containerName="oc"
Mar 18 12:22:00 crc kubenswrapper[4843]: I0318 12:22:00.146959 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f627d5-05cf-46a6-a1f9-65ecf9b3c206" containerName="oc"
Mar 18 12:22:00 crc kubenswrapper[4843]: I0318 12:22:00.147631 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563942-sgbqf"
Mar 18 12:22:00 crc kubenswrapper[4843]: I0318 12:22:00.150866 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 12:22:00 crc kubenswrapper[4843]: I0318 12:22:00.151020 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk"
Mar 18 12:22:00 crc kubenswrapper[4843]: I0318 12:22:00.152859 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 12:22:00 crc kubenswrapper[4843]: I0318 12:22:00.153120 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563942-sgbqf"]
Mar 18 12:22:00 crc kubenswrapper[4843]: I0318 12:22:00.340199 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc5p5\" (UniqueName: \"kubernetes.io/projected/a7e759d3-8475-46c1-b8ac-7cdca013d031-kube-api-access-cc5p5\") pod \"auto-csr-approver-29563942-sgbqf\" (UID: \"a7e759d3-8475-46c1-b8ac-7cdca013d031\") " pod="openshift-infra/auto-csr-approver-29563942-sgbqf"
Mar 18 12:22:00 crc kubenswrapper[4843]: I0318 12:22:00.441767 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc5p5\" (UniqueName: \"kubernetes.io/projected/a7e759d3-8475-46c1-b8ac-7cdca013d031-kube-api-access-cc5p5\") pod \"auto-csr-approver-29563942-sgbqf\" (UID: \"a7e759d3-8475-46c1-b8ac-7cdca013d031\") " pod="openshift-infra/auto-csr-approver-29563942-sgbqf"
Mar 18 12:22:00 crc kubenswrapper[4843]: I0318 12:22:00.468828 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc5p5\" (UniqueName: \"kubernetes.io/projected/a7e759d3-8475-46c1-b8ac-7cdca013d031-kube-api-access-cc5p5\") pod \"auto-csr-approver-29563942-sgbqf\" (UID: \"a7e759d3-8475-46c1-b8ac-7cdca013d031\") " pod="openshift-infra/auto-csr-approver-29563942-sgbqf"
Mar 18 12:22:00 crc kubenswrapper[4843]: I0318 12:22:00.506079 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563942-sgbqf"
Mar 18 12:22:00 crc kubenswrapper[4843]: I0318 12:22:00.721167 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563942-sgbqf"]
Mar 18 12:22:01 crc kubenswrapper[4843]: I0318 12:22:01.267607 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563942-sgbqf" event={"ID":"a7e759d3-8475-46c1-b8ac-7cdca013d031","Type":"ContainerStarted","Data":"f08f770506fe71b83a6d2e5de4f6a58ba7853878b04a09895afd3bc8d9571eb2"}
Mar 18 12:22:02 crc kubenswrapper[4843]: I0318 12:22:02.273753 4843 generic.go:334] "Generic (PLEG): container finished" podID="a7e759d3-8475-46c1-b8ac-7cdca013d031" containerID="8abbda86c72abc940167f7d62e51c0a7bbfb31a957120d6605251ddb64ca44d8" exitCode=0
Mar 18 12:22:02 crc kubenswrapper[4843]: I0318 12:22:02.273813 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563942-sgbqf" event={"ID":"a7e759d3-8475-46c1-b8ac-7cdca013d031","Type":"ContainerDied","Data":"8abbda86c72abc940167f7d62e51c0a7bbfb31a957120d6605251ddb64ca44d8"}
Mar 18 12:22:03 crc kubenswrapper[4843]: I0318 12:22:03.538501 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563942-sgbqf"
Mar 18 12:22:03 crc kubenswrapper[4843]: I0318 12:22:03.686135 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc5p5\" (UniqueName: \"kubernetes.io/projected/a7e759d3-8475-46c1-b8ac-7cdca013d031-kube-api-access-cc5p5\") pod \"a7e759d3-8475-46c1-b8ac-7cdca013d031\" (UID: \"a7e759d3-8475-46c1-b8ac-7cdca013d031\") "
Mar 18 12:22:03 crc kubenswrapper[4843]: I0318 12:22:03.693069 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7e759d3-8475-46c1-b8ac-7cdca013d031-kube-api-access-cc5p5" (OuterVolumeSpecName: "kube-api-access-cc5p5") pod "a7e759d3-8475-46c1-b8ac-7cdca013d031" (UID: "a7e759d3-8475-46c1-b8ac-7cdca013d031"). InnerVolumeSpecName "kube-api-access-cc5p5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:22:03 crc kubenswrapper[4843]: I0318 12:22:03.787516 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc5p5\" (UniqueName: \"kubernetes.io/projected/a7e759d3-8475-46c1-b8ac-7cdca013d031-kube-api-access-cc5p5\") on node \"crc\" DevicePath \"\""
Mar 18 12:22:04 crc kubenswrapper[4843]: I0318 12:22:04.294922 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563942-sgbqf" event={"ID":"a7e759d3-8475-46c1-b8ac-7cdca013d031","Type":"ContainerDied","Data":"f08f770506fe71b83a6d2e5de4f6a58ba7853878b04a09895afd3bc8d9571eb2"}
Mar 18 12:22:04 crc kubenswrapper[4843]: I0318 12:22:04.295011 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f08f770506fe71b83a6d2e5de4f6a58ba7853878b04a09895afd3bc8d9571eb2"
Mar 18 12:22:04 crc kubenswrapper[4843]: I0318 12:22:04.294959 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563942-sgbqf"
Mar 18 12:22:04 crc kubenswrapper[4843]: I0318 12:22:04.633381 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563936-xkdrn"]
Mar 18 12:22:04 crc kubenswrapper[4843]: I0318 12:22:04.640676 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563936-xkdrn"]
Mar 18 12:22:04 crc kubenswrapper[4843]: I0318 12:22:04.991010 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c22c026-c039-414d-aa42-cfbcd1799c74" path="/var/lib/kubelet/pods/1c22c026-c039-414d-aa42-cfbcd1799c74/volumes"
Mar 18 12:22:56 crc kubenswrapper[4843]: I0318 12:22:56.470379 4843 scope.go:117] "RemoveContainer" containerID="36866618c1f9501a88cf88087a428b34a0727f9f12b83f17d7be3e7af7d7ca01"
Mar 18 12:23:20 crc kubenswrapper[4843]: I0318 12:23:20.034934 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:23:20 crc kubenswrapper[4843]: I0318 12:23:20.035958 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:23:50 crc kubenswrapper[4843]: I0318 12:23:50.035595 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:23:50 crc kubenswrapper[4843]:
I0318 12:23:50.036275 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:23:52 crc kubenswrapper[4843]: I0318 12:23:52.628289 4843 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 12:24:00 crc kubenswrapper[4843]: I0318 12:24:00.149146 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563944-69jz4"] Mar 18 12:24:00 crc kubenswrapper[4843]: E0318 12:24:00.149956 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e759d3-8475-46c1-b8ac-7cdca013d031" containerName="oc" Mar 18 12:24:00 crc kubenswrapper[4843]: I0318 12:24:00.149970 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e759d3-8475-46c1-b8ac-7cdca013d031" containerName="oc" Mar 18 12:24:00 crc kubenswrapper[4843]: I0318 12:24:00.150121 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e759d3-8475-46c1-b8ac-7cdca013d031" containerName="oc" Mar 18 12:24:00 crc kubenswrapper[4843]: I0318 12:24:00.150540 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563944-69jz4" Mar 18 12:24:00 crc kubenswrapper[4843]: I0318 12:24:00.175748 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:24:00 crc kubenswrapper[4843]: I0318 12:24:00.176011 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 12:24:00 crc kubenswrapper[4843]: I0318 12:24:00.178389 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:24:00 crc kubenswrapper[4843]: I0318 12:24:00.212992 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4p4q\" (UniqueName: \"kubernetes.io/projected/0d999a45-20dd-4cc6-820a-1d54ca5f0fe2-kube-api-access-n4p4q\") pod \"auto-csr-approver-29563944-69jz4\" (UID: \"0d999a45-20dd-4cc6-820a-1d54ca5f0fe2\") " pod="openshift-infra/auto-csr-approver-29563944-69jz4" Mar 18 12:24:00 crc kubenswrapper[4843]: I0318 12:24:00.219611 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563944-69jz4"] Mar 18 12:24:00 crc kubenswrapper[4843]: I0318 12:24:00.314224 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4p4q\" (UniqueName: \"kubernetes.io/projected/0d999a45-20dd-4cc6-820a-1d54ca5f0fe2-kube-api-access-n4p4q\") pod \"auto-csr-approver-29563944-69jz4\" (UID: \"0d999a45-20dd-4cc6-820a-1d54ca5f0fe2\") " pod="openshift-infra/auto-csr-approver-29563944-69jz4" Mar 18 12:24:00 crc kubenswrapper[4843]: I0318 12:24:00.334692 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4p4q\" (UniqueName: \"kubernetes.io/projected/0d999a45-20dd-4cc6-820a-1d54ca5f0fe2-kube-api-access-n4p4q\") pod \"auto-csr-approver-29563944-69jz4\" (UID: \"0d999a45-20dd-4cc6-820a-1d54ca5f0fe2\") " 
pod="openshift-infra/auto-csr-approver-29563944-69jz4" Mar 18 12:24:00 crc kubenswrapper[4843]: I0318 12:24:00.490820 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563944-69jz4" Mar 18 12:24:01 crc kubenswrapper[4843]: I0318 12:24:01.003953 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563944-69jz4"] Mar 18 12:24:01 crc kubenswrapper[4843]: I0318 12:24:01.182573 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563944-69jz4" event={"ID":"0d999a45-20dd-4cc6-820a-1d54ca5f0fe2","Type":"ContainerStarted","Data":"3e568ac2c4518dff8296570d6fd6b0efedaa020149f82ae3790ef71dfa995a76"} Mar 18 12:24:03 crc kubenswrapper[4843]: I0318 12:24:03.195961 4843 generic.go:334] "Generic (PLEG): container finished" podID="0d999a45-20dd-4cc6-820a-1d54ca5f0fe2" containerID="89710d877cb10056d8c3b344b39be2354aff6222e9a15e23590a0650948a2b32" exitCode=0 Mar 18 12:24:03 crc kubenswrapper[4843]: I0318 12:24:03.196027 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563944-69jz4" event={"ID":"0d999a45-20dd-4cc6-820a-1d54ca5f0fe2","Type":"ContainerDied","Data":"89710d877cb10056d8c3b344b39be2354aff6222e9a15e23590a0650948a2b32"} Mar 18 12:24:04 crc kubenswrapper[4843]: I0318 12:24:04.406401 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563944-69jz4" Mar 18 12:24:04 crc kubenswrapper[4843]: I0318 12:24:04.575526 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4p4q\" (UniqueName: \"kubernetes.io/projected/0d999a45-20dd-4cc6-820a-1d54ca5f0fe2-kube-api-access-n4p4q\") pod \"0d999a45-20dd-4cc6-820a-1d54ca5f0fe2\" (UID: \"0d999a45-20dd-4cc6-820a-1d54ca5f0fe2\") " Mar 18 12:24:04 crc kubenswrapper[4843]: I0318 12:24:04.582207 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d999a45-20dd-4cc6-820a-1d54ca5f0fe2-kube-api-access-n4p4q" (OuterVolumeSpecName: "kube-api-access-n4p4q") pod "0d999a45-20dd-4cc6-820a-1d54ca5f0fe2" (UID: "0d999a45-20dd-4cc6-820a-1d54ca5f0fe2"). InnerVolumeSpecName "kube-api-access-n4p4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:24:04 crc kubenswrapper[4843]: I0318 12:24:04.676809 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4p4q\" (UniqueName: \"kubernetes.io/projected/0d999a45-20dd-4cc6-820a-1d54ca5f0fe2-kube-api-access-n4p4q\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:05 crc kubenswrapper[4843]: I0318 12:24:05.208321 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563944-69jz4" event={"ID":"0d999a45-20dd-4cc6-820a-1d54ca5f0fe2","Type":"ContainerDied","Data":"3e568ac2c4518dff8296570d6fd6b0efedaa020149f82ae3790ef71dfa995a76"} Mar 18 12:24:05 crc kubenswrapper[4843]: I0318 12:24:05.208415 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563944-69jz4" Mar 18 12:24:05 crc kubenswrapper[4843]: I0318 12:24:05.208443 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e568ac2c4518dff8296570d6fd6b0efedaa020149f82ae3790ef71dfa995a76" Mar 18 12:24:05 crc kubenswrapper[4843]: I0318 12:24:05.581001 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563938-xd75z"] Mar 18 12:24:05 crc kubenswrapper[4843]: I0318 12:24:05.585223 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563938-xd75z"] Mar 18 12:24:06 crc kubenswrapper[4843]: I0318 12:24:06.993490 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1fc99fd-7d70-4257-a31c-ed61653da64e" path="/var/lib/kubelet/pods/f1fc99fd-7d70-4257-a31c-ed61653da64e/volumes" Mar 18 12:24:20 crc kubenswrapper[4843]: I0318 12:24:20.035851 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:24:20 crc kubenswrapper[4843]: I0318 12:24:20.036849 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:24:20 crc kubenswrapper[4843]: I0318 12:24:20.036945 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:24:20 crc kubenswrapper[4843]: I0318 12:24:20.038098 4843 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"900a6973f4fc33d51f048ffe75a20de2a300ad414b4d778db1160e26d9d43452"} pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:24:20 crc kubenswrapper[4843]: I0318 12:24:20.038263 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" containerID="cri-o://900a6973f4fc33d51f048ffe75a20de2a300ad414b4d778db1160e26d9d43452" gracePeriod=600 Mar 18 12:24:20 crc kubenswrapper[4843]: I0318 12:24:20.296763 4843 generic.go:334] "Generic (PLEG): container finished" podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="900a6973f4fc33d51f048ffe75a20de2a300ad414b4d778db1160e26d9d43452" exitCode=0 Mar 18 12:24:20 crc kubenswrapper[4843]: I0318 12:24:20.296811 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"900a6973f4fc33d51f048ffe75a20de2a300ad414b4d778db1160e26d9d43452"} Mar 18 12:24:20 crc kubenswrapper[4843]: I0318 12:24:20.297920 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"156751a099ebefa58e45dd19fa380fffeac977e92f6bd61d7c8b0b1be68aae80"} Mar 18 12:24:20 crc kubenswrapper[4843]: I0318 12:24:20.298013 4843 scope.go:117] "RemoveContainer" containerID="5c3caf7d8c6845049235506b69806f5942daca0107c049b31b6ec76e46fc65dc" Mar 18 12:24:56 crc kubenswrapper[4843]: I0318 12:24:56.535233 4843 scope.go:117] "RemoveContainer" containerID="58597e6eb6a7a09eb41b37c2812d53fe1f1358c6858cdb2f6cc70f038ff2d61f" Mar 18 
12:25:07 crc kubenswrapper[4843]: I0318 12:25:07.907012 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-6fgmp"] Mar 18 12:25:07 crc kubenswrapper[4843]: E0318 12:25:07.907955 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d999a45-20dd-4cc6-820a-1d54ca5f0fe2" containerName="oc" Mar 18 12:25:07 crc kubenswrapper[4843]: I0318 12:25:07.907974 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d999a45-20dd-4cc6-820a-1d54ca5f0fe2" containerName="oc" Mar 18 12:25:07 crc kubenswrapper[4843]: I0318 12:25:07.908105 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d999a45-20dd-4cc6-820a-1d54ca5f0fe2" containerName="oc" Mar 18 12:25:07 crc kubenswrapper[4843]: I0318 12:25:07.908694 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6fgmp" Mar 18 12:25:07 crc kubenswrapper[4843]: I0318 12:25:07.911407 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 18 12:25:07 crc kubenswrapper[4843]: I0318 12:25:07.911488 4843 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-p4szz" Mar 18 12:25:07 crc kubenswrapper[4843]: I0318 12:25:07.911939 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 18 12:25:07 crc kubenswrapper[4843]: I0318 12:25:07.927386 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-jwmgj"] Mar 18 12:25:07 crc kubenswrapper[4843]: I0318 12:25:07.928242 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-jwmgj" Mar 18 12:25:07 crc kubenswrapper[4843]: I0318 12:25:07.930540 4843 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-f2nw2" Mar 18 12:25:07 crc kubenswrapper[4843]: I0318 12:25:07.933673 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-6fgmp"] Mar 18 12:25:07 crc kubenswrapper[4843]: I0318 12:25:07.958138 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-v5tld"] Mar 18 12:25:07 crc kubenswrapper[4843]: I0318 12:25:07.959025 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-v5tld" Mar 18 12:25:07 crc kubenswrapper[4843]: I0318 12:25:07.961037 4843 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-2jvpt" Mar 18 12:25:07 crc kubenswrapper[4843]: I0318 12:25:07.963055 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-jwmgj"] Mar 18 12:25:07 crc kubenswrapper[4843]: I0318 12:25:07.967509 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-v5tld"] Mar 18 12:25:08 crc kubenswrapper[4843]: I0318 12:25:08.061271 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mpbb\" (UniqueName: \"kubernetes.io/projected/8047c536-ad04-4693-9e68-a75e89953f61-kube-api-access-7mpbb\") pod \"cert-manager-858654f9db-jwmgj\" (UID: \"8047c536-ad04-4693-9e68-a75e89953f61\") " pod="cert-manager/cert-manager-858654f9db-jwmgj" Mar 18 12:25:08 crc kubenswrapper[4843]: I0318 12:25:08.061389 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghwxl\" (UniqueName: 
\"kubernetes.io/projected/e0a593cb-58ea-462d-994b-eafeebd4a2e1-kube-api-access-ghwxl\") pod \"cert-manager-webhook-687f57d79b-v5tld\" (UID: \"e0a593cb-58ea-462d-994b-eafeebd4a2e1\") " pod="cert-manager/cert-manager-webhook-687f57d79b-v5tld" Mar 18 12:25:08 crc kubenswrapper[4843]: I0318 12:25:08.061434 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htzxt\" (UniqueName: \"kubernetes.io/projected/7a983b8c-1c33-466e-9a33-f25f8b160273-kube-api-access-htzxt\") pod \"cert-manager-cainjector-cf98fcc89-6fgmp\" (UID: \"7a983b8c-1c33-466e-9a33-f25f8b160273\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-6fgmp" Mar 18 12:25:08 crc kubenswrapper[4843]: I0318 12:25:08.162589 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghwxl\" (UniqueName: \"kubernetes.io/projected/e0a593cb-58ea-462d-994b-eafeebd4a2e1-kube-api-access-ghwxl\") pod \"cert-manager-webhook-687f57d79b-v5tld\" (UID: \"e0a593cb-58ea-462d-994b-eafeebd4a2e1\") " pod="cert-manager/cert-manager-webhook-687f57d79b-v5tld" Mar 18 12:25:08 crc kubenswrapper[4843]: I0318 12:25:08.162635 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htzxt\" (UniqueName: \"kubernetes.io/projected/7a983b8c-1c33-466e-9a33-f25f8b160273-kube-api-access-htzxt\") pod \"cert-manager-cainjector-cf98fcc89-6fgmp\" (UID: \"7a983b8c-1c33-466e-9a33-f25f8b160273\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-6fgmp" Mar 18 12:25:08 crc kubenswrapper[4843]: I0318 12:25:08.162706 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mpbb\" (UniqueName: \"kubernetes.io/projected/8047c536-ad04-4693-9e68-a75e89953f61-kube-api-access-7mpbb\") pod \"cert-manager-858654f9db-jwmgj\" (UID: \"8047c536-ad04-4693-9e68-a75e89953f61\") " pod="cert-manager/cert-manager-858654f9db-jwmgj" Mar 18 12:25:08 crc kubenswrapper[4843]: I0318 
12:25:08.190952 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htzxt\" (UniqueName: \"kubernetes.io/projected/7a983b8c-1c33-466e-9a33-f25f8b160273-kube-api-access-htzxt\") pod \"cert-manager-cainjector-cf98fcc89-6fgmp\" (UID: \"7a983b8c-1c33-466e-9a33-f25f8b160273\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-6fgmp" Mar 18 12:25:08 crc kubenswrapper[4843]: I0318 12:25:08.198026 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghwxl\" (UniqueName: \"kubernetes.io/projected/e0a593cb-58ea-462d-994b-eafeebd4a2e1-kube-api-access-ghwxl\") pod \"cert-manager-webhook-687f57d79b-v5tld\" (UID: \"e0a593cb-58ea-462d-994b-eafeebd4a2e1\") " pod="cert-manager/cert-manager-webhook-687f57d79b-v5tld" Mar 18 12:25:08 crc kubenswrapper[4843]: I0318 12:25:08.202330 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mpbb\" (UniqueName: \"kubernetes.io/projected/8047c536-ad04-4693-9e68-a75e89953f61-kube-api-access-7mpbb\") pod \"cert-manager-858654f9db-jwmgj\" (UID: \"8047c536-ad04-4693-9e68-a75e89953f61\") " pod="cert-manager/cert-manager-858654f9db-jwmgj" Mar 18 12:25:08 crc kubenswrapper[4843]: I0318 12:25:08.228610 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6fgmp" Mar 18 12:25:08 crc kubenswrapper[4843]: I0318 12:25:08.249663 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-jwmgj" Mar 18 12:25:08 crc kubenswrapper[4843]: I0318 12:25:08.276039 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-v5tld" Mar 18 12:25:08 crc kubenswrapper[4843]: I0318 12:25:08.521010 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-6fgmp"] Mar 18 12:25:08 crc kubenswrapper[4843]: W0318 12:25:08.534080 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a983b8c_1c33_466e_9a33_f25f8b160273.slice/crio-0443d02dbceab9fc946c50ca658974432c0bc2fbcd3bf553973958a30ac9b7c7 WatchSource:0}: Error finding container 0443d02dbceab9fc946c50ca658974432c0bc2fbcd3bf553973958a30ac9b7c7: Status 404 returned error can't find the container with id 0443d02dbceab9fc946c50ca658974432c0bc2fbcd3bf553973958a30ac9b7c7 Mar 18 12:25:08 crc kubenswrapper[4843]: I0318 12:25:08.550877 4843 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:25:08 crc kubenswrapper[4843]: I0318 12:25:08.569550 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-jwmgj"] Mar 18 12:25:08 crc kubenswrapper[4843]: W0318 12:25:08.579032 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8047c536_ad04_4693_9e68_a75e89953f61.slice/crio-cd31837cd308f6293bdf88b21a92fd0836174ea90b1cccdb3ccc14a20b912a71 WatchSource:0}: Error finding container cd31837cd308f6293bdf88b21a92fd0836174ea90b1cccdb3ccc14a20b912a71: Status 404 returned error can't find the container with id cd31837cd308f6293bdf88b21a92fd0836174ea90b1cccdb3ccc14a20b912a71 Mar 18 12:25:08 crc kubenswrapper[4843]: I0318 12:25:08.628060 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-v5tld"] Mar 18 12:25:08 crc kubenswrapper[4843]: I0318 12:25:08.849260 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-6fgmp" event={"ID":"7a983b8c-1c33-466e-9a33-f25f8b160273","Type":"ContainerStarted","Data":"0443d02dbceab9fc946c50ca658974432c0bc2fbcd3bf553973958a30ac9b7c7"} Mar 18 12:25:08 crc kubenswrapper[4843]: I0318 12:25:08.850376 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-v5tld" event={"ID":"e0a593cb-58ea-462d-994b-eafeebd4a2e1","Type":"ContainerStarted","Data":"6b210e493e97a93f33a57f62567daca61005bee048b49ab1ba032bc4d5d5f604"} Mar 18 12:25:08 crc kubenswrapper[4843]: I0318 12:25:08.851510 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-jwmgj" event={"ID":"8047c536-ad04-4693-9e68-a75e89953f61","Type":"ContainerStarted","Data":"cd31837cd308f6293bdf88b21a92fd0836174ea90b1cccdb3ccc14a20b912a71"} Mar 18 12:25:11 crc kubenswrapper[4843]: I0318 12:25:11.871952 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6fgmp" event={"ID":"7a983b8c-1c33-466e-9a33-f25f8b160273","Type":"ContainerStarted","Data":"b0fae723b31fc2a51c83f9ad857986b5e7931a924c6af5280e9b51e27b7a213f"} Mar 18 12:25:11 crc kubenswrapper[4843]: I0318 12:25:11.887773 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6fgmp" podStartSLOduration=1.799790701 podStartE2EDuration="4.887752708s" podCreationTimestamp="2026-03-18 12:25:07 +0000 UTC" firstStartedPulling="2026-03-18 12:25:08.550264778 +0000 UTC m=+942.266090302" lastFinishedPulling="2026-03-18 12:25:11.638226785 +0000 UTC m=+945.354052309" observedRunningTime="2026-03-18 12:25:11.886805001 +0000 UTC m=+945.602630525" watchObservedRunningTime="2026-03-18 12:25:11.887752708 +0000 UTC m=+945.603578232" Mar 18 12:25:13 crc kubenswrapper[4843]: I0318 12:25:13.886068 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-v5tld" 
event={"ID":"e0a593cb-58ea-462d-994b-eafeebd4a2e1","Type":"ContainerStarted","Data":"ee4f565124588a51686580cf4cbb318df048b850b1fc61e96c382bc4bcb6a0ab"} Mar 18 12:25:13 crc kubenswrapper[4843]: I0318 12:25:13.886462 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-v5tld" Mar 18 12:25:13 crc kubenswrapper[4843]: I0318 12:25:13.887899 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-jwmgj" event={"ID":"8047c536-ad04-4693-9e68-a75e89953f61","Type":"ContainerStarted","Data":"e75e14de10f361672495e3d870d44d69da01c53ecf0ff792dac0eb2735971f67"} Mar 18 12:25:13 crc kubenswrapper[4843]: I0318 12:25:13.906222 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-v5tld" podStartSLOduration=2.243111752 podStartE2EDuration="6.906201643s" podCreationTimestamp="2026-03-18 12:25:07 +0000 UTC" firstStartedPulling="2026-03-18 12:25:08.634161686 +0000 UTC m=+942.349987210" lastFinishedPulling="2026-03-18 12:25:13.297251577 +0000 UTC m=+947.013077101" observedRunningTime="2026-03-18 12:25:13.901591601 +0000 UTC m=+947.617417125" watchObservedRunningTime="2026-03-18 12:25:13.906201643 +0000 UTC m=+947.622027167" Mar 18 12:25:13 crc kubenswrapper[4843]: I0318 12:25:13.921208 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-jwmgj" podStartSLOduration=2.168957913 podStartE2EDuration="6.921188062s" podCreationTimestamp="2026-03-18 12:25:07 +0000 UTC" firstStartedPulling="2026-03-18 12:25:08.581937233 +0000 UTC m=+942.297762757" lastFinishedPulling="2026-03-18 12:25:13.334167382 +0000 UTC m=+947.049992906" observedRunningTime="2026-03-18 12:25:13.918187536 +0000 UTC m=+947.634013060" watchObservedRunningTime="2026-03-18 12:25:13.921188062 +0000 UTC m=+947.637013586" Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.619535 4843 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bc7c6"] Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.620222 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovn-controller" containerID="cri-o://98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4" gracePeriod=30 Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.620282 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="nbdb" containerID="cri-o://38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e" gracePeriod=30 Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.620363 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101" gracePeriod=30 Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.620382 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="kube-rbac-proxy-node" containerID="cri-o://5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c" gracePeriod=30 Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.620425 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovn-acl-logging" containerID="cri-o://0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f" gracePeriod=30 Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 
12:25:17.620703 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="sbdb" containerID="cri-o://22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94" gracePeriod=30 Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.620352 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="northd" containerID="cri-o://d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce" gracePeriod=30 Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.663873 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovnkube-controller" containerID="cri-o://39bf6650b8bad5f54492fd2e7d7f7e4e2087d3fd5dec7dcae16607953e5712af" gracePeriod=30 Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.913330 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ls8r8_1aa16ddb-306b-4e37-a33a-b9cdce3c254e/kube-multus/2.log" Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.913858 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ls8r8_1aa16ddb-306b-4e37-a33a-b9cdce3c254e/kube-multus/1.log" Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.913891 4843 generic.go:334] "Generic (PLEG): container finished" podID="1aa16ddb-306b-4e37-a33a-b9cdce3c254e" containerID="bd2f541f29cc54ab425776fa1e30b681a6fbe2865d995ebbfc75b4407a9e2df0" exitCode=2 Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.913937 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ls8r8" 
event={"ID":"1aa16ddb-306b-4e37-a33a-b9cdce3c254e","Type":"ContainerDied","Data":"bd2f541f29cc54ab425776fa1e30b681a6fbe2865d995ebbfc75b4407a9e2df0"}
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.913968 4843 scope.go:117] "RemoveContainer" containerID="b99a8424a67312606a51f09b6fe9197761584e8c306318cdb3f765a2aafc3223"
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.914431 4843 scope.go:117] "RemoveContainer" containerID="bd2f541f29cc54ab425776fa1e30b681a6fbe2865d995ebbfc75b4407a9e2df0"
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.916903 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovnkube-controller/3.log"
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.925178 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovn-acl-logging/0.log"
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.925899 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovn-controller/0.log"
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.926282 4843 generic.go:334] "Generic (PLEG): container finished" podID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerID="39bf6650b8bad5f54492fd2e7d7f7e4e2087d3fd5dec7dcae16607953e5712af" exitCode=0
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.926311 4843 generic.go:334] "Generic (PLEG): container finished" podID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerID="22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94" exitCode=0
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.926320 4843 generic.go:334] "Generic (PLEG): container finished" podID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerID="38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e" exitCode=0
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.926329 4843 generic.go:334] "Generic (PLEG): container finished" podID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerID="d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce" exitCode=0
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.926338 4843 generic.go:334] "Generic (PLEG): container finished" podID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerID="ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101" exitCode=0
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.926349 4843 generic.go:334] "Generic (PLEG): container finished" podID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerID="5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c" exitCode=0
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.926358 4843 generic.go:334] "Generic (PLEG): container finished" podID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerID="0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f" exitCode=143
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.926366 4843 generic.go:334] "Generic (PLEG): container finished" podID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerID="98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4" exitCode=143
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.926389 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerDied","Data":"39bf6650b8bad5f54492fd2e7d7f7e4e2087d3fd5dec7dcae16607953e5712af"}
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.926415 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerDied","Data":"22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94"}
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.926428 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerDied","Data":"38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e"}
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.926441 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerDied","Data":"d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce"}
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.926452 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerDied","Data":"ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101"}
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.926463 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerDied","Data":"5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c"}
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.926475 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerDied","Data":"0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f"}
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.926486 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerDied","Data":"98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4"}
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.926498 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" event={"ID":"45174fd5-3a94-47fe-81c3-18bd634c4fcf","Type":"ContainerDied","Data":"aedfbd173c27261b62efec95c11641890eca11561d631f6c5fb684303fbdb259"}
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.926509 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aedfbd173c27261b62efec95c11641890eca11561d631f6c5fb684303fbdb259"
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.940760 4843 scope.go:117] "RemoveContainer" containerID="3bdbcb1a9e9191c9d4a7ac1e7144ee2cdd03aaa1197ee2b2842d23d84dcfaa77"
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.957371 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovn-acl-logging/0.log"
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.958362 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovn-controller/0.log"
Mar 18 12:25:17 crc kubenswrapper[4843]: I0318 12:25:17.959069 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022256 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qwcw4"]
Mar 18 12:25:18 crc kubenswrapper[4843]: E0318 12:25:18.022535 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovn-controller"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022552 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovn-controller"
Mar 18 12:25:18 crc kubenswrapper[4843]: E0318 12:25:18.022566 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="kube-rbac-proxy-node"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022573 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="kube-rbac-proxy-node"
Mar 18 12:25:18 crc kubenswrapper[4843]: E0318 12:25:18.022587 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovnkube-controller"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022599 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovnkube-controller"
Mar 18 12:25:18 crc kubenswrapper[4843]: E0318 12:25:18.022608 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="nbdb"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022615 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="nbdb"
Mar 18 12:25:18 crc kubenswrapper[4843]: E0318 12:25:18.022624 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovnkube-controller"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022632 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovnkube-controller"
Mar 18 12:25:18 crc kubenswrapper[4843]: E0318 12:25:18.022661 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="kube-rbac-proxy-ovn-metrics"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022670 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="kube-rbac-proxy-ovn-metrics"
Mar 18 12:25:18 crc kubenswrapper[4843]: E0318 12:25:18.022682 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovnkube-controller"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022688 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovnkube-controller"
Mar 18 12:25:18 crc kubenswrapper[4843]: E0318 12:25:18.022697 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="sbdb"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022704 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="sbdb"
Mar 18 12:25:18 crc kubenswrapper[4843]: E0318 12:25:18.022715 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovnkube-controller"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022723 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovnkube-controller"
Mar 18 12:25:18 crc kubenswrapper[4843]: E0318 12:25:18.022732 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovn-acl-logging"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022750 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovn-acl-logging"
Mar 18 12:25:18 crc kubenswrapper[4843]: E0318 12:25:18.022760 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="northd"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022767 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="northd"
Mar 18 12:25:18 crc kubenswrapper[4843]: E0318 12:25:18.022777 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="kubecfg-setup"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022784 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="kubecfg-setup"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022882 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="kube-rbac-proxy-ovn-metrics"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022897 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovnkube-controller"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022904 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovn-acl-logging"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022911 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="kube-rbac-proxy-node"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022921 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="northd"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022927 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovnkube-controller"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022934 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovn-controller"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022940 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovnkube-controller"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022948 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="sbdb"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022954 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="nbdb"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.022963 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovnkube-controller"
Mar 18 12:25:18 crc kubenswrapper[4843]: E0318 12:25:18.023047 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovnkube-controller"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.023054 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovnkube-controller"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.023144 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" containerName="ovnkube-controller"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.024957 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.052708 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-var-lib-openvswitch\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.056883 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-node-log\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.056928 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/45174fd5-3a94-47fe-81c3-18bd634c4fcf-ovn-node-metrics-cert\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.056956 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84n7m\" (UniqueName: \"kubernetes.io/projected/45174fd5-3a94-47fe-81c3-18bd634c4fcf-kube-api-access-84n7m\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.056982 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-kubelet\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057006 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/45174fd5-3a94-47fe-81c3-18bd634c4fcf-env-overrides\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057032 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-run-ovn\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057055 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-run-systemd\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057071 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/45174fd5-3a94-47fe-81c3-18bd634c4fcf-ovnkube-config\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057089 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-systemd-units\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057102 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-run-openvswitch\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057118 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-slash\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057140 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-run-netns\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057169 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-cni-netd\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057206 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-log-socket\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057239 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.052828 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057281 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-cni-bin\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057227 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057305 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-run-ovn-kubernetes\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057323 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-node-log" (OuterVolumeSpecName: "node-log") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057330 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-etc-openvswitch\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057355 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057389 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/45174fd5-3a94-47fe-81c3-18bd634c4fcf-ovnkube-script-lib\") pod \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\" (UID: \"45174fd5-3a94-47fe-81c3-18bd634c4fcf\") "
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057614 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-systemd-units\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057692 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-cni-bin\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057738 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057796 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43aa7eeb-1691-4df9-9315-ffeceefa5da6-ovn-node-metrics-cert\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057870 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-log-socket\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057915 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-run-openvswitch\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.057970 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-etc-openvswitch\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.058007 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-slash\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.058061 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-run-ovn\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.058097 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43aa7eeb-1691-4df9-9315-ffeceefa5da6-ovnkube-script-lib\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.058720 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.058771 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45174fd5-3a94-47fe-81c3-18bd634c4fcf-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.058974 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45174fd5-3a94-47fe-81c3-18bd634c4fcf-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059003 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059020 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059053 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-node-log\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059084 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-run-systemd\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059095 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059139 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-cni-netd\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059123 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059136 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059166 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43aa7eeb-1691-4df9-9315-ffeceefa5da6-ovnkube-config\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059171 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059187 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43aa7eeb-1691-4df9-9315-ffeceefa5da6-env-overrides\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059191 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-log-socket" (OuterVolumeSpecName: "log-socket") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059205 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059261 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-kubelet\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059337 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-run-netns\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059493 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45174fd5-3a94-47fe-81c3-18bd634c4fcf-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059050 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-slash" (OuterVolumeSpecName: "host-slash") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059668 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-run-ovn-kubernetes\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059723 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-var-lib-openvswitch\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059778 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmlfc\" (UniqueName: \"kubernetes.io/projected/43aa7eeb-1691-4df9-9315-ffeceefa5da6-kube-api-access-qmlfc\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4"
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059845 4843 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-cni-netd\") on node \"crc\" DevicePath \"\""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059862 4843 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-log-socket\") on node \"crc\" DevicePath \"\""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059876 4843 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059904 4843 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-cni-bin\") on node \"crc\" DevicePath \"\""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059916 4843 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059931 4843 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059940 4843 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/45174fd5-3a94-47fe-81c3-18bd634c4fcf-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059951 4843 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 18 12:25:18 crc
kubenswrapper[4843]: I0318 12:25:18.059964 4843 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-node-log\") on node \"crc\" DevicePath \"\"" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059976 4843 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059987 4843 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/45174fd5-3a94-47fe-81c3-18bd634c4fcf-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.059998 4843 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.060009 4843 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/45174fd5-3a94-47fe-81c3-18bd634c4fcf-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.060019 4843 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.060029 4843 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.060040 4843 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-slash\") on node \"crc\" DevicePath \"\"" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.060051 4843 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.064113 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45174fd5-3a94-47fe-81c3-18bd634c4fcf-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.064778 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45174fd5-3a94-47fe-81c3-18bd634c4fcf-kube-api-access-84n7m" (OuterVolumeSpecName: "kube-api-access-84n7m") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "kube-api-access-84n7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.072818 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "45174fd5-3a94-47fe-81c3-18bd634c4fcf" (UID: "45174fd5-3a94-47fe-81c3-18bd634c4fcf"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.161546 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.161618 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43aa7eeb-1691-4df9-9315-ffeceefa5da6-ovn-node-metrics-cert\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.161673 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-log-socket\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.161699 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-run-openvswitch\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.161724 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-etc-openvswitch\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.161748 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-slash\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.161961 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-run-ovn\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162006 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-run-openvswitch\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162016 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43aa7eeb-1691-4df9-9315-ffeceefa5da6-ovnkube-script-lib\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162156 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-run-ovn\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162085 4843 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-log-socket\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162201 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-node-log\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.161711 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162233 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-run-systemd\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162165 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-etc-openvswitch\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162297 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-cni-netd\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162310 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-run-systemd\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162281 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-node-log\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162332 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43aa7eeb-1691-4df9-9315-ffeceefa5da6-ovnkube-config\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162446 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43aa7eeb-1691-4df9-9315-ffeceefa5da6-env-overrides\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162382 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-cni-netd\") pod 
\"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162509 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-kubelet\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162569 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-kubelet\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162583 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-run-netns\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162629 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-run-ovn-kubernetes\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162717 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-var-lib-openvswitch\") pod \"ovnkube-node-qwcw4\" (UID: 
\"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162744 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-run-netns\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162831 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmlfc\" (UniqueName: \"kubernetes.io/projected/43aa7eeb-1691-4df9-9315-ffeceefa5da6-kube-api-access-qmlfc\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162847 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-run-ovn-kubernetes\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162840 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-var-lib-openvswitch\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162942 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-systemd-units\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.162975 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43aa7eeb-1691-4df9-9315-ffeceefa5da6-ovnkube-script-lib\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.163016 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-cni-bin\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.163049 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-systemd-units\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.163119 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43aa7eeb-1691-4df9-9315-ffeceefa5da6-ovnkube-config\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.163139 4843 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/45174fd5-3a94-47fe-81c3-18bd634c4fcf-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.163146 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-cni-bin\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.163180 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43aa7eeb-1691-4df9-9315-ffeceefa5da6-host-slash\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.163174 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84n7m\" (UniqueName: \"kubernetes.io/projected/45174fd5-3a94-47fe-81c3-18bd634c4fcf-kube-api-access-84n7m\") on node \"crc\" DevicePath \"\"" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.163221 4843 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/45174fd5-3a94-47fe-81c3-18bd634c4fcf-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.163505 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43aa7eeb-1691-4df9-9315-ffeceefa5da6-env-overrides\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.166868 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43aa7eeb-1691-4df9-9315-ffeceefa5da6-ovn-node-metrics-cert\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.192601 4843 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qmlfc\" (UniqueName: \"kubernetes.io/projected/43aa7eeb-1691-4df9-9315-ffeceefa5da6-kube-api-access-qmlfc\") pod \"ovnkube-node-qwcw4\" (UID: \"43aa7eeb-1691-4df9-9315-ffeceefa5da6\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.278288 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-v5tld" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.340866 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:18 crc kubenswrapper[4843]: W0318 12:25:18.370058 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43aa7eeb_1691_4df9_9315_ffeceefa5da6.slice/crio-af42e49c023150e282f4aee110a57216e8fb56e2f374c62f3848ebe6a182b86e WatchSource:0}: Error finding container af42e49c023150e282f4aee110a57216e8fb56e2f374c62f3848ebe6a182b86e: Status 404 returned error can't find the container with id af42e49c023150e282f4aee110a57216e8fb56e2f374c62f3848ebe6a182b86e Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.935345 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovn-acl-logging/0.log" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.935850 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bc7c6_45174fd5-3a94-47fe-81c3-18bd634c4fcf/ovn-controller/0.log" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.936543 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bc7c6" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.941046 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ls8r8_1aa16ddb-306b-4e37-a33a-b9cdce3c254e/kube-multus/2.log" Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.941148 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ls8r8" event={"ID":"1aa16ddb-306b-4e37-a33a-b9cdce3c254e","Type":"ContainerStarted","Data":"1ea849c9fa4ed01c3fe6a9cdc3978e6835e50f3d38504ab2e40b9b7d3c097bfd"} Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.942536 4843 generic.go:334] "Generic (PLEG): container finished" podID="43aa7eeb-1691-4df9-9315-ffeceefa5da6" containerID="890418a3006cca6847a7e85341d01d1d10b120f3b0dd8dcaf859b67127a84733" exitCode=0 Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.942575 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" event={"ID":"43aa7eeb-1691-4df9-9315-ffeceefa5da6","Type":"ContainerDied","Data":"890418a3006cca6847a7e85341d01d1d10b120f3b0dd8dcaf859b67127a84733"} Mar 18 12:25:18 crc kubenswrapper[4843]: I0318 12:25:18.942599 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" event={"ID":"43aa7eeb-1691-4df9-9315-ffeceefa5da6","Type":"ContainerStarted","Data":"af42e49c023150e282f4aee110a57216e8fb56e2f374c62f3848ebe6a182b86e"} Mar 18 12:25:19 crc kubenswrapper[4843]: I0318 12:25:19.037277 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bc7c6"] Mar 18 12:25:19 crc kubenswrapper[4843]: I0318 12:25:19.041540 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bc7c6"] Mar 18 12:25:19 crc kubenswrapper[4843]: I0318 12:25:19.953289 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" 
event={"ID":"43aa7eeb-1691-4df9-9315-ffeceefa5da6","Type":"ContainerStarted","Data":"d5b6c37d6debf92ad55cce012de0c8ac9fe022676b525d33b6476d3049d388ed"} Mar 18 12:25:19 crc kubenswrapper[4843]: I0318 12:25:19.953632 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" event={"ID":"43aa7eeb-1691-4df9-9315-ffeceefa5da6","Type":"ContainerStarted","Data":"cd18925ccd40f22edaea634d890f0651967492da1d010f343c2ffe0004450936"} Mar 18 12:25:19 crc kubenswrapper[4843]: I0318 12:25:19.953642 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" event={"ID":"43aa7eeb-1691-4df9-9315-ffeceefa5da6","Type":"ContainerStarted","Data":"c8317eeb6958ee7755dea359457700025ae21c8cab47f0dd40d0969967c8c5a4"} Mar 18 12:25:19 crc kubenswrapper[4843]: I0318 12:25:19.953673 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" event={"ID":"43aa7eeb-1691-4df9-9315-ffeceefa5da6","Type":"ContainerStarted","Data":"1b376afdbaec7c61eb9e9d8952a6a6262e44aef6a482ff5fc188f778627e5f80"} Mar 18 12:25:19 crc kubenswrapper[4843]: I0318 12:25:19.953682 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" event={"ID":"43aa7eeb-1691-4df9-9315-ffeceefa5da6","Type":"ContainerStarted","Data":"c8804bfaefec0d97d4b2eb0aafd59ee771914c89c64fe9063d5b1540bbf83453"} Mar 18 12:25:19 crc kubenswrapper[4843]: I0318 12:25:19.953692 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" event={"ID":"43aa7eeb-1691-4df9-9315-ffeceefa5da6","Type":"ContainerStarted","Data":"ac21fdcb6ec49e8b132c6486d4cdb951a0cac6d2d0d0f67ad0768022fe78a500"} Mar 18 12:25:20 crc kubenswrapper[4843]: I0318 12:25:20.995741 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45174fd5-3a94-47fe-81c3-18bd634c4fcf" 
path="/var/lib/kubelet/pods/45174fd5-3a94-47fe-81c3-18bd634c4fcf/volumes" Mar 18 12:25:23 crc kubenswrapper[4843]: I0318 12:25:23.073420 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" event={"ID":"43aa7eeb-1691-4df9-9315-ffeceefa5da6","Type":"ContainerStarted","Data":"da322e2e9e76a0e0f82d4fd32adfe7638e67960f942af18318eb64e5a42264d3"} Mar 18 12:25:25 crc kubenswrapper[4843]: I0318 12:25:25.090597 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" event={"ID":"43aa7eeb-1691-4df9-9315-ffeceefa5da6","Type":"ContainerStarted","Data":"ea3ccf92a97e292af24d1d55eaa784fb1a198f2a9975890434647929d4003a6b"} Mar 18 12:25:25 crc kubenswrapper[4843]: I0318 12:25:25.091010 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:25 crc kubenswrapper[4843]: I0318 12:25:25.091028 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:25 crc kubenswrapper[4843]: I0318 12:25:25.091039 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:25 crc kubenswrapper[4843]: I0318 12:25:25.127545 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" podStartSLOduration=7.127521085 podStartE2EDuration="7.127521085s" podCreationTimestamp="2026-03-18 12:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:25:25.120458773 +0000 UTC m=+958.836284307" watchObservedRunningTime="2026-03-18 12:25:25.127521085 +0000 UTC m=+958.843346609" Mar 18 12:25:25 crc kubenswrapper[4843]: I0318 12:25:25.129261 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:25 crc kubenswrapper[4843]: I0318 12:25:25.132422 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:48 crc kubenswrapper[4843]: I0318 12:25:48.369740 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qwcw4" Mar 18 12:25:55 crc kubenswrapper[4843]: I0318 12:25:55.758212 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b"] Mar 18 12:25:55 crc kubenswrapper[4843]: I0318 12:25:55.760287 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b" Mar 18 12:25:55 crc kubenswrapper[4843]: I0318 12:25:55.763858 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 12:25:55 crc kubenswrapper[4843]: I0318 12:25:55.775725 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b"] Mar 18 12:25:55 crc kubenswrapper[4843]: I0318 12:25:55.862404 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ae2830e-03f9-4bfe-941a-a30fa8f092cf-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b\" (UID: \"2ae2830e-03f9-4bfe-941a-a30fa8f092cf\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b" Mar 18 12:25:55 crc kubenswrapper[4843]: I0318 12:25:55.862647 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ae2830e-03f9-4bfe-941a-a30fa8f092cf-util\") pod 
\"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b\" (UID: \"2ae2830e-03f9-4bfe-941a-a30fa8f092cf\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b" Mar 18 12:25:55 crc kubenswrapper[4843]: I0318 12:25:55.862784 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7d8z\" (UniqueName: \"kubernetes.io/projected/2ae2830e-03f9-4bfe-941a-a30fa8f092cf-kube-api-access-k7d8z\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b\" (UID: \"2ae2830e-03f9-4bfe-941a-a30fa8f092cf\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b" Mar 18 12:25:55 crc kubenswrapper[4843]: I0318 12:25:55.964443 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ae2830e-03f9-4bfe-941a-a30fa8f092cf-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b\" (UID: \"2ae2830e-03f9-4bfe-941a-a30fa8f092cf\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b" Mar 18 12:25:55 crc kubenswrapper[4843]: I0318 12:25:55.964500 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ae2830e-03f9-4bfe-941a-a30fa8f092cf-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b\" (UID: \"2ae2830e-03f9-4bfe-941a-a30fa8f092cf\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b" Mar 18 12:25:55 crc kubenswrapper[4843]: I0318 12:25:55.964564 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7d8z\" (UniqueName: \"kubernetes.io/projected/2ae2830e-03f9-4bfe-941a-a30fa8f092cf-kube-api-access-k7d8z\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b\" (UID: 
\"2ae2830e-03f9-4bfe-941a-a30fa8f092cf\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b" Mar 18 12:25:55 crc kubenswrapper[4843]: I0318 12:25:55.965272 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ae2830e-03f9-4bfe-941a-a30fa8f092cf-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b\" (UID: \"2ae2830e-03f9-4bfe-941a-a30fa8f092cf\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b" Mar 18 12:25:55 crc kubenswrapper[4843]: I0318 12:25:55.965564 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ae2830e-03f9-4bfe-941a-a30fa8f092cf-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b\" (UID: \"2ae2830e-03f9-4bfe-941a-a30fa8f092cf\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b" Mar 18 12:25:55 crc kubenswrapper[4843]: I0318 12:25:55.999140 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7d8z\" (UniqueName: \"kubernetes.io/projected/2ae2830e-03f9-4bfe-941a-a30fa8f092cf-kube-api-access-k7d8z\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b\" (UID: \"2ae2830e-03f9-4bfe-941a-a30fa8f092cf\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b" Mar 18 12:25:56 crc kubenswrapper[4843]: I0318 12:25:56.127670 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b" Mar 18 12:25:56 crc kubenswrapper[4843]: I0318 12:25:56.406532 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b"] Mar 18 12:25:56 crc kubenswrapper[4843]: I0318 12:25:56.593278 4843 scope.go:117] "RemoveContainer" containerID="741bc57cb18ab8eb434e5e29b607ad27c656a181a6ea914e57bbab9e2a508545" Mar 18 12:25:56 crc kubenswrapper[4843]: I0318 12:25:56.607018 4843 scope.go:117] "RemoveContainer" containerID="38600e43d9c1c3ada4c497f1927d5c9c8b30a282adbf37af82c2def6d80e3c2e" Mar 18 12:25:56 crc kubenswrapper[4843]: I0318 12:25:56.620273 4843 scope.go:117] "RemoveContainer" containerID="39bf6650b8bad5f54492fd2e7d7f7e4e2087d3fd5dec7dcae16607953e5712af" Mar 18 12:25:56 crc kubenswrapper[4843]: I0318 12:25:56.636959 4843 scope.go:117] "RemoveContainer" containerID="98499a90fb413717e34c83428127e917cf905500048e80bc7e22c90553dc5bb4" Mar 18 12:25:56 crc kubenswrapper[4843]: I0318 12:25:56.654729 4843 scope.go:117] "RemoveContainer" containerID="d2576bec646619d69f62839c98498e6753127cbeb35e07bccefdb659596242ce" Mar 18 12:25:56 crc kubenswrapper[4843]: I0318 12:25:56.671348 4843 scope.go:117] "RemoveContainer" containerID="0c0bd97331c5954408c27e0d6eaec3cebf2f9d75715b22645eb49eb05dea412f" Mar 18 12:25:56 crc kubenswrapper[4843]: I0318 12:25:56.688702 4843 scope.go:117] "RemoveContainer" containerID="5d7361d2b86ed73dc1e2817a6b57b520e58ba8580be8d3b6e572abf62cb8ec7c" Mar 18 12:25:56 crc kubenswrapper[4843]: I0318 12:25:56.702636 4843 scope.go:117] "RemoveContainer" containerID="22256eca995751bbf1c13e08403f8aa1c9cf762609e59571def71ed98cb3be94" Mar 18 12:25:56 crc kubenswrapper[4843]: I0318 12:25:56.717483 4843 scope.go:117] "RemoveContainer" containerID="ad091e456626b9122927b8ea3b40984a60b88dc17ac742c516c8799aa067c101" Mar 18 12:25:57 crc kubenswrapper[4843]: I0318 
12:25:57.309647 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b" event={"ID":"2ae2830e-03f9-4bfe-941a-a30fa8f092cf","Type":"ContainerStarted","Data":"1bcc369d46c95befcb9ec5fb626a4c14c4135e88c1de0c4c7e899a06305e9c47"} Mar 18 12:25:57 crc kubenswrapper[4843]: I0318 12:25:57.309731 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b" event={"ID":"2ae2830e-03f9-4bfe-941a-a30fa8f092cf","Type":"ContainerStarted","Data":"df8bd066224a045a48913c228fc374758075387a631fec69cedda91123510f40"} Mar 18 12:25:57 crc kubenswrapper[4843]: I0318 12:25:57.521527 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bl5fs"] Mar 18 12:25:57 crc kubenswrapper[4843]: I0318 12:25:57.522539 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bl5fs" Mar 18 12:25:57 crc kubenswrapper[4843]: I0318 12:25:57.535299 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bl5fs"] Mar 18 12:25:57 crc kubenswrapper[4843]: I0318 12:25:57.607840 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4x24\" (UniqueName: \"kubernetes.io/projected/83683877-b923-4fcb-9672-a7e64c05abd5-kube-api-access-f4x24\") pod \"redhat-operators-bl5fs\" (UID: \"83683877-b923-4fcb-9672-a7e64c05abd5\") " pod="openshift-marketplace/redhat-operators-bl5fs" Mar 18 12:25:57 crc kubenswrapper[4843]: I0318 12:25:57.607991 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83683877-b923-4fcb-9672-a7e64c05abd5-catalog-content\") pod \"redhat-operators-bl5fs\" (UID: \"83683877-b923-4fcb-9672-a7e64c05abd5\") " 
pod="openshift-marketplace/redhat-operators-bl5fs" Mar 18 12:25:57 crc kubenswrapper[4843]: I0318 12:25:57.608033 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83683877-b923-4fcb-9672-a7e64c05abd5-utilities\") pod \"redhat-operators-bl5fs\" (UID: \"83683877-b923-4fcb-9672-a7e64c05abd5\") " pod="openshift-marketplace/redhat-operators-bl5fs" Mar 18 12:25:57 crc kubenswrapper[4843]: I0318 12:25:57.709313 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83683877-b923-4fcb-9672-a7e64c05abd5-catalog-content\") pod \"redhat-operators-bl5fs\" (UID: \"83683877-b923-4fcb-9672-a7e64c05abd5\") " pod="openshift-marketplace/redhat-operators-bl5fs" Mar 18 12:25:57 crc kubenswrapper[4843]: I0318 12:25:57.709372 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83683877-b923-4fcb-9672-a7e64c05abd5-utilities\") pod \"redhat-operators-bl5fs\" (UID: \"83683877-b923-4fcb-9672-a7e64c05abd5\") " pod="openshift-marketplace/redhat-operators-bl5fs" Mar 18 12:25:57 crc kubenswrapper[4843]: I0318 12:25:57.709405 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4x24\" (UniqueName: \"kubernetes.io/projected/83683877-b923-4fcb-9672-a7e64c05abd5-kube-api-access-f4x24\") pod \"redhat-operators-bl5fs\" (UID: \"83683877-b923-4fcb-9672-a7e64c05abd5\") " pod="openshift-marketplace/redhat-operators-bl5fs" Mar 18 12:25:57 crc kubenswrapper[4843]: I0318 12:25:57.709995 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83683877-b923-4fcb-9672-a7e64c05abd5-catalog-content\") pod \"redhat-operators-bl5fs\" (UID: \"83683877-b923-4fcb-9672-a7e64c05abd5\") " 
pod="openshift-marketplace/redhat-operators-bl5fs" Mar 18 12:25:57 crc kubenswrapper[4843]: I0318 12:25:57.710055 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83683877-b923-4fcb-9672-a7e64c05abd5-utilities\") pod \"redhat-operators-bl5fs\" (UID: \"83683877-b923-4fcb-9672-a7e64c05abd5\") " pod="openshift-marketplace/redhat-operators-bl5fs" Mar 18 12:25:57 crc kubenswrapper[4843]: I0318 12:25:57.731277 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4x24\" (UniqueName: \"kubernetes.io/projected/83683877-b923-4fcb-9672-a7e64c05abd5-kube-api-access-f4x24\") pod \"redhat-operators-bl5fs\" (UID: \"83683877-b923-4fcb-9672-a7e64c05abd5\") " pod="openshift-marketplace/redhat-operators-bl5fs" Mar 18 12:25:57 crc kubenswrapper[4843]: I0318 12:25:57.841562 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bl5fs" Mar 18 12:25:58 crc kubenswrapper[4843]: I0318 12:25:58.038358 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bl5fs"] Mar 18 12:25:58 crc kubenswrapper[4843]: I0318 12:25:58.316306 4843 generic.go:334] "Generic (PLEG): container finished" podID="83683877-b923-4fcb-9672-a7e64c05abd5" containerID="2db93522ee0c6be54642c3b8594df23d74a250bfba28f3e7372a4ddb53681196" exitCode=0 Mar 18 12:25:58 crc kubenswrapper[4843]: I0318 12:25:58.316498 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bl5fs" event={"ID":"83683877-b923-4fcb-9672-a7e64c05abd5","Type":"ContainerDied","Data":"2db93522ee0c6be54642c3b8594df23d74a250bfba28f3e7372a4ddb53681196"} Mar 18 12:25:58 crc kubenswrapper[4843]: I0318 12:25:58.316622 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bl5fs" 
event={"ID":"83683877-b923-4fcb-9672-a7e64c05abd5","Type":"ContainerStarted","Data":"861d80732f8c2e1d34aa54cf9a32f2ea06b3633467ba4038aba4c9445bf3933a"} Mar 18 12:25:58 crc kubenswrapper[4843]: I0318 12:25:58.318053 4843 generic.go:334] "Generic (PLEG): container finished" podID="2ae2830e-03f9-4bfe-941a-a30fa8f092cf" containerID="1bcc369d46c95befcb9ec5fb626a4c14c4135e88c1de0c4c7e899a06305e9c47" exitCode=0 Mar 18 12:25:58 crc kubenswrapper[4843]: I0318 12:25:58.318079 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b" event={"ID":"2ae2830e-03f9-4bfe-941a-a30fa8f092cf","Type":"ContainerDied","Data":"1bcc369d46c95befcb9ec5fb626a4c14c4135e88c1de0c4c7e899a06305e9c47"} Mar 18 12:25:59 crc kubenswrapper[4843]: I0318 12:25:59.330538 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bl5fs" event={"ID":"83683877-b923-4fcb-9672-a7e64c05abd5","Type":"ContainerStarted","Data":"e309110ccd7411bc3df1c593b05867fc992c3fc2ef76fd8126a4e3aba1015084"} Mar 18 12:26:00 crc kubenswrapper[4843]: I0318 12:26:00.156534 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563946-9cgbr"] Mar 18 12:26:00 crc kubenswrapper[4843]: I0318 12:26:00.157486 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563946-9cgbr" Mar 18 12:26:00 crc kubenswrapper[4843]: I0318 12:26:00.160428 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:26:00 crc kubenswrapper[4843]: I0318 12:26:00.160537 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:26:00 crc kubenswrapper[4843]: I0318 12:26:00.160689 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 12:26:00 crc kubenswrapper[4843]: I0318 12:26:00.182175 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563946-9cgbr"] Mar 18 12:26:00 crc kubenswrapper[4843]: I0318 12:26:00.239958 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6vzm\" (UniqueName: \"kubernetes.io/projected/d61bcdf2-7d5a-4d36-ac50-896209b2d468-kube-api-access-b6vzm\") pod \"auto-csr-approver-29563946-9cgbr\" (UID: \"d61bcdf2-7d5a-4d36-ac50-896209b2d468\") " pod="openshift-infra/auto-csr-approver-29563946-9cgbr" Mar 18 12:26:00 crc kubenswrapper[4843]: I0318 12:26:00.337565 4843 generic.go:334] "Generic (PLEG): container finished" podID="2ae2830e-03f9-4bfe-941a-a30fa8f092cf" containerID="b78858b3f0c00da71c98c4091113626f10c14f8877bd904eca46af31b6c925d4" exitCode=0 Mar 18 12:26:00 crc kubenswrapper[4843]: I0318 12:26:00.337725 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b" event={"ID":"2ae2830e-03f9-4bfe-941a-a30fa8f092cf","Type":"ContainerDied","Data":"b78858b3f0c00da71c98c4091113626f10c14f8877bd904eca46af31b6c925d4"} Mar 18 12:26:00 crc kubenswrapper[4843]: I0318 12:26:00.342698 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b6vzm\" (UniqueName: \"kubernetes.io/projected/d61bcdf2-7d5a-4d36-ac50-896209b2d468-kube-api-access-b6vzm\") pod \"auto-csr-approver-29563946-9cgbr\" (UID: \"d61bcdf2-7d5a-4d36-ac50-896209b2d468\") " pod="openshift-infra/auto-csr-approver-29563946-9cgbr" Mar 18 12:26:00 crc kubenswrapper[4843]: I0318 12:26:00.379694 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6vzm\" (UniqueName: \"kubernetes.io/projected/d61bcdf2-7d5a-4d36-ac50-896209b2d468-kube-api-access-b6vzm\") pod \"auto-csr-approver-29563946-9cgbr\" (UID: \"d61bcdf2-7d5a-4d36-ac50-896209b2d468\") " pod="openshift-infra/auto-csr-approver-29563946-9cgbr" Mar 18 12:26:00 crc kubenswrapper[4843]: I0318 12:26:00.490934 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563946-9cgbr" Mar 18 12:26:00 crc kubenswrapper[4843]: I0318 12:26:00.721407 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563946-9cgbr"] Mar 18 12:26:00 crc kubenswrapper[4843]: W0318 12:26:00.750280 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd61bcdf2_7d5a_4d36_ac50_896209b2d468.slice/crio-91255ef4c01c8680931e8ac2509a954ecce292cc6b0ab2dd4f366cd304886031 WatchSource:0}: Error finding container 91255ef4c01c8680931e8ac2509a954ecce292cc6b0ab2dd4f366cd304886031: Status 404 returned error can't find the container with id 91255ef4c01c8680931e8ac2509a954ecce292cc6b0ab2dd4f366cd304886031 Mar 18 12:26:01 crc kubenswrapper[4843]: I0318 12:26:01.344820 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563946-9cgbr" event={"ID":"d61bcdf2-7d5a-4d36-ac50-896209b2d468","Type":"ContainerStarted","Data":"91255ef4c01c8680931e8ac2509a954ecce292cc6b0ab2dd4f366cd304886031"} Mar 18 12:26:01 crc kubenswrapper[4843]: I0318 12:26:01.347563 4843 generic.go:334] 
"Generic (PLEG): container finished" podID="2ae2830e-03f9-4bfe-941a-a30fa8f092cf" containerID="d64a7b855664a16aae9af65ffed0c8c73b6379d33fcc1b4bd90e0a5eca9d6d8a" exitCode=0 Mar 18 12:26:01 crc kubenswrapper[4843]: I0318 12:26:01.347708 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b" event={"ID":"2ae2830e-03f9-4bfe-941a-a30fa8f092cf","Type":"ContainerDied","Data":"d64a7b855664a16aae9af65ffed0c8c73b6379d33fcc1b4bd90e0a5eca9d6d8a"} Mar 18 12:26:01 crc kubenswrapper[4843]: I0318 12:26:01.351782 4843 generic.go:334] "Generic (PLEG): container finished" podID="83683877-b923-4fcb-9672-a7e64c05abd5" containerID="e309110ccd7411bc3df1c593b05867fc992c3fc2ef76fd8126a4e3aba1015084" exitCode=0 Mar 18 12:26:01 crc kubenswrapper[4843]: I0318 12:26:01.351838 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bl5fs" event={"ID":"83683877-b923-4fcb-9672-a7e64c05abd5","Type":"ContainerDied","Data":"e309110ccd7411bc3df1c593b05867fc992c3fc2ef76fd8126a4e3aba1015084"} Mar 18 12:26:02 crc kubenswrapper[4843]: I0318 12:26:02.360789 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563946-9cgbr" event={"ID":"d61bcdf2-7d5a-4d36-ac50-896209b2d468","Type":"ContainerStarted","Data":"5fe5d24533ad2ffb865be576a8383e9595bcec8f7b4c831fc413219d301d176d"} Mar 18 12:26:02 crc kubenswrapper[4843]: I0318 12:26:02.363063 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bl5fs" event={"ID":"83683877-b923-4fcb-9672-a7e64c05abd5","Type":"ContainerStarted","Data":"330725b7c9878288ca565c7a290e6196a39d4535b3d39e02d046c7c776266024"} Mar 18 12:26:02 crc kubenswrapper[4843]: I0318 12:26:02.379488 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563946-9cgbr" podStartSLOduration=1.25205942 
podStartE2EDuration="2.379470503s" podCreationTimestamp="2026-03-18 12:26:00 +0000 UTC" firstStartedPulling="2026-03-18 12:26:00.75347381 +0000 UTC m=+994.469299334" lastFinishedPulling="2026-03-18 12:26:01.880884893 +0000 UTC m=+995.596710417" observedRunningTime="2026-03-18 12:26:02.379020901 +0000 UTC m=+996.094846425" watchObservedRunningTime="2026-03-18 12:26:02.379470503 +0000 UTC m=+996.095296027" Mar 18 12:26:02 crc kubenswrapper[4843]: I0318 12:26:02.404060 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bl5fs" podStartSLOduration=1.8998510020000001 podStartE2EDuration="5.404040192s" podCreationTimestamp="2026-03-18 12:25:57 +0000 UTC" firstStartedPulling="2026-03-18 12:25:58.318749845 +0000 UTC m=+992.034575369" lastFinishedPulling="2026-03-18 12:26:01.822939035 +0000 UTC m=+995.538764559" observedRunningTime="2026-03-18 12:26:02.399940046 +0000 UTC m=+996.115765590" watchObservedRunningTime="2026-03-18 12:26:02.404040192 +0000 UTC m=+996.119865716" Mar 18 12:26:02 crc kubenswrapper[4843]: I0318 12:26:02.615475 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b" Mar 18 12:26:02 crc kubenswrapper[4843]: I0318 12:26:02.672682 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ae2830e-03f9-4bfe-941a-a30fa8f092cf-bundle\") pod \"2ae2830e-03f9-4bfe-941a-a30fa8f092cf\" (UID: \"2ae2830e-03f9-4bfe-941a-a30fa8f092cf\") " Mar 18 12:26:02 crc kubenswrapper[4843]: I0318 12:26:02.672836 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7d8z\" (UniqueName: \"kubernetes.io/projected/2ae2830e-03f9-4bfe-941a-a30fa8f092cf-kube-api-access-k7d8z\") pod \"2ae2830e-03f9-4bfe-941a-a30fa8f092cf\" (UID: \"2ae2830e-03f9-4bfe-941a-a30fa8f092cf\") " Mar 18 12:26:02 crc kubenswrapper[4843]: I0318 12:26:02.672940 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ae2830e-03f9-4bfe-941a-a30fa8f092cf-util\") pod \"2ae2830e-03f9-4bfe-941a-a30fa8f092cf\" (UID: \"2ae2830e-03f9-4bfe-941a-a30fa8f092cf\") " Mar 18 12:26:02 crc kubenswrapper[4843]: I0318 12:26:02.673331 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ae2830e-03f9-4bfe-941a-a30fa8f092cf-bundle" (OuterVolumeSpecName: "bundle") pod "2ae2830e-03f9-4bfe-941a-a30fa8f092cf" (UID: "2ae2830e-03f9-4bfe-941a-a30fa8f092cf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:26:02 crc kubenswrapper[4843]: I0318 12:26:02.680963 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ae2830e-03f9-4bfe-941a-a30fa8f092cf-kube-api-access-k7d8z" (OuterVolumeSpecName: "kube-api-access-k7d8z") pod "2ae2830e-03f9-4bfe-941a-a30fa8f092cf" (UID: "2ae2830e-03f9-4bfe-941a-a30fa8f092cf"). InnerVolumeSpecName "kube-api-access-k7d8z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:26:02 crc kubenswrapper[4843]: I0318 12:26:02.683273 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ae2830e-03f9-4bfe-941a-a30fa8f092cf-util" (OuterVolumeSpecName: "util") pod "2ae2830e-03f9-4bfe-941a-a30fa8f092cf" (UID: "2ae2830e-03f9-4bfe-941a-a30fa8f092cf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:26:02 crc kubenswrapper[4843]: I0318 12:26:02.774874 4843 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ae2830e-03f9-4bfe-941a-a30fa8f092cf-util\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:02 crc kubenswrapper[4843]: I0318 12:26:02.774916 4843 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ae2830e-03f9-4bfe-941a-a30fa8f092cf-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:02 crc kubenswrapper[4843]: I0318 12:26:02.774931 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7d8z\" (UniqueName: \"kubernetes.io/projected/2ae2830e-03f9-4bfe-941a-a30fa8f092cf-kube-api-access-k7d8z\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:03 crc kubenswrapper[4843]: I0318 12:26:03.370328 4843 generic.go:334] "Generic (PLEG): container finished" podID="d61bcdf2-7d5a-4d36-ac50-896209b2d468" containerID="5fe5d24533ad2ffb865be576a8383e9595bcec8f7b4c831fc413219d301d176d" exitCode=0 Mar 18 12:26:03 crc kubenswrapper[4843]: I0318 12:26:03.370405 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563946-9cgbr" event={"ID":"d61bcdf2-7d5a-4d36-ac50-896209b2d468","Type":"ContainerDied","Data":"5fe5d24533ad2ffb865be576a8383e9595bcec8f7b4c831fc413219d301d176d"} Mar 18 12:26:03 crc kubenswrapper[4843]: I0318 12:26:03.374402 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b" event={"ID":"2ae2830e-03f9-4bfe-941a-a30fa8f092cf","Type":"ContainerDied","Data":"df8bd066224a045a48913c228fc374758075387a631fec69cedda91123510f40"} Mar 18 12:26:03 crc kubenswrapper[4843]: I0318 12:26:03.374440 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df8bd066224a045a48913c228fc374758075387a631fec69cedda91123510f40" Mar 18 12:26:03 crc kubenswrapper[4843]: I0318 12:26:03.374523 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b" Mar 18 12:26:04 crc kubenswrapper[4843]: I0318 12:26:04.592468 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563946-9cgbr" Mar 18 12:26:04 crc kubenswrapper[4843]: I0318 12:26:04.741241 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6vzm\" (UniqueName: \"kubernetes.io/projected/d61bcdf2-7d5a-4d36-ac50-896209b2d468-kube-api-access-b6vzm\") pod \"d61bcdf2-7d5a-4d36-ac50-896209b2d468\" (UID: \"d61bcdf2-7d5a-4d36-ac50-896209b2d468\") " Mar 18 12:26:04 crc kubenswrapper[4843]: I0318 12:26:04.747171 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d61bcdf2-7d5a-4d36-ac50-896209b2d468-kube-api-access-b6vzm" (OuterVolumeSpecName: "kube-api-access-b6vzm") pod "d61bcdf2-7d5a-4d36-ac50-896209b2d468" (UID: "d61bcdf2-7d5a-4d36-ac50-896209b2d468"). InnerVolumeSpecName "kube-api-access-b6vzm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:26:04 crc kubenswrapper[4843]: I0318 12:26:04.843428 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6vzm\" (UniqueName: \"kubernetes.io/projected/d61bcdf2-7d5a-4d36-ac50-896209b2d468-kube-api-access-b6vzm\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:05 crc kubenswrapper[4843]: I0318 12:26:05.387037 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563946-9cgbr" event={"ID":"d61bcdf2-7d5a-4d36-ac50-896209b2d468","Type":"ContainerDied","Data":"91255ef4c01c8680931e8ac2509a954ecce292cc6b0ab2dd4f366cd304886031"} Mar 18 12:26:05 crc kubenswrapper[4843]: I0318 12:26:05.387350 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91255ef4c01c8680931e8ac2509a954ecce292cc6b0ab2dd4f366cd304886031" Mar 18 12:26:05 crc kubenswrapper[4843]: I0318 12:26:05.387121 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563946-9cgbr" Mar 18 12:26:05 crc kubenswrapper[4843]: I0318 12:26:05.439369 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563940-q5psz"] Mar 18 12:26:05 crc kubenswrapper[4843]: I0318 12:26:05.443280 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563940-q5psz"] Mar 18 12:26:06 crc kubenswrapper[4843]: I0318 12:26:06.473835 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-q8kxc"] Mar 18 12:26:06 crc kubenswrapper[4843]: E0318 12:26:06.474086 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61bcdf2-7d5a-4d36-ac50-896209b2d468" containerName="oc" Mar 18 12:26:06 crc kubenswrapper[4843]: I0318 12:26:06.474098 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61bcdf2-7d5a-4d36-ac50-896209b2d468" containerName="oc" Mar 18 12:26:06 crc 
kubenswrapper[4843]: E0318 12:26:06.474108 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae2830e-03f9-4bfe-941a-a30fa8f092cf" containerName="util" Mar 18 12:26:06 crc kubenswrapper[4843]: I0318 12:26:06.474114 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae2830e-03f9-4bfe-941a-a30fa8f092cf" containerName="util" Mar 18 12:26:06 crc kubenswrapper[4843]: E0318 12:26:06.474129 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae2830e-03f9-4bfe-941a-a30fa8f092cf" containerName="extract" Mar 18 12:26:06 crc kubenswrapper[4843]: I0318 12:26:06.474134 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae2830e-03f9-4bfe-941a-a30fa8f092cf" containerName="extract" Mar 18 12:26:06 crc kubenswrapper[4843]: E0318 12:26:06.474144 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae2830e-03f9-4bfe-941a-a30fa8f092cf" containerName="pull" Mar 18 12:26:06 crc kubenswrapper[4843]: I0318 12:26:06.474149 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae2830e-03f9-4bfe-941a-a30fa8f092cf" containerName="pull" Mar 18 12:26:06 crc kubenswrapper[4843]: I0318 12:26:06.474258 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d61bcdf2-7d5a-4d36-ac50-896209b2d468" containerName="oc" Mar 18 12:26:06 crc kubenswrapper[4843]: I0318 12:26:06.474271 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae2830e-03f9-4bfe-941a-a30fa8f092cf" containerName="extract" Mar 18 12:26:06 crc kubenswrapper[4843]: I0318 12:26:06.474664 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-q8kxc" Mar 18 12:26:06 crc kubenswrapper[4843]: I0318 12:26:06.476895 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-jzxrg" Mar 18 12:26:06 crc kubenswrapper[4843]: I0318 12:26:06.478305 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 18 12:26:06 crc kubenswrapper[4843]: I0318 12:26:06.479637 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 18 12:26:06 crc kubenswrapper[4843]: I0318 12:26:06.487899 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-q8kxc"] Mar 18 12:26:06 crc kubenswrapper[4843]: I0318 12:26:06.564891 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bzph\" (UniqueName: \"kubernetes.io/projected/943fabaf-11af-46ca-8dd2-3bfd7081bdec-kube-api-access-4bzph\") pod \"nmstate-operator-796d4cfff4-q8kxc\" (UID: \"943fabaf-11af-46ca-8dd2-3bfd7081bdec\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-q8kxc" Mar 18 12:26:06 crc kubenswrapper[4843]: I0318 12:26:06.666092 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bzph\" (UniqueName: \"kubernetes.io/projected/943fabaf-11af-46ca-8dd2-3bfd7081bdec-kube-api-access-4bzph\") pod \"nmstate-operator-796d4cfff4-q8kxc\" (UID: \"943fabaf-11af-46ca-8dd2-3bfd7081bdec\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-q8kxc" Mar 18 12:26:06 crc kubenswrapper[4843]: I0318 12:26:06.685830 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bzph\" (UniqueName: \"kubernetes.io/projected/943fabaf-11af-46ca-8dd2-3bfd7081bdec-kube-api-access-4bzph\") pod \"nmstate-operator-796d4cfff4-q8kxc\" (UID: 
\"943fabaf-11af-46ca-8dd2-3bfd7081bdec\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-q8kxc" Mar 18 12:26:06 crc kubenswrapper[4843]: I0318 12:26:06.789982 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-q8kxc" Mar 18 12:26:06 crc kubenswrapper[4843]: I0318 12:26:06.991043 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27f627d5-05cf-46a6-a1f9-65ecf9b3c206" path="/var/lib/kubelet/pods/27f627d5-05cf-46a6-a1f9-65ecf9b3c206/volumes" Mar 18 12:26:07 crc kubenswrapper[4843]: I0318 12:26:07.042468 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-q8kxc"] Mar 18 12:26:07 crc kubenswrapper[4843]: W0318 12:26:07.064518 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod943fabaf_11af_46ca_8dd2_3bfd7081bdec.slice/crio-6f5bf54403e6d62f083f1343a4791668c04226760230289e99f888e6982f218b WatchSource:0}: Error finding container 6f5bf54403e6d62f083f1343a4791668c04226760230289e99f888e6982f218b: Status 404 returned error can't find the container with id 6f5bf54403e6d62f083f1343a4791668c04226760230289e99f888e6982f218b Mar 18 12:26:07 crc kubenswrapper[4843]: I0318 12:26:07.403581 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-q8kxc" event={"ID":"943fabaf-11af-46ca-8dd2-3bfd7081bdec","Type":"ContainerStarted","Data":"6f5bf54403e6d62f083f1343a4791668c04226760230289e99f888e6982f218b"} Mar 18 12:26:07 crc kubenswrapper[4843]: I0318 12:26:07.877045 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bl5fs" Mar 18 12:26:07 crc kubenswrapper[4843]: I0318 12:26:07.877387 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bl5fs" Mar 18 12:26:08 crc 
kubenswrapper[4843]: I0318 12:26:08.924966 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bl5fs" podUID="83683877-b923-4fcb-9672-a7e64c05abd5" containerName="registry-server" probeResult="failure" output=< Mar 18 12:26:08 crc kubenswrapper[4843]: timeout: failed to connect service ":50051" within 1s Mar 18 12:26:08 crc kubenswrapper[4843]: > Mar 18 12:26:10 crc kubenswrapper[4843]: I0318 12:26:10.436158 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-q8kxc" event={"ID":"943fabaf-11af-46ca-8dd2-3bfd7081bdec","Type":"ContainerStarted","Data":"db15aed3913061bbbd5fd894771c2ca521539f8c8a25024daf65f71373754518"} Mar 18 12:26:10 crc kubenswrapper[4843]: I0318 12:26:10.460141 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-q8kxc" podStartSLOduration=2.146697915 podStartE2EDuration="4.460121621s" podCreationTimestamp="2026-03-18 12:26:06 +0000 UTC" firstStartedPulling="2026-03-18 12:26:07.068422599 +0000 UTC m=+1000.784248123" lastFinishedPulling="2026-03-18 12:26:09.381846305 +0000 UTC m=+1003.097671829" observedRunningTime="2026-03-18 12:26:10.456486087 +0000 UTC m=+1004.172311621" watchObservedRunningTime="2026-03-18 12:26:10.460121621 +0000 UTC m=+1004.175947155" Mar 18 12:26:17 crc kubenswrapper[4843]: I0318 12:26:17.963224 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bl5fs" Mar 18 12:26:18 crc kubenswrapper[4843]: I0318 12:26:18.002814 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9sllb"] Mar 18 12:26:18 crc kubenswrapper[4843]: I0318 12:26:18.006097 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9sllb" Mar 18 12:26:18 crc kubenswrapper[4843]: I0318 12:26:18.013106 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9sllb"] Mar 18 12:26:18 crc kubenswrapper[4843]: I0318 12:26:18.035997 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bl5fs" Mar 18 12:26:18 crc kubenswrapper[4843]: I0318 12:26:18.198784 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr86m\" (UniqueName: \"kubernetes.io/projected/dbe098fc-dcb5-4c9a-a6e3-dc58830990a8-kube-api-access-dr86m\") pod \"community-operators-9sllb\" (UID: \"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8\") " pod="openshift-marketplace/community-operators-9sllb" Mar 18 12:26:18 crc kubenswrapper[4843]: I0318 12:26:18.198898 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbe098fc-dcb5-4c9a-a6e3-dc58830990a8-utilities\") pod \"community-operators-9sllb\" (UID: \"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8\") " pod="openshift-marketplace/community-operators-9sllb" Mar 18 12:26:18 crc kubenswrapper[4843]: I0318 12:26:18.198928 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbe098fc-dcb5-4c9a-a6e3-dc58830990a8-catalog-content\") pod \"community-operators-9sllb\" (UID: \"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8\") " pod="openshift-marketplace/community-operators-9sllb" Mar 18 12:26:18 crc kubenswrapper[4843]: I0318 12:26:18.299768 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr86m\" (UniqueName: \"kubernetes.io/projected/dbe098fc-dcb5-4c9a-a6e3-dc58830990a8-kube-api-access-dr86m\") pod \"community-operators-9sllb\" (UID: 
\"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8\") " pod="openshift-marketplace/community-operators-9sllb" Mar 18 12:26:18 crc kubenswrapper[4843]: I0318 12:26:18.299856 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbe098fc-dcb5-4c9a-a6e3-dc58830990a8-utilities\") pod \"community-operators-9sllb\" (UID: \"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8\") " pod="openshift-marketplace/community-operators-9sllb" Mar 18 12:26:18 crc kubenswrapper[4843]: I0318 12:26:18.299877 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbe098fc-dcb5-4c9a-a6e3-dc58830990a8-catalog-content\") pod \"community-operators-9sllb\" (UID: \"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8\") " pod="openshift-marketplace/community-operators-9sllb" Mar 18 12:26:18 crc kubenswrapper[4843]: I0318 12:26:18.300374 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbe098fc-dcb5-4c9a-a6e3-dc58830990a8-catalog-content\") pod \"community-operators-9sllb\" (UID: \"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8\") " pod="openshift-marketplace/community-operators-9sllb" Mar 18 12:26:18 crc kubenswrapper[4843]: I0318 12:26:18.300465 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbe098fc-dcb5-4c9a-a6e3-dc58830990a8-utilities\") pod \"community-operators-9sllb\" (UID: \"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8\") " pod="openshift-marketplace/community-operators-9sllb" Mar 18 12:26:18 crc kubenswrapper[4843]: I0318 12:26:18.320857 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr86m\" (UniqueName: \"kubernetes.io/projected/dbe098fc-dcb5-4c9a-a6e3-dc58830990a8-kube-api-access-dr86m\") pod \"community-operators-9sllb\" (UID: \"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8\") " 
pod="openshift-marketplace/community-operators-9sllb" Mar 18 12:26:18 crc kubenswrapper[4843]: I0318 12:26:18.326624 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9sllb" Mar 18 12:26:18 crc kubenswrapper[4843]: I0318 12:26:18.648837 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9sllb"] Mar 18 12:26:19 crc kubenswrapper[4843]: I0318 12:26:19.374324 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bl5fs"] Mar 18 12:26:19 crc kubenswrapper[4843]: I0318 12:26:19.502001 4843 generic.go:334] "Generic (PLEG): container finished" podID="dbe098fc-dcb5-4c9a-a6e3-dc58830990a8" containerID="31130ac8e6b03530011d8851948e5d31bb5282a089af0a059106df784fc2b76e" exitCode=0 Mar 18 12:26:19 crc kubenswrapper[4843]: I0318 12:26:19.502585 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bl5fs" podUID="83683877-b923-4fcb-9672-a7e64c05abd5" containerName="registry-server" containerID="cri-o://330725b7c9878288ca565c7a290e6196a39d4535b3d39e02d046c7c776266024" gracePeriod=2 Mar 18 12:26:19 crc kubenswrapper[4843]: I0318 12:26:19.504043 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9sllb" event={"ID":"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8","Type":"ContainerDied","Data":"31130ac8e6b03530011d8851948e5d31bb5282a089af0a059106df784fc2b76e"} Mar 18 12:26:19 crc kubenswrapper[4843]: I0318 12:26:19.504090 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9sllb" event={"ID":"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8","Type":"ContainerStarted","Data":"b68a323a1b3f1c4c8e614548664c350a575d7160376b7c4b8dd3b81b67fea4af"} Mar 18 12:26:19 crc kubenswrapper[4843]: I0318 12:26:19.909637 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bl5fs" Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.022464 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4x24\" (UniqueName: \"kubernetes.io/projected/83683877-b923-4fcb-9672-a7e64c05abd5-kube-api-access-f4x24\") pod \"83683877-b923-4fcb-9672-a7e64c05abd5\" (UID: \"83683877-b923-4fcb-9672-a7e64c05abd5\") " Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.022597 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83683877-b923-4fcb-9672-a7e64c05abd5-catalog-content\") pod \"83683877-b923-4fcb-9672-a7e64c05abd5\" (UID: \"83683877-b923-4fcb-9672-a7e64c05abd5\") " Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.022768 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83683877-b923-4fcb-9672-a7e64c05abd5-utilities\") pod \"83683877-b923-4fcb-9672-a7e64c05abd5\" (UID: \"83683877-b923-4fcb-9672-a7e64c05abd5\") " Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.023935 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83683877-b923-4fcb-9672-a7e64c05abd5-utilities" (OuterVolumeSpecName: "utilities") pod "83683877-b923-4fcb-9672-a7e64c05abd5" (UID: "83683877-b923-4fcb-9672-a7e64c05abd5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.030366 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83683877-b923-4fcb-9672-a7e64c05abd5-kube-api-access-f4x24" (OuterVolumeSpecName: "kube-api-access-f4x24") pod "83683877-b923-4fcb-9672-a7e64c05abd5" (UID: "83683877-b923-4fcb-9672-a7e64c05abd5"). InnerVolumeSpecName "kube-api-access-f4x24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.035026 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.035099 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.125572 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83683877-b923-4fcb-9672-a7e64c05abd5-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.125627 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4x24\" (UniqueName: \"kubernetes.io/projected/83683877-b923-4fcb-9672-a7e64c05abd5-kube-api-access-f4x24\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.209640 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83683877-b923-4fcb-9672-a7e64c05abd5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83683877-b923-4fcb-9672-a7e64c05abd5" (UID: "83683877-b923-4fcb-9672-a7e64c05abd5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.227171 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83683877-b923-4fcb-9672-a7e64c05abd5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.508887 4843 generic.go:334] "Generic (PLEG): container finished" podID="83683877-b923-4fcb-9672-a7e64c05abd5" containerID="330725b7c9878288ca565c7a290e6196a39d4535b3d39e02d046c7c776266024" exitCode=0 Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.508919 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bl5fs" event={"ID":"83683877-b923-4fcb-9672-a7e64c05abd5","Type":"ContainerDied","Data":"330725b7c9878288ca565c7a290e6196a39d4535b3d39e02d046c7c776266024"} Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.508952 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bl5fs" event={"ID":"83683877-b923-4fcb-9672-a7e64c05abd5","Type":"ContainerDied","Data":"861d80732f8c2e1d34aa54cf9a32f2ea06b3633467ba4038aba4c9445bf3933a"} Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.508958 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bl5fs" Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.508969 4843 scope.go:117] "RemoveContainer" containerID="330725b7c9878288ca565c7a290e6196a39d4535b3d39e02d046c7c776266024" Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.510910 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9sllb" event={"ID":"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8","Type":"ContainerStarted","Data":"78d880078bdd102ed131150d950f4ccde6b8edd37f1e45497029764ad94368b1"} Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.533917 4843 scope.go:117] "RemoveContainer" containerID="e309110ccd7411bc3df1c593b05867fc992c3fc2ef76fd8126a4e3aba1015084" Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.563907 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bl5fs"] Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.566548 4843 scope.go:117] "RemoveContainer" containerID="2db93522ee0c6be54642c3b8594df23d74a250bfba28f3e7372a4ddb53681196" Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.569753 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bl5fs"] Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.585037 4843 scope.go:117] "RemoveContainer" containerID="330725b7c9878288ca565c7a290e6196a39d4535b3d39e02d046c7c776266024" Mar 18 12:26:20 crc kubenswrapper[4843]: E0318 12:26:20.585629 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"330725b7c9878288ca565c7a290e6196a39d4535b3d39e02d046c7c776266024\": container with ID starting with 330725b7c9878288ca565c7a290e6196a39d4535b3d39e02d046c7c776266024 not found: ID does not exist" containerID="330725b7c9878288ca565c7a290e6196a39d4535b3d39e02d046c7c776266024" Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.585709 4843 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"330725b7c9878288ca565c7a290e6196a39d4535b3d39e02d046c7c776266024"} err="failed to get container status \"330725b7c9878288ca565c7a290e6196a39d4535b3d39e02d046c7c776266024\": rpc error: code = NotFound desc = could not find container \"330725b7c9878288ca565c7a290e6196a39d4535b3d39e02d046c7c776266024\": container with ID starting with 330725b7c9878288ca565c7a290e6196a39d4535b3d39e02d046c7c776266024 not found: ID does not exist" Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.585747 4843 scope.go:117] "RemoveContainer" containerID="e309110ccd7411bc3df1c593b05867fc992c3fc2ef76fd8126a4e3aba1015084" Mar 18 12:26:20 crc kubenswrapper[4843]: E0318 12:26:20.586302 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e309110ccd7411bc3df1c593b05867fc992c3fc2ef76fd8126a4e3aba1015084\": container with ID starting with e309110ccd7411bc3df1c593b05867fc992c3fc2ef76fd8126a4e3aba1015084 not found: ID does not exist" containerID="e309110ccd7411bc3df1c593b05867fc992c3fc2ef76fd8126a4e3aba1015084" Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.586361 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e309110ccd7411bc3df1c593b05867fc992c3fc2ef76fd8126a4e3aba1015084"} err="failed to get container status \"e309110ccd7411bc3df1c593b05867fc992c3fc2ef76fd8126a4e3aba1015084\": rpc error: code = NotFound desc = could not find container \"e309110ccd7411bc3df1c593b05867fc992c3fc2ef76fd8126a4e3aba1015084\": container with ID starting with e309110ccd7411bc3df1c593b05867fc992c3fc2ef76fd8126a4e3aba1015084 not found: ID does not exist" Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.586393 4843 scope.go:117] "RemoveContainer" containerID="2db93522ee0c6be54642c3b8594df23d74a250bfba28f3e7372a4ddb53681196" Mar 18 12:26:20 crc kubenswrapper[4843]: E0318 
12:26:20.586864 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2db93522ee0c6be54642c3b8594df23d74a250bfba28f3e7372a4ddb53681196\": container with ID starting with 2db93522ee0c6be54642c3b8594df23d74a250bfba28f3e7372a4ddb53681196 not found: ID does not exist" containerID="2db93522ee0c6be54642c3b8594df23d74a250bfba28f3e7372a4ddb53681196" Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.586909 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2db93522ee0c6be54642c3b8594df23d74a250bfba28f3e7372a4ddb53681196"} err="failed to get container status \"2db93522ee0c6be54642c3b8594df23d74a250bfba28f3e7372a4ddb53681196\": rpc error: code = NotFound desc = could not find container \"2db93522ee0c6be54642c3b8594df23d74a250bfba28f3e7372a4ddb53681196\": container with ID starting with 2db93522ee0c6be54642c3b8594df23d74a250bfba28f3e7372a4ddb53681196 not found: ID does not exist" Mar 18 12:26:20 crc kubenswrapper[4843]: I0318 12:26:20.997379 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83683877-b923-4fcb-9672-a7e64c05abd5" path="/var/lib/kubelet/pods/83683877-b923-4fcb-9672-a7e64c05abd5/volumes" Mar 18 12:26:21 crc kubenswrapper[4843]: I0318 12:26:21.521035 4843 generic.go:334] "Generic (PLEG): container finished" podID="dbe098fc-dcb5-4c9a-a6e3-dc58830990a8" containerID="78d880078bdd102ed131150d950f4ccde6b8edd37f1e45497029764ad94368b1" exitCode=0 Mar 18 12:26:21 crc kubenswrapper[4843]: I0318 12:26:21.521154 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9sllb" event={"ID":"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8","Type":"ContainerDied","Data":"78d880078bdd102ed131150d950f4ccde6b8edd37f1e45497029764ad94368b1"} Mar 18 12:26:23 crc kubenswrapper[4843]: I0318 12:26:23.735481 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-9sllb" event={"ID":"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8","Type":"ContainerStarted","Data":"f17f0df7c1ab6883900d3697030bb46eaf39ca3740d51a6cccd345c319885821"} Mar 18 12:26:23 crc kubenswrapper[4843]: I0318 12:26:23.764552 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9sllb" podStartSLOduration=3.545213506 podStartE2EDuration="6.764533826s" podCreationTimestamp="2026-03-18 12:26:17 +0000 UTC" firstStartedPulling="2026-03-18 12:26:19.5077392 +0000 UTC m=+1013.223564764" lastFinishedPulling="2026-03-18 12:26:22.72705955 +0000 UTC m=+1016.442885084" observedRunningTime="2026-03-18 12:26:23.75658637 +0000 UTC m=+1017.472411904" watchObservedRunningTime="2026-03-18 12:26:23.764533826 +0000 UTC m=+1017.480359360" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.740936 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-slwpg"] Mar 18 12:26:26 crc kubenswrapper[4843]: E0318 12:26:26.741682 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83683877-b923-4fcb-9672-a7e64c05abd5" containerName="extract-content" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.741698 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="83683877-b923-4fcb-9672-a7e64c05abd5" containerName="extract-content" Mar 18 12:26:26 crc kubenswrapper[4843]: E0318 12:26:26.741720 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83683877-b923-4fcb-9672-a7e64c05abd5" containerName="extract-utilities" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.741729 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="83683877-b923-4fcb-9672-a7e64c05abd5" containerName="extract-utilities" Mar 18 12:26:26 crc kubenswrapper[4843]: E0318 12:26:26.741750 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83683877-b923-4fcb-9672-a7e64c05abd5" 
containerName="registry-server" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.741757 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="83683877-b923-4fcb-9672-a7e64c05abd5" containerName="registry-server" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.741890 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="83683877-b923-4fcb-9672-a7e64c05abd5" containerName="registry-server" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.742678 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-slwpg" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.746182 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-v5jjc" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.749913 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-2lmd4"] Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.751075 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2lmd4" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.753397 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.763796 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-slwpg"] Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.780491 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-2lmd4"] Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.827981 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-llzl5"] Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.833106 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-llzl5" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.855585 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8xl2\" (UniqueName: \"kubernetes.io/projected/c219dce4-3b4d-4b0f-b8bc-52313c223e06-kube-api-access-d8xl2\") pod \"nmstate-metrics-9b8c8685d-slwpg\" (UID: \"c219dce4-3b4d-4b0f-b8bc-52313c223e06\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-slwpg" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.855633 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5a195771-131e-4a55-a719-ccde43845b3c-nmstate-lock\") pod \"nmstate-handler-llzl5\" (UID: \"5a195771-131e-4a55-a719-ccde43845b3c\") " pod="openshift-nmstate/nmstate-handler-llzl5" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.855682 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/19d2d46b-445f-463b-a1b7-e33d82880a8f-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-2lmd4\" (UID: \"19d2d46b-445f-463b-a1b7-e33d82880a8f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2lmd4" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.855707 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5a195771-131e-4a55-a719-ccde43845b3c-ovs-socket\") pod \"nmstate-handler-llzl5\" (UID: \"5a195771-131e-4a55-a719-ccde43845b3c\") " pod="openshift-nmstate/nmstate-handler-llzl5" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.855732 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chl2f\" (UniqueName: 
\"kubernetes.io/projected/5a195771-131e-4a55-a719-ccde43845b3c-kube-api-access-chl2f\") pod \"nmstate-handler-llzl5\" (UID: \"5a195771-131e-4a55-a719-ccde43845b3c\") " pod="openshift-nmstate/nmstate-handler-llzl5" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.855749 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl55b\" (UniqueName: \"kubernetes.io/projected/19d2d46b-445f-463b-a1b7-e33d82880a8f-kube-api-access-rl55b\") pod \"nmstate-webhook-5f558f5558-2lmd4\" (UID: \"19d2d46b-445f-463b-a1b7-e33d82880a8f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2lmd4" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.855770 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5a195771-131e-4a55-a719-ccde43845b3c-dbus-socket\") pod \"nmstate-handler-llzl5\" (UID: \"5a195771-131e-4a55-a719-ccde43845b3c\") " pod="openshift-nmstate/nmstate-handler-llzl5" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.956591 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5a195771-131e-4a55-a719-ccde43845b3c-ovs-socket\") pod \"nmstate-handler-llzl5\" (UID: \"5a195771-131e-4a55-a719-ccde43845b3c\") " pod="openshift-nmstate/nmstate-handler-llzl5" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.956665 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chl2f\" (UniqueName: \"kubernetes.io/projected/5a195771-131e-4a55-a719-ccde43845b3c-kube-api-access-chl2f\") pod \"nmstate-handler-llzl5\" (UID: \"5a195771-131e-4a55-a719-ccde43845b3c\") " pod="openshift-nmstate/nmstate-handler-llzl5" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.956698 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl55b\" 
(UniqueName: \"kubernetes.io/projected/19d2d46b-445f-463b-a1b7-e33d82880a8f-kube-api-access-rl55b\") pod \"nmstate-webhook-5f558f5558-2lmd4\" (UID: \"19d2d46b-445f-463b-a1b7-e33d82880a8f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2lmd4" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.956734 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5a195771-131e-4a55-a719-ccde43845b3c-dbus-socket\") pod \"nmstate-handler-llzl5\" (UID: \"5a195771-131e-4a55-a719-ccde43845b3c\") " pod="openshift-nmstate/nmstate-handler-llzl5" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.956925 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8xl2\" (UniqueName: \"kubernetes.io/projected/c219dce4-3b4d-4b0f-b8bc-52313c223e06-kube-api-access-d8xl2\") pod \"nmstate-metrics-9b8c8685d-slwpg\" (UID: \"c219dce4-3b4d-4b0f-b8bc-52313c223e06\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-slwpg" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.956953 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5a195771-131e-4a55-a719-ccde43845b3c-nmstate-lock\") pod \"nmstate-handler-llzl5\" (UID: \"5a195771-131e-4a55-a719-ccde43845b3c\") " pod="openshift-nmstate/nmstate-handler-llzl5" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.957005 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/19d2d46b-445f-463b-a1b7-e33d82880a8f-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-2lmd4\" (UID: \"19d2d46b-445f-463b-a1b7-e33d82880a8f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2lmd4" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.957189 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/5a195771-131e-4a55-a719-ccde43845b3c-ovs-socket\") pod \"nmstate-handler-llzl5\" (UID: \"5a195771-131e-4a55-a719-ccde43845b3c\") " pod="openshift-nmstate/nmstate-handler-llzl5" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.957968 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5a195771-131e-4a55-a719-ccde43845b3c-nmstate-lock\") pod \"nmstate-handler-llzl5\" (UID: \"5a195771-131e-4a55-a719-ccde43845b3c\") " pod="openshift-nmstate/nmstate-handler-llzl5" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.958098 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5a195771-131e-4a55-a719-ccde43845b3c-dbus-socket\") pod \"nmstate-handler-llzl5\" (UID: \"5a195771-131e-4a55-a719-ccde43845b3c\") " pod="openshift-nmstate/nmstate-handler-llzl5" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.961192 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.965289 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-s9x62"] Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.966281 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-s9x62" Mar 18 12:26:26 crc kubenswrapper[4843]: E0318 12:26:26.968056 4843 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 18 12:26:26 crc kubenswrapper[4843]: E0318 12:26:26.968124 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19d2d46b-445f-463b-a1b7-e33d82880a8f-tls-key-pair podName:19d2d46b-445f-463b-a1b7-e33d82880a8f nodeName:}" failed. 
No retries permitted until 2026-03-18 12:26:27.468108717 +0000 UTC m=+1021.183934241 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/19d2d46b-445f-463b-a1b7-e33d82880a8f-tls-key-pair") pod "nmstate-webhook-5f558f5558-2lmd4" (UID: "19d2d46b-445f-463b-a1b7-e33d82880a8f") : secret "openshift-nmstate-webhook" not found Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.969474 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.970036 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-s9x62"] Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.970606 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-gr9t6" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.970829 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.982196 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl55b\" (UniqueName: \"kubernetes.io/projected/19d2d46b-445f-463b-a1b7-e33d82880a8f-kube-api-access-rl55b\") pod \"nmstate-webhook-5f558f5558-2lmd4\" (UID: \"19d2d46b-445f-463b-a1b7-e33d82880a8f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2lmd4" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.985220 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chl2f\" (UniqueName: \"kubernetes.io/projected/5a195771-131e-4a55-a719-ccde43845b3c-kube-api-access-chl2f\") pod \"nmstate-handler-llzl5\" (UID: \"5a195771-131e-4a55-a719-ccde43845b3c\") " pod="openshift-nmstate/nmstate-handler-llzl5" Mar 18 12:26:26 crc kubenswrapper[4843]: I0318 12:26:26.988783 4843 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-d8xl2\" (UniqueName: \"kubernetes.io/projected/c219dce4-3b4d-4b0f-b8bc-52313c223e06-kube-api-access-d8xl2\") pod \"nmstate-metrics-9b8c8685d-slwpg\" (UID: \"c219dce4-3b4d-4b0f-b8bc-52313c223e06\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-slwpg" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.057699 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba685259-be2e-473b-b07b-76fd1fba4433-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-s9x62\" (UID: \"ba685259-be2e-473b-b07b-76fd1fba4433\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-s9x62" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.058034 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ba685259-be2e-473b-b07b-76fd1fba4433-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-s9x62\" (UID: \"ba685259-be2e-473b-b07b-76fd1fba4433\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-s9x62" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.058131 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6kjc\" (UniqueName: \"kubernetes.io/projected/ba685259-be2e-473b-b07b-76fd1fba4433-kube-api-access-g6kjc\") pod \"nmstate-console-plugin-86f58fcf4-s9x62\" (UID: \"ba685259-be2e-473b-b07b-76fd1fba4433\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-s9x62" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.084732 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-v5jjc" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.092951 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-slwpg" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.153984 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7dff655cb6-4tpqr"] Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.154112 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-llzl5" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.154765 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.159078 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6kjc\" (UniqueName: \"kubernetes.io/projected/ba685259-be2e-473b-b07b-76fd1fba4433-kube-api-access-g6kjc\") pod \"nmstate-console-plugin-86f58fcf4-s9x62\" (UID: \"ba685259-be2e-473b-b07b-76fd1fba4433\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-s9x62" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.159128 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d20cfaf-615d-4639-b8da-ccc8ed825254-trusted-ca-bundle\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.159155 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbmzb\" (UniqueName: \"kubernetes.io/projected/9d20cfaf-615d-4639-b8da-ccc8ed825254-kube-api-access-pbmzb\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.159188 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d20cfaf-615d-4639-b8da-ccc8ed825254-console-serving-cert\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.159210 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d20cfaf-615d-4639-b8da-ccc8ed825254-console-oauth-config\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.159254 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d20cfaf-615d-4639-b8da-ccc8ed825254-service-ca\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.159276 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d20cfaf-615d-4639-b8da-ccc8ed825254-oauth-serving-cert\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.159306 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba685259-be2e-473b-b07b-76fd1fba4433-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-s9x62\" (UID: \"ba685259-be2e-473b-b07b-76fd1fba4433\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-s9x62" Mar 18 12:26:27 crc 
kubenswrapper[4843]: I0318 12:26:27.159371 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d20cfaf-615d-4639-b8da-ccc8ed825254-console-config\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.159402 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ba685259-be2e-473b-b07b-76fd1fba4433-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-s9x62\" (UID: \"ba685259-be2e-473b-b07b-76fd1fba4433\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-s9x62" Mar 18 12:26:27 crc kubenswrapper[4843]: E0318 12:26:27.159589 4843 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 18 12:26:27 crc kubenswrapper[4843]: E0318 12:26:27.159681 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba685259-be2e-473b-b07b-76fd1fba4433-plugin-serving-cert podName:ba685259-be2e-473b-b07b-76fd1fba4433 nodeName:}" failed. No retries permitted until 2026-03-18 12:26:27.659647254 +0000 UTC m=+1021.375472858 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/ba685259-be2e-473b-b07b-76fd1fba4433-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-s9x62" (UID: "ba685259-be2e-473b-b07b-76fd1fba4433") : secret "plugin-serving-cert" not found Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.161274 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ba685259-be2e-473b-b07b-76fd1fba4433-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-s9x62\" (UID: \"ba685259-be2e-473b-b07b-76fd1fba4433\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-s9x62" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.172959 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7dff655cb6-4tpqr"] Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.188497 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6kjc\" (UniqueName: \"kubernetes.io/projected/ba685259-be2e-473b-b07b-76fd1fba4433-kube-api-access-g6kjc\") pod \"nmstate-console-plugin-86f58fcf4-s9x62\" (UID: \"ba685259-be2e-473b-b07b-76fd1fba4433\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-s9x62" Mar 18 12:26:27 crc kubenswrapper[4843]: W0318 12:26:27.230506 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a195771_131e_4a55_a719_ccde43845b3c.slice/crio-b513869b392fb5ce6b603273f465e8f1d36063abf3bf082bfd571b7b7bb47aa9 WatchSource:0}: Error finding container b513869b392fb5ce6b603273f465e8f1d36063abf3bf082bfd571b7b7bb47aa9: Status 404 returned error can't find the container with id b513869b392fb5ce6b603273f465e8f1d36063abf3bf082bfd571b7b7bb47aa9 Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.261694 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/9d20cfaf-615d-4639-b8da-ccc8ed825254-console-serving-cert\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.262055 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d20cfaf-615d-4639-b8da-ccc8ed825254-console-oauth-config\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.262112 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d20cfaf-615d-4639-b8da-ccc8ed825254-service-ca\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.262146 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d20cfaf-615d-4639-b8da-ccc8ed825254-oauth-serving-cert\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.262340 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d20cfaf-615d-4639-b8da-ccc8ed825254-console-config\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.262401 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9d20cfaf-615d-4639-b8da-ccc8ed825254-trusted-ca-bundle\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.262430 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbmzb\" (UniqueName: \"kubernetes.io/projected/9d20cfaf-615d-4639-b8da-ccc8ed825254-kube-api-access-pbmzb\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.264048 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d20cfaf-615d-4639-b8da-ccc8ed825254-console-config\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.270028 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d20cfaf-615d-4639-b8da-ccc8ed825254-trusted-ca-bundle\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.270556 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d20cfaf-615d-4639-b8da-ccc8ed825254-service-ca\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.271395 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/9d20cfaf-615d-4639-b8da-ccc8ed825254-oauth-serving-cert\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.280399 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d20cfaf-615d-4639-b8da-ccc8ed825254-console-oauth-config\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.280412 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d20cfaf-615d-4639-b8da-ccc8ed825254-console-serving-cert\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.284068 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbmzb\" (UniqueName: \"kubernetes.io/projected/9d20cfaf-615d-4639-b8da-ccc8ed825254-kube-api-access-pbmzb\") pod \"console-7dff655cb6-4tpqr\" (UID: \"9d20cfaf-615d-4639-b8da-ccc8ed825254\") " pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.382805 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-slwpg"] Mar 18 12:26:27 crc kubenswrapper[4843]: W0318 12:26:27.384162 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc219dce4_3b4d_4b0f_b8bc_52313c223e06.slice/crio-8676984dda95ef763e2fa2e625d41a86e2f2955301105ffdba3ec4127274770b WatchSource:0}: Error finding container 8676984dda95ef763e2fa2e625d41a86e2f2955301105ffdba3ec4127274770b: Status 404 
returned error can't find the container with id 8676984dda95ef763e2fa2e625d41a86e2f2955301105ffdba3ec4127274770b Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.472479 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.567404 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/19d2d46b-445f-463b-a1b7-e33d82880a8f-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-2lmd4\" (UID: \"19d2d46b-445f-463b-a1b7-e33d82880a8f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2lmd4" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.571255 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/19d2d46b-445f-463b-a1b7-e33d82880a8f-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-2lmd4\" (UID: \"19d2d46b-445f-463b-a1b7-e33d82880a8f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2lmd4" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.668760 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba685259-be2e-473b-b07b-76fd1fba4433-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-s9x62\" (UID: \"ba685259-be2e-473b-b07b-76fd1fba4433\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-s9x62" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.671378 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba685259-be2e-473b-b07b-76fd1fba4433-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-s9x62\" (UID: \"ba685259-be2e-473b-b07b-76fd1fba4433\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-s9x62" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.696421 4843 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2lmd4" Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.763217 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-slwpg" event={"ID":"c219dce4-3b4d-4b0f-b8bc-52313c223e06","Type":"ContainerStarted","Data":"8676984dda95ef763e2fa2e625d41a86e2f2955301105ffdba3ec4127274770b"} Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.766247 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-llzl5" event={"ID":"5a195771-131e-4a55-a719-ccde43845b3c","Type":"ContainerStarted","Data":"b513869b392fb5ce6b603273f465e8f1d36063abf3bf082bfd571b7b7bb47aa9"} Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.894671 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7dff655cb6-4tpqr"] Mar 18 12:26:27 crc kubenswrapper[4843]: I0318 12:26:27.923621 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-s9x62" Mar 18 12:26:28 crc kubenswrapper[4843]: I0318 12:26:28.081902 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-2lmd4"] Mar 18 12:26:28 crc kubenswrapper[4843]: W0318 12:26:28.092637 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19d2d46b_445f_463b_a1b7_e33d82880a8f.slice/crio-33460a458e90c1973e4b9226da0a7e212e29f6b74882ec08df4758b05b457a96 WatchSource:0}: Error finding container 33460a458e90c1973e4b9226da0a7e212e29f6b74882ec08df4758b05b457a96: Status 404 returned error can't find the container with id 33460a458e90c1973e4b9226da0a7e212e29f6b74882ec08df4758b05b457a96 Mar 18 12:26:28 crc kubenswrapper[4843]: I0318 12:26:28.128268 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-s9x62"] Mar 18 12:26:28 crc kubenswrapper[4843]: W0318 12:26:28.136899 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba685259_be2e_473b_b07b_76fd1fba4433.slice/crio-3c3f91d674a6e584778b182c7b3cae5b6d06c7ca6b51c87681e70915e90a9ca4 WatchSource:0}: Error finding container 3c3f91d674a6e584778b182c7b3cae5b6d06c7ca6b51c87681e70915e90a9ca4: Status 404 returned error can't find the container with id 3c3f91d674a6e584778b182c7b3cae5b6d06c7ca6b51c87681e70915e90a9ca4 Mar 18 12:26:28 crc kubenswrapper[4843]: I0318 12:26:28.327885 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9sllb" Mar 18 12:26:28 crc kubenswrapper[4843]: I0318 12:26:28.328219 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9sllb" Mar 18 12:26:28 crc kubenswrapper[4843]: I0318 12:26:28.436173 4843 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9sllb" Mar 18 12:26:28 crc kubenswrapper[4843]: I0318 12:26:28.772434 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7dff655cb6-4tpqr" event={"ID":"9d20cfaf-615d-4639-b8da-ccc8ed825254","Type":"ContainerStarted","Data":"7510872ec213cb1c8e3bef0b909c239615c4e518433bd848cd286d3ecf70c90d"} Mar 18 12:26:28 crc kubenswrapper[4843]: I0318 12:26:28.772485 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7dff655cb6-4tpqr" event={"ID":"9d20cfaf-615d-4639-b8da-ccc8ed825254","Type":"ContainerStarted","Data":"54bdbc971f9a076ff97c1ac9b3cc88bbd9847e23aeba499470d813e366c48e9d"} Mar 18 12:26:28 crc kubenswrapper[4843]: I0318 12:26:28.773890 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-s9x62" event={"ID":"ba685259-be2e-473b-b07b-76fd1fba4433","Type":"ContainerStarted","Data":"3c3f91d674a6e584778b182c7b3cae5b6d06c7ca6b51c87681e70915e90a9ca4"} Mar 18 12:26:28 crc kubenswrapper[4843]: I0318 12:26:28.775437 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2lmd4" event={"ID":"19d2d46b-445f-463b-a1b7-e33d82880a8f","Type":"ContainerStarted","Data":"33460a458e90c1973e4b9226da0a7e212e29f6b74882ec08df4758b05b457a96"} Mar 18 12:26:28 crc kubenswrapper[4843]: I0318 12:26:28.792957 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7dff655cb6-4tpqr" podStartSLOduration=1.792943325 podStartE2EDuration="1.792943325s" podCreationTimestamp="2026-03-18 12:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:26:28.791518755 +0000 UTC m=+1022.507344279" watchObservedRunningTime="2026-03-18 12:26:28.792943325 +0000 UTC m=+1022.508768849" Mar 18 12:26:28 crc 
kubenswrapper[4843]: I0318 12:26:28.821617 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9sllb" Mar 18 12:26:28 crc kubenswrapper[4843]: I0318 12:26:28.863479 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9sllb"] Mar 18 12:26:30 crc kubenswrapper[4843]: I0318 12:26:30.789038 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-slwpg" event={"ID":"c219dce4-3b4d-4b0f-b8bc-52313c223e06","Type":"ContainerStarted","Data":"18dd26bc40fc04ccf154d44a39c99ca122a086a3a213c4cf7ef7a75be2b97651"} Mar 18 12:26:30 crc kubenswrapper[4843]: I0318 12:26:30.794344 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2lmd4" event={"ID":"19d2d46b-445f-463b-a1b7-e33d82880a8f","Type":"ContainerStarted","Data":"a339c0c028d940d3332c62d49940f68cf2482ff72fa2c303455c022f61ebbcf9"} Mar 18 12:26:30 crc kubenswrapper[4843]: I0318 12:26:30.794488 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2lmd4" Mar 18 12:26:30 crc kubenswrapper[4843]: I0318 12:26:30.798790 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-llzl5" event={"ID":"5a195771-131e-4a55-a719-ccde43845b3c","Type":"ContainerStarted","Data":"e37782f92b05e5c9339f6d95139f95a765169ad9c63a8422010e8a7cde9470c3"} Mar 18 12:26:30 crc kubenswrapper[4843]: I0318 12:26:30.799042 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9sllb" podUID="dbe098fc-dcb5-4c9a-a6e3-dc58830990a8" containerName="registry-server" containerID="cri-o://f17f0df7c1ab6883900d3697030bb46eaf39ca3740d51a6cccd345c319885821" gracePeriod=2 Mar 18 12:26:30 crc kubenswrapper[4843]: I0318 12:26:30.823028 4843 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2lmd4" podStartSLOduration=3.084412937 podStartE2EDuration="4.822989712s" podCreationTimestamp="2026-03-18 12:26:26 +0000 UTC" firstStartedPulling="2026-03-18 12:26:28.094316317 +0000 UTC m=+1021.810141841" lastFinishedPulling="2026-03-18 12:26:29.832893082 +0000 UTC m=+1023.548718616" observedRunningTime="2026-03-18 12:26:30.812467402 +0000 UTC m=+1024.528292926" watchObservedRunningTime="2026-03-18 12:26:30.822989712 +0000 UTC m=+1024.538815236" Mar 18 12:26:30 crc kubenswrapper[4843]: I0318 12:26:30.836846 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-llzl5" podStartSLOduration=2.245510227 podStartE2EDuration="4.836826325s" podCreationTimestamp="2026-03-18 12:26:26 +0000 UTC" firstStartedPulling="2026-03-18 12:26:27.241006378 +0000 UTC m=+1020.956831912" lastFinishedPulling="2026-03-18 12:26:29.832322486 +0000 UTC m=+1023.548148010" observedRunningTime="2026-03-18 12:26:30.836672021 +0000 UTC m=+1024.552497555" watchObservedRunningTime="2026-03-18 12:26:30.836826325 +0000 UTC m=+1024.552651849" Mar 18 12:26:31 crc kubenswrapper[4843]: I0318 12:26:31.174320 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9sllb" Mar 18 12:26:31 crc kubenswrapper[4843]: I0318 12:26:31.374101 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbe098fc-dcb5-4c9a-a6e3-dc58830990a8-catalog-content\") pod \"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8\" (UID: \"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8\") " Mar 18 12:26:31 crc kubenswrapper[4843]: I0318 12:26:31.374188 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr86m\" (UniqueName: \"kubernetes.io/projected/dbe098fc-dcb5-4c9a-a6e3-dc58830990a8-kube-api-access-dr86m\") pod \"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8\" (UID: \"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8\") " Mar 18 12:26:31 crc kubenswrapper[4843]: I0318 12:26:31.374218 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbe098fc-dcb5-4c9a-a6e3-dc58830990a8-utilities\") pod \"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8\" (UID: \"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8\") " Mar 18 12:26:31 crc kubenswrapper[4843]: I0318 12:26:31.375456 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbe098fc-dcb5-4c9a-a6e3-dc58830990a8-utilities" (OuterVolumeSpecName: "utilities") pod "dbe098fc-dcb5-4c9a-a6e3-dc58830990a8" (UID: "dbe098fc-dcb5-4c9a-a6e3-dc58830990a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:26:31 crc kubenswrapper[4843]: I0318 12:26:31.392035 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbe098fc-dcb5-4c9a-a6e3-dc58830990a8-kube-api-access-dr86m" (OuterVolumeSpecName: "kube-api-access-dr86m") pod "dbe098fc-dcb5-4c9a-a6e3-dc58830990a8" (UID: "dbe098fc-dcb5-4c9a-a6e3-dc58830990a8"). InnerVolumeSpecName "kube-api-access-dr86m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:26:31 crc kubenswrapper[4843]: I0318 12:26:31.475909 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr86m\" (UniqueName: \"kubernetes.io/projected/dbe098fc-dcb5-4c9a-a6e3-dc58830990a8-kube-api-access-dr86m\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:31 crc kubenswrapper[4843]: I0318 12:26:31.475942 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbe098fc-dcb5-4c9a-a6e3-dc58830990a8-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:31 crc kubenswrapper[4843]: I0318 12:26:31.815970 4843 generic.go:334] "Generic (PLEG): container finished" podID="dbe098fc-dcb5-4c9a-a6e3-dc58830990a8" containerID="f17f0df7c1ab6883900d3697030bb46eaf39ca3740d51a6cccd345c319885821" exitCode=0 Mar 18 12:26:31 crc kubenswrapper[4843]: I0318 12:26:31.816058 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9sllb" event={"ID":"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8","Type":"ContainerDied","Data":"f17f0df7c1ab6883900d3697030bb46eaf39ca3740d51a6cccd345c319885821"} Mar 18 12:26:31 crc kubenswrapper[4843]: I0318 12:26:31.816098 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9sllb" event={"ID":"dbe098fc-dcb5-4c9a-a6e3-dc58830990a8","Type":"ContainerDied","Data":"b68a323a1b3f1c4c8e614548664c350a575d7160376b7c4b8dd3b81b67fea4af"} Mar 18 12:26:31 crc kubenswrapper[4843]: I0318 12:26:31.816108 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9sllb" Mar 18 12:26:31 crc kubenswrapper[4843]: I0318 12:26:31.816125 4843 scope.go:117] "RemoveContainer" containerID="f17f0df7c1ab6883900d3697030bb46eaf39ca3740d51a6cccd345c319885821" Mar 18 12:26:31 crc kubenswrapper[4843]: I0318 12:26:31.818005 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-s9x62" event={"ID":"ba685259-be2e-473b-b07b-76fd1fba4433","Type":"ContainerStarted","Data":"75010448d1f58c46aec26d17580d306f5fefa18fcb16507b92d8fcb7facc227f"} Mar 18 12:26:31 crc kubenswrapper[4843]: I0318 12:26:31.818510 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-llzl5" Mar 18 12:26:31 crc kubenswrapper[4843]: I0318 12:26:31.824552 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbe098fc-dcb5-4c9a-a6e3-dc58830990a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbe098fc-dcb5-4c9a-a6e3-dc58830990a8" (UID: "dbe098fc-dcb5-4c9a-a6e3-dc58830990a8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:26:31 crc kubenswrapper[4843]: I0318 12:26:31.845561 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-s9x62" podStartSLOduration=3.2819954239999998 podStartE2EDuration="5.845539683s" podCreationTimestamp="2026-03-18 12:26:26 +0000 UTC" firstStartedPulling="2026-03-18 12:26:28.138665528 +0000 UTC m=+1021.854491052" lastFinishedPulling="2026-03-18 12:26:30.702209787 +0000 UTC m=+1024.418035311" observedRunningTime="2026-03-18 12:26:31.839638265 +0000 UTC m=+1025.555463799" watchObservedRunningTime="2026-03-18 12:26:31.845539683 +0000 UTC m=+1025.561365217" Mar 18 12:26:31 crc kubenswrapper[4843]: I0318 12:26:31.880012 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbe098fc-dcb5-4c9a-a6e3-dc58830990a8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:32 crc kubenswrapper[4843]: I0318 12:26:32.150496 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9sllb"] Mar 18 12:26:32 crc kubenswrapper[4843]: I0318 12:26:32.155085 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9sllb"] Mar 18 12:26:32 crc kubenswrapper[4843]: I0318 12:26:32.255119 4843 scope.go:117] "RemoveContainer" containerID="78d880078bdd102ed131150d950f4ccde6b8edd37f1e45497029764ad94368b1" Mar 18 12:26:32 crc kubenswrapper[4843]: I0318 12:26:32.295000 4843 scope.go:117] "RemoveContainer" containerID="31130ac8e6b03530011d8851948e5d31bb5282a089af0a059106df784fc2b76e" Mar 18 12:26:32 crc kubenswrapper[4843]: I0318 12:26:32.310942 4843 scope.go:117] "RemoveContainer" containerID="f17f0df7c1ab6883900d3697030bb46eaf39ca3740d51a6cccd345c319885821" Mar 18 12:26:32 crc kubenswrapper[4843]: E0318 12:26:32.311902 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"f17f0df7c1ab6883900d3697030bb46eaf39ca3740d51a6cccd345c319885821\": container with ID starting with f17f0df7c1ab6883900d3697030bb46eaf39ca3740d51a6cccd345c319885821 not found: ID does not exist" containerID="f17f0df7c1ab6883900d3697030bb46eaf39ca3740d51a6cccd345c319885821" Mar 18 12:26:32 crc kubenswrapper[4843]: I0318 12:26:32.311937 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f17f0df7c1ab6883900d3697030bb46eaf39ca3740d51a6cccd345c319885821"} err="failed to get container status \"f17f0df7c1ab6883900d3697030bb46eaf39ca3740d51a6cccd345c319885821\": rpc error: code = NotFound desc = could not find container \"f17f0df7c1ab6883900d3697030bb46eaf39ca3740d51a6cccd345c319885821\": container with ID starting with f17f0df7c1ab6883900d3697030bb46eaf39ca3740d51a6cccd345c319885821 not found: ID does not exist" Mar 18 12:26:32 crc kubenswrapper[4843]: I0318 12:26:32.311958 4843 scope.go:117] "RemoveContainer" containerID="78d880078bdd102ed131150d950f4ccde6b8edd37f1e45497029764ad94368b1" Mar 18 12:26:32 crc kubenswrapper[4843]: E0318 12:26:32.312247 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78d880078bdd102ed131150d950f4ccde6b8edd37f1e45497029764ad94368b1\": container with ID starting with 78d880078bdd102ed131150d950f4ccde6b8edd37f1e45497029764ad94368b1 not found: ID does not exist" containerID="78d880078bdd102ed131150d950f4ccde6b8edd37f1e45497029764ad94368b1" Mar 18 12:26:32 crc kubenswrapper[4843]: I0318 12:26:32.312277 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78d880078bdd102ed131150d950f4ccde6b8edd37f1e45497029764ad94368b1"} err="failed to get container status \"78d880078bdd102ed131150d950f4ccde6b8edd37f1e45497029764ad94368b1\": rpc error: code = NotFound desc = could not find container 
\"78d880078bdd102ed131150d950f4ccde6b8edd37f1e45497029764ad94368b1\": container with ID starting with 78d880078bdd102ed131150d950f4ccde6b8edd37f1e45497029764ad94368b1 not found: ID does not exist" Mar 18 12:26:32 crc kubenswrapper[4843]: I0318 12:26:32.312294 4843 scope.go:117] "RemoveContainer" containerID="31130ac8e6b03530011d8851948e5d31bb5282a089af0a059106df784fc2b76e" Mar 18 12:26:32 crc kubenswrapper[4843]: E0318 12:26:32.312720 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31130ac8e6b03530011d8851948e5d31bb5282a089af0a059106df784fc2b76e\": container with ID starting with 31130ac8e6b03530011d8851948e5d31bb5282a089af0a059106df784fc2b76e not found: ID does not exist" containerID="31130ac8e6b03530011d8851948e5d31bb5282a089af0a059106df784fc2b76e" Mar 18 12:26:32 crc kubenswrapper[4843]: I0318 12:26:32.312749 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31130ac8e6b03530011d8851948e5d31bb5282a089af0a059106df784fc2b76e"} err="failed to get container status \"31130ac8e6b03530011d8851948e5d31bb5282a089af0a059106df784fc2b76e\": rpc error: code = NotFound desc = could not find container \"31130ac8e6b03530011d8851948e5d31bb5282a089af0a059106df784fc2b76e\": container with ID starting with 31130ac8e6b03530011d8851948e5d31bb5282a089af0a059106df784fc2b76e not found: ID does not exist" Mar 18 12:26:32 crc kubenswrapper[4843]: I0318 12:26:32.830439 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-slwpg" event={"ID":"c219dce4-3b4d-4b0f-b8bc-52313c223e06","Type":"ContainerStarted","Data":"016d5a5701500c010a037e2ac7d243d57b31bf1f2e7039c9fa4e2f376b79c802"} Mar 18 12:26:32 crc kubenswrapper[4843]: I0318 12:26:32.992618 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbe098fc-dcb5-4c9a-a6e3-dc58830990a8" 
path="/var/lib/kubelet/pods/dbe098fc-dcb5-4c9a-a6e3-dc58830990a8/volumes" Mar 18 12:26:37 crc kubenswrapper[4843]: I0318 12:26:37.176380 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-llzl5" Mar 18 12:26:37 crc kubenswrapper[4843]: I0318 12:26:37.196616 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-slwpg" podStartSLOduration=6.2687836 podStartE2EDuration="11.196598379s" podCreationTimestamp="2026-03-18 12:26:26 +0000 UTC" firstStartedPulling="2026-03-18 12:26:27.386399813 +0000 UTC m=+1021.102225327" lastFinishedPulling="2026-03-18 12:26:32.314214582 +0000 UTC m=+1026.030040106" observedRunningTime="2026-03-18 12:26:32.871167913 +0000 UTC m=+1026.586993487" watchObservedRunningTime="2026-03-18 12:26:37.196598379 +0000 UTC m=+1030.912423903" Mar 18 12:26:37 crc kubenswrapper[4843]: I0318 12:26:37.472950 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:37 crc kubenswrapper[4843]: I0318 12:26:37.473035 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:37 crc kubenswrapper[4843]: I0318 12:26:37.482164 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:37 crc kubenswrapper[4843]: I0318 12:26:37.892712 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7dff655cb6-4tpqr" Mar 18 12:26:37 crc kubenswrapper[4843]: I0318 12:26:37.954108 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ppqpl"] Mar 18 12:26:44 crc kubenswrapper[4843]: I0318 12:26:44.716915 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b45bn"] Mar 18 12:26:44 crc 
kubenswrapper[4843]: E0318 12:26:44.718847 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbe098fc-dcb5-4c9a-a6e3-dc58830990a8" containerName="registry-server" Mar 18 12:26:44 crc kubenswrapper[4843]: I0318 12:26:44.718929 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbe098fc-dcb5-4c9a-a6e3-dc58830990a8" containerName="registry-server" Mar 18 12:26:44 crc kubenswrapper[4843]: E0318 12:26:44.719004 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbe098fc-dcb5-4c9a-a6e3-dc58830990a8" containerName="extract-utilities" Mar 18 12:26:44 crc kubenswrapper[4843]: I0318 12:26:44.719064 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbe098fc-dcb5-4c9a-a6e3-dc58830990a8" containerName="extract-utilities" Mar 18 12:26:44 crc kubenswrapper[4843]: E0318 12:26:44.719129 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbe098fc-dcb5-4c9a-a6e3-dc58830990a8" containerName="extract-content" Mar 18 12:26:44 crc kubenswrapper[4843]: I0318 12:26:44.719201 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbe098fc-dcb5-4c9a-a6e3-dc58830990a8" containerName="extract-content" Mar 18 12:26:44 crc kubenswrapper[4843]: I0318 12:26:44.719463 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbe098fc-dcb5-4c9a-a6e3-dc58830990a8" containerName="registry-server" Mar 18 12:26:44 crc kubenswrapper[4843]: I0318 12:26:44.720683 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b45bn" Mar 18 12:26:44 crc kubenswrapper[4843]: I0318 12:26:44.728615 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b45bn"] Mar 18 12:26:44 crc kubenswrapper[4843]: I0318 12:26:44.796380 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtmqf\" (UniqueName: \"kubernetes.io/projected/aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed-kube-api-access-qtmqf\") pod \"certified-operators-b45bn\" (UID: \"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed\") " pod="openshift-marketplace/certified-operators-b45bn" Mar 18 12:26:44 crc kubenswrapper[4843]: I0318 12:26:44.796430 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed-catalog-content\") pod \"certified-operators-b45bn\" (UID: \"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed\") " pod="openshift-marketplace/certified-operators-b45bn" Mar 18 12:26:44 crc kubenswrapper[4843]: I0318 12:26:44.796501 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed-utilities\") pod \"certified-operators-b45bn\" (UID: \"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed\") " pod="openshift-marketplace/certified-operators-b45bn" Mar 18 12:26:44 crc kubenswrapper[4843]: I0318 12:26:44.897497 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed-utilities\") pod \"certified-operators-b45bn\" (UID: \"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed\") " pod="openshift-marketplace/certified-operators-b45bn" Mar 18 12:26:44 crc kubenswrapper[4843]: I0318 12:26:44.897893 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qtmqf\" (UniqueName: \"kubernetes.io/projected/aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed-kube-api-access-qtmqf\") pod \"certified-operators-b45bn\" (UID: \"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed\") " pod="openshift-marketplace/certified-operators-b45bn" Mar 18 12:26:44 crc kubenswrapper[4843]: I0318 12:26:44.898005 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed-catalog-content\") pod \"certified-operators-b45bn\" (UID: \"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed\") " pod="openshift-marketplace/certified-operators-b45bn" Mar 18 12:26:44 crc kubenswrapper[4843]: I0318 12:26:44.898029 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed-utilities\") pod \"certified-operators-b45bn\" (UID: \"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed\") " pod="openshift-marketplace/certified-operators-b45bn" Mar 18 12:26:44 crc kubenswrapper[4843]: I0318 12:26:44.898423 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed-catalog-content\") pod \"certified-operators-b45bn\" (UID: \"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed\") " pod="openshift-marketplace/certified-operators-b45bn" Mar 18 12:26:44 crc kubenswrapper[4843]: I0318 12:26:44.919563 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtmqf\" (UniqueName: \"kubernetes.io/projected/aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed-kube-api-access-qtmqf\") pod \"certified-operators-b45bn\" (UID: \"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed\") " pod="openshift-marketplace/certified-operators-b45bn" Mar 18 12:26:45 crc kubenswrapper[4843]: I0318 12:26:45.042109 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b45bn" Mar 18 12:26:45 crc kubenswrapper[4843]: I0318 12:26:45.730544 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b45bn"] Mar 18 12:26:45 crc kubenswrapper[4843]: W0318 12:26:45.744240 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaeac2af3_bfd1_4e6a_a7c3_690b9017a2ed.slice/crio-671886e03a1c889c72810ce6b93db37c7e64a6a0c9022ce975c53207f644bd54 WatchSource:0}: Error finding container 671886e03a1c889c72810ce6b93db37c7e64a6a0c9022ce975c53207f644bd54: Status 404 returned error can't find the container with id 671886e03a1c889c72810ce6b93db37c7e64a6a0c9022ce975c53207f644bd54 Mar 18 12:26:45 crc kubenswrapper[4843]: I0318 12:26:45.995791 4843 generic.go:334] "Generic (PLEG): container finished" podID="aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed" containerID="b12619d14c12a277216b7c9aaea36316b84f72833ee900b864762c28f5add38a" exitCode=0 Mar 18 12:26:45 crc kubenswrapper[4843]: I0318 12:26:45.995840 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b45bn" event={"ID":"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed","Type":"ContainerDied","Data":"b12619d14c12a277216b7c9aaea36316b84f72833ee900b864762c28f5add38a"} Mar 18 12:26:45 crc kubenswrapper[4843]: I0318 12:26:45.995866 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b45bn" event={"ID":"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed","Type":"ContainerStarted","Data":"671886e03a1c889c72810ce6b93db37c7e64a6a0c9022ce975c53207f644bd54"} Mar 18 12:26:47 crc kubenswrapper[4843]: I0318 12:26:47.702452 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2lmd4" Mar 18 12:26:48 crc kubenswrapper[4843]: I0318 12:26:48.267430 4843 generic.go:334] "Generic (PLEG): container 
finished" podID="aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed" containerID="260002802aafa57cd57d81e81cdd920b4b8ab0f2fad371986f803174c0fceff8" exitCode=0 Mar 18 12:26:48 crc kubenswrapper[4843]: I0318 12:26:48.267472 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b45bn" event={"ID":"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed","Type":"ContainerDied","Data":"260002802aafa57cd57d81e81cdd920b4b8ab0f2fad371986f803174c0fceff8"} Mar 18 12:26:49 crc kubenswrapper[4843]: I0318 12:26:49.275269 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b45bn" event={"ID":"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed","Type":"ContainerStarted","Data":"f70bea2265d83596c1c4d416a54a1176f99b260405bb87871f51ffb7489321c0"} Mar 18 12:26:49 crc kubenswrapper[4843]: I0318 12:26:49.296365 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b45bn" podStartSLOduration=2.483477385 podStartE2EDuration="5.296345114s" podCreationTimestamp="2026-03-18 12:26:44 +0000 UTC" firstStartedPulling="2026-03-18 12:26:45.997134482 +0000 UTC m=+1039.712960006" lastFinishedPulling="2026-03-18 12:26:48.810002211 +0000 UTC m=+1042.525827735" observedRunningTime="2026-03-18 12:26:49.291500636 +0000 UTC m=+1043.007326160" watchObservedRunningTime="2026-03-18 12:26:49.296345114 +0000 UTC m=+1043.012170638" Mar 18 12:26:50 crc kubenswrapper[4843]: I0318 12:26:50.034541 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:26:50 crc kubenswrapper[4843]: I0318 12:26:50.034619 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" 
podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:26:55 crc kubenswrapper[4843]: I0318 12:26:55.189325 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b45bn" Mar 18 12:26:55 crc kubenswrapper[4843]: I0318 12:26:55.189809 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b45bn" Mar 18 12:26:55 crc kubenswrapper[4843]: I0318 12:26:55.227845 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b45bn" Mar 18 12:26:55 crc kubenswrapper[4843]: I0318 12:26:55.348095 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b45bn" Mar 18 12:26:55 crc kubenswrapper[4843]: I0318 12:26:55.490077 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b45bn"] Mar 18 12:26:56 crc kubenswrapper[4843]: I0318 12:26:56.776283 4843 scope.go:117] "RemoveContainer" containerID="26c6ce1823f1c66b746ee92acfb9fc33322775aebb2130c80065909fd8f65dd5" Mar 18 12:26:57 crc kubenswrapper[4843]: I0318 12:26:57.321202 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b45bn" podUID="aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed" containerName="registry-server" containerID="cri-o://f70bea2265d83596c1c4d416a54a1176f99b260405bb87871f51ffb7489321c0" gracePeriod=2 Mar 18 12:26:57 crc kubenswrapper[4843]: I0318 12:26:57.681600 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b45bn" Mar 18 12:26:57 crc kubenswrapper[4843]: I0318 12:26:57.815937 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed-catalog-content\") pod \"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed\" (UID: \"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed\") " Mar 18 12:26:57 crc kubenswrapper[4843]: I0318 12:26:57.816401 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed-utilities\") pod \"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed\" (UID: \"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed\") " Mar 18 12:26:57 crc kubenswrapper[4843]: I0318 12:26:57.816516 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtmqf\" (UniqueName: \"kubernetes.io/projected/aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed-kube-api-access-qtmqf\") pod \"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed\" (UID: \"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed\") " Mar 18 12:26:57 crc kubenswrapper[4843]: I0318 12:26:57.817073 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed-utilities" (OuterVolumeSpecName: "utilities") pod "aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed" (UID: "aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:26:57 crc kubenswrapper[4843]: I0318 12:26:57.822285 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed-kube-api-access-qtmqf" (OuterVolumeSpecName: "kube-api-access-qtmqf") pod "aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed" (UID: "aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed"). InnerVolumeSpecName "kube-api-access-qtmqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:26:57 crc kubenswrapper[4843]: I0318 12:26:57.899359 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed" (UID: "aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:26:57 crc kubenswrapper[4843]: I0318 12:26:57.918273 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:57 crc kubenswrapper[4843]: I0318 12:26:57.918300 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:57 crc kubenswrapper[4843]: I0318 12:26:57.918309 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtmqf\" (UniqueName: \"kubernetes.io/projected/aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed-kube-api-access-qtmqf\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:58 crc kubenswrapper[4843]: I0318 12:26:58.331910 4843 generic.go:334] "Generic (PLEG): container finished" podID="aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed" containerID="f70bea2265d83596c1c4d416a54a1176f99b260405bb87871f51ffb7489321c0" exitCode=0 Mar 18 12:26:58 crc kubenswrapper[4843]: I0318 12:26:58.331987 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b45bn" Mar 18 12:26:58 crc kubenswrapper[4843]: I0318 12:26:58.331987 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b45bn" event={"ID":"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed","Type":"ContainerDied","Data":"f70bea2265d83596c1c4d416a54a1176f99b260405bb87871f51ffb7489321c0"} Mar 18 12:26:58 crc kubenswrapper[4843]: I0318 12:26:58.332465 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b45bn" event={"ID":"aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed","Type":"ContainerDied","Data":"671886e03a1c889c72810ce6b93db37c7e64a6a0c9022ce975c53207f644bd54"} Mar 18 12:26:58 crc kubenswrapper[4843]: I0318 12:26:58.332493 4843 scope.go:117] "RemoveContainer" containerID="f70bea2265d83596c1c4d416a54a1176f99b260405bb87871f51ffb7489321c0" Mar 18 12:26:58 crc kubenswrapper[4843]: I0318 12:26:58.355353 4843 scope.go:117] "RemoveContainer" containerID="260002802aafa57cd57d81e81cdd920b4b8ab0f2fad371986f803174c0fceff8" Mar 18 12:26:58 crc kubenswrapper[4843]: I0318 12:26:58.370220 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b45bn"] Mar 18 12:26:58 crc kubenswrapper[4843]: I0318 12:26:58.376715 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b45bn"] Mar 18 12:26:58 crc kubenswrapper[4843]: I0318 12:26:58.398407 4843 scope.go:117] "RemoveContainer" containerID="b12619d14c12a277216b7c9aaea36316b84f72833ee900b864762c28f5add38a" Mar 18 12:26:58 crc kubenswrapper[4843]: I0318 12:26:58.422432 4843 scope.go:117] "RemoveContainer" containerID="f70bea2265d83596c1c4d416a54a1176f99b260405bb87871f51ffb7489321c0" Mar 18 12:26:58 crc kubenswrapper[4843]: E0318 12:26:58.422838 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f70bea2265d83596c1c4d416a54a1176f99b260405bb87871f51ffb7489321c0\": container with ID starting with f70bea2265d83596c1c4d416a54a1176f99b260405bb87871f51ffb7489321c0 not found: ID does not exist" containerID="f70bea2265d83596c1c4d416a54a1176f99b260405bb87871f51ffb7489321c0" Mar 18 12:26:58 crc kubenswrapper[4843]: I0318 12:26:58.422873 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f70bea2265d83596c1c4d416a54a1176f99b260405bb87871f51ffb7489321c0"} err="failed to get container status \"f70bea2265d83596c1c4d416a54a1176f99b260405bb87871f51ffb7489321c0\": rpc error: code = NotFound desc = could not find container \"f70bea2265d83596c1c4d416a54a1176f99b260405bb87871f51ffb7489321c0\": container with ID starting with f70bea2265d83596c1c4d416a54a1176f99b260405bb87871f51ffb7489321c0 not found: ID does not exist" Mar 18 12:26:58 crc kubenswrapper[4843]: I0318 12:26:58.422894 4843 scope.go:117] "RemoveContainer" containerID="260002802aafa57cd57d81e81cdd920b4b8ab0f2fad371986f803174c0fceff8" Mar 18 12:26:58 crc kubenswrapper[4843]: E0318 12:26:58.423709 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"260002802aafa57cd57d81e81cdd920b4b8ab0f2fad371986f803174c0fceff8\": container with ID starting with 260002802aafa57cd57d81e81cdd920b4b8ab0f2fad371986f803174c0fceff8 not found: ID does not exist" containerID="260002802aafa57cd57d81e81cdd920b4b8ab0f2fad371986f803174c0fceff8" Mar 18 12:26:58 crc kubenswrapper[4843]: I0318 12:26:58.423749 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260002802aafa57cd57d81e81cdd920b4b8ab0f2fad371986f803174c0fceff8"} err="failed to get container status \"260002802aafa57cd57d81e81cdd920b4b8ab0f2fad371986f803174c0fceff8\": rpc error: code = NotFound desc = could not find container \"260002802aafa57cd57d81e81cdd920b4b8ab0f2fad371986f803174c0fceff8\": container with ID 
starting with 260002802aafa57cd57d81e81cdd920b4b8ab0f2fad371986f803174c0fceff8 not found: ID does not exist" Mar 18 12:26:58 crc kubenswrapper[4843]: I0318 12:26:58.423784 4843 scope.go:117] "RemoveContainer" containerID="b12619d14c12a277216b7c9aaea36316b84f72833ee900b864762c28f5add38a" Mar 18 12:26:58 crc kubenswrapper[4843]: E0318 12:26:58.424144 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b12619d14c12a277216b7c9aaea36316b84f72833ee900b864762c28f5add38a\": container with ID starting with b12619d14c12a277216b7c9aaea36316b84f72833ee900b864762c28f5add38a not found: ID does not exist" containerID="b12619d14c12a277216b7c9aaea36316b84f72833ee900b864762c28f5add38a" Mar 18 12:26:58 crc kubenswrapper[4843]: I0318 12:26:58.424168 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b12619d14c12a277216b7c9aaea36316b84f72833ee900b864762c28f5add38a"} err="failed to get container status \"b12619d14c12a277216b7c9aaea36316b84f72833ee900b864762c28f5add38a\": rpc error: code = NotFound desc = could not find container \"b12619d14c12a277216b7c9aaea36316b84f72833ee900b864762c28f5add38a\": container with ID starting with b12619d14c12a277216b7c9aaea36316b84f72833ee900b864762c28f5add38a not found: ID does not exist" Mar 18 12:26:58 crc kubenswrapper[4843]: I0318 12:26:58.997005 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed" path="/var/lib/kubelet/pods/aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed/volumes" Mar 18 12:27:00 crc kubenswrapper[4843]: I0318 12:27:00.908243 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5"] Mar 18 12:27:00 crc kubenswrapper[4843]: E0318 12:27:00.908879 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed" 
containerName="registry-server" Mar 18 12:27:00 crc kubenswrapper[4843]: I0318 12:27:00.908893 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed" containerName="registry-server" Mar 18 12:27:00 crc kubenswrapper[4843]: E0318 12:27:00.908913 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed" containerName="extract-content" Mar 18 12:27:00 crc kubenswrapper[4843]: I0318 12:27:00.908918 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed" containerName="extract-content" Mar 18 12:27:00 crc kubenswrapper[4843]: E0318 12:27:00.908929 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed" containerName="extract-utilities" Mar 18 12:27:00 crc kubenswrapper[4843]: I0318 12:27:00.908937 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed" containerName="extract-utilities" Mar 18 12:27:00 crc kubenswrapper[4843]: I0318 12:27:00.909053 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeac2af3-bfd1-4e6a-a7c3-690b9017a2ed" containerName="registry-server" Mar 18 12:27:00 crc kubenswrapper[4843]: I0318 12:27:00.909798 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5" Mar 18 12:27:00 crc kubenswrapper[4843]: I0318 12:27:00.912809 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 12:27:00 crc kubenswrapper[4843]: I0318 12:27:00.937694 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5"] Mar 18 12:27:01 crc kubenswrapper[4843]: I0318 12:27:01.054183 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bd5afa8-337e-4384-ae10-8689ce534039-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5\" (UID: \"4bd5afa8-337e-4384-ae10-8689ce534039\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5" Mar 18 12:27:01 crc kubenswrapper[4843]: I0318 12:27:01.054781 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bd5afa8-337e-4384-ae10-8689ce534039-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5\" (UID: \"4bd5afa8-337e-4384-ae10-8689ce534039\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5" Mar 18 12:27:01 crc kubenswrapper[4843]: I0318 12:27:01.055456 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sltrw\" (UniqueName: \"kubernetes.io/projected/4bd5afa8-337e-4384-ae10-8689ce534039-kube-api-access-sltrw\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5\" (UID: \"4bd5afa8-337e-4384-ae10-8689ce534039\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5" Mar 18 12:27:01 crc kubenswrapper[4843]: 
I0318 12:27:01.157486 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bd5afa8-337e-4384-ae10-8689ce534039-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5\" (UID: \"4bd5afa8-337e-4384-ae10-8689ce534039\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5" Mar 18 12:27:01 crc kubenswrapper[4843]: I0318 12:27:01.157535 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bd5afa8-337e-4384-ae10-8689ce534039-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5\" (UID: \"4bd5afa8-337e-4384-ae10-8689ce534039\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5" Mar 18 12:27:01 crc kubenswrapper[4843]: I0318 12:27:01.157639 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sltrw\" (UniqueName: \"kubernetes.io/projected/4bd5afa8-337e-4384-ae10-8689ce534039-kube-api-access-sltrw\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5\" (UID: \"4bd5afa8-337e-4384-ae10-8689ce534039\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5" Mar 18 12:27:01 crc kubenswrapper[4843]: I0318 12:27:01.158094 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bd5afa8-337e-4384-ae10-8689ce534039-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5\" (UID: \"4bd5afa8-337e-4384-ae10-8689ce534039\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5" Mar 18 12:27:01 crc kubenswrapper[4843]: I0318 12:27:01.158171 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/4bd5afa8-337e-4384-ae10-8689ce534039-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5\" (UID: \"4bd5afa8-337e-4384-ae10-8689ce534039\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5" Mar 18 12:27:01 crc kubenswrapper[4843]: I0318 12:27:01.178757 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sltrw\" (UniqueName: \"kubernetes.io/projected/4bd5afa8-337e-4384-ae10-8689ce534039-kube-api-access-sltrw\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5\" (UID: \"4bd5afa8-337e-4384-ae10-8689ce534039\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5" Mar 18 12:27:01 crc kubenswrapper[4843]: I0318 12:27:01.242631 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5" Mar 18 12:27:01 crc kubenswrapper[4843]: I0318 12:27:01.444146 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5"] Mar 18 12:27:02 crc kubenswrapper[4843]: I0318 12:27:02.389236 4843 generic.go:334] "Generic (PLEG): container finished" podID="4bd5afa8-337e-4384-ae10-8689ce534039" containerID="4e1e608ef855dc699a335149fd0f0aa2f74c3f9f478398479542400bb9fbfea4" exitCode=0 Mar 18 12:27:02 crc kubenswrapper[4843]: I0318 12:27:02.389337 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5" event={"ID":"4bd5afa8-337e-4384-ae10-8689ce534039","Type":"ContainerDied","Data":"4e1e608ef855dc699a335149fd0f0aa2f74c3f9f478398479542400bb9fbfea4"} Mar 18 12:27:02 crc kubenswrapper[4843]: I0318 12:27:02.389516 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5" event={"ID":"4bd5afa8-337e-4384-ae10-8689ce534039","Type":"ContainerStarted","Data":"5f61ca3fe76c6c7ef4cf6c8d613d90bb9fbcaf164de2f9c12249c9e38d5cccc3"} Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.009592 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-ppqpl" podUID="b999a5c0-f4e8-499b-8f81-283c3a2cf495" containerName="console" containerID="cri-o://6e1888400dfedd6fd9fb5e94372b9dd93191f93afb383a975f3992e3b221c86a" gracePeriod=15 Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.373168 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ppqpl_b999a5c0-f4e8-499b-8f81-283c3a2cf495/console/0.log" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.373348 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.401320 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ppqpl_b999a5c0-f4e8-499b-8f81-283c3a2cf495/console/0.log" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.401368 4843 generic.go:334] "Generic (PLEG): container finished" podID="b999a5c0-f4e8-499b-8f81-283c3a2cf495" containerID="6e1888400dfedd6fd9fb5e94372b9dd93191f93afb383a975f3992e3b221c86a" exitCode=2 Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.401400 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ppqpl" event={"ID":"b999a5c0-f4e8-499b-8f81-283c3a2cf495","Type":"ContainerDied","Data":"6e1888400dfedd6fd9fb5e94372b9dd93191f93afb383a975f3992e3b221c86a"} Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.401427 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ppqpl" 
event={"ID":"b999a5c0-f4e8-499b-8f81-283c3a2cf495","Type":"ContainerDied","Data":"97d1c79c4916e9ac1e3fde35595152116b62cecdf61234a4b1b4d938f7b3cae4"} Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.401445 4843 scope.go:117] "RemoveContainer" containerID="6e1888400dfedd6fd9fb5e94372b9dd93191f93afb383a975f3992e3b221c86a" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.401561 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ppqpl" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.419694 4843 scope.go:117] "RemoveContainer" containerID="6e1888400dfedd6fd9fb5e94372b9dd93191f93afb383a975f3992e3b221c86a" Mar 18 12:27:03 crc kubenswrapper[4843]: E0318 12:27:03.420258 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e1888400dfedd6fd9fb5e94372b9dd93191f93afb383a975f3992e3b221c86a\": container with ID starting with 6e1888400dfedd6fd9fb5e94372b9dd93191f93afb383a975f3992e3b221c86a not found: ID does not exist" containerID="6e1888400dfedd6fd9fb5e94372b9dd93191f93afb383a975f3992e3b221c86a" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.420316 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e1888400dfedd6fd9fb5e94372b9dd93191f93afb383a975f3992e3b221c86a"} err="failed to get container status \"6e1888400dfedd6fd9fb5e94372b9dd93191f93afb383a975f3992e3b221c86a\": rpc error: code = NotFound desc = could not find container \"6e1888400dfedd6fd9fb5e94372b9dd93191f93afb383a975f3992e3b221c86a\": container with ID starting with 6e1888400dfedd6fd9fb5e94372b9dd93191f93afb383a975f3992e3b221c86a not found: ID does not exist" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.494191 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-service-ca\") pod \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.494267 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-console-config\") pod \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.494296 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b999a5c0-f4e8-499b-8f81-283c3a2cf495-console-serving-cert\") pod \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.494338 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b999a5c0-f4e8-499b-8f81-283c3a2cf495-console-oauth-config\") pod \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.494386 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-oauth-serving-cert\") pod \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.494429 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-trusted-ca-bundle\") pod \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " Mar 18 12:27:03 crc 
kubenswrapper[4843]: I0318 12:27:03.494448 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vfvl\" (UniqueName: \"kubernetes.io/projected/b999a5c0-f4e8-499b-8f81-283c3a2cf495-kube-api-access-9vfvl\") pod \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\" (UID: \"b999a5c0-f4e8-499b-8f81-283c3a2cf495\") " Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.496063 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b999a5c0-f4e8-499b-8f81-283c3a2cf495" (UID: "b999a5c0-f4e8-499b-8f81-283c3a2cf495"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.496073 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-console-config" (OuterVolumeSpecName: "console-config") pod "b999a5c0-f4e8-499b-8f81-283c3a2cf495" (UID: "b999a5c0-f4e8-499b-8f81-283c3a2cf495"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.496138 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b999a5c0-f4e8-499b-8f81-283c3a2cf495" (UID: "b999a5c0-f4e8-499b-8f81-283c3a2cf495"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.496730 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-service-ca" (OuterVolumeSpecName: "service-ca") pod "b999a5c0-f4e8-499b-8f81-283c3a2cf495" (UID: "b999a5c0-f4e8-499b-8f81-283c3a2cf495"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.500332 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b999a5c0-f4e8-499b-8f81-283c3a2cf495-kube-api-access-9vfvl" (OuterVolumeSpecName: "kube-api-access-9vfvl") pod "b999a5c0-f4e8-499b-8f81-283c3a2cf495" (UID: "b999a5c0-f4e8-499b-8f81-283c3a2cf495"). InnerVolumeSpecName "kube-api-access-9vfvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.500608 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b999a5c0-f4e8-499b-8f81-283c3a2cf495-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b999a5c0-f4e8-499b-8f81-283c3a2cf495" (UID: "b999a5c0-f4e8-499b-8f81-283c3a2cf495"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.502412 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b999a5c0-f4e8-499b-8f81-283c3a2cf495-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b999a5c0-f4e8-499b-8f81-283c3a2cf495" (UID: "b999a5c0-f4e8-499b-8f81-283c3a2cf495"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.596533 4843 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.596598 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vfvl\" (UniqueName: \"kubernetes.io/projected/b999a5c0-f4e8-499b-8f81-283c3a2cf495-kube-api-access-9vfvl\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.596623 4843 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.596641 4843 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.596711 4843 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b999a5c0-f4e8-499b-8f81-283c3a2cf495-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.596739 4843 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b999a5c0-f4e8-499b-8f81-283c3a2cf495-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.596762 4843 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b999a5c0-f4e8-499b-8f81-283c3a2cf495-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:03 crc 
kubenswrapper[4843]: I0318 12:27:03.740976 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ppqpl"] Mar 18 12:27:03 crc kubenswrapper[4843]: I0318 12:27:03.749138 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-ppqpl"] Mar 18 12:27:04 crc kubenswrapper[4843]: I0318 12:27:04.408341 4843 generic.go:334] "Generic (PLEG): container finished" podID="4bd5afa8-337e-4384-ae10-8689ce534039" containerID="db0b515adcaa74ee423f6c155ed0500d49dd296e565875efdc8bf9fc1063022b" exitCode=0 Mar 18 12:27:04 crc kubenswrapper[4843]: I0318 12:27:04.408435 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5" event={"ID":"4bd5afa8-337e-4384-ae10-8689ce534039","Type":"ContainerDied","Data":"db0b515adcaa74ee423f6c155ed0500d49dd296e565875efdc8bf9fc1063022b"} Mar 18 12:27:04 crc kubenswrapper[4843]: I0318 12:27:04.993344 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b999a5c0-f4e8-499b-8f81-283c3a2cf495" path="/var/lib/kubelet/pods/b999a5c0-f4e8-499b-8f81-283c3a2cf495/volumes" Mar 18 12:27:05 crc kubenswrapper[4843]: I0318 12:27:05.422136 4843 generic.go:334] "Generic (PLEG): container finished" podID="4bd5afa8-337e-4384-ae10-8689ce534039" containerID="1d45b0207e5b2789df56e5f777e5c386b565d941bf536cc9424e2978741b0ed7" exitCode=0 Mar 18 12:27:05 crc kubenswrapper[4843]: I0318 12:27:05.422189 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5" event={"ID":"4bd5afa8-337e-4384-ae10-8689ce534039","Type":"ContainerDied","Data":"1d45b0207e5b2789df56e5f777e5c386b565d941bf536cc9424e2978741b0ed7"} Mar 18 12:27:06 crc kubenswrapper[4843]: I0318 12:27:06.732935 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5" Mar 18 12:27:06 crc kubenswrapper[4843]: I0318 12:27:06.844188 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bd5afa8-337e-4384-ae10-8689ce534039-util\") pod \"4bd5afa8-337e-4384-ae10-8689ce534039\" (UID: \"4bd5afa8-337e-4384-ae10-8689ce534039\") " Mar 18 12:27:06 crc kubenswrapper[4843]: I0318 12:27:06.844583 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bd5afa8-337e-4384-ae10-8689ce534039-bundle\") pod \"4bd5afa8-337e-4384-ae10-8689ce534039\" (UID: \"4bd5afa8-337e-4384-ae10-8689ce534039\") " Mar 18 12:27:06 crc kubenswrapper[4843]: I0318 12:27:06.844645 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sltrw\" (UniqueName: \"kubernetes.io/projected/4bd5afa8-337e-4384-ae10-8689ce534039-kube-api-access-sltrw\") pod \"4bd5afa8-337e-4384-ae10-8689ce534039\" (UID: \"4bd5afa8-337e-4384-ae10-8689ce534039\") " Mar 18 12:27:06 crc kubenswrapper[4843]: I0318 12:27:06.845360 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bd5afa8-337e-4384-ae10-8689ce534039-bundle" (OuterVolumeSpecName: "bundle") pod "4bd5afa8-337e-4384-ae10-8689ce534039" (UID: "4bd5afa8-337e-4384-ae10-8689ce534039"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:27:06 crc kubenswrapper[4843]: I0318 12:27:06.852390 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bd5afa8-337e-4384-ae10-8689ce534039-kube-api-access-sltrw" (OuterVolumeSpecName: "kube-api-access-sltrw") pod "4bd5afa8-337e-4384-ae10-8689ce534039" (UID: "4bd5afa8-337e-4384-ae10-8689ce534039"). InnerVolumeSpecName "kube-api-access-sltrw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:27:06 crc kubenswrapper[4843]: I0318 12:27:06.870781 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bd5afa8-337e-4384-ae10-8689ce534039-util" (OuterVolumeSpecName: "util") pod "4bd5afa8-337e-4384-ae10-8689ce534039" (UID: "4bd5afa8-337e-4384-ae10-8689ce534039"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:27:06 crc kubenswrapper[4843]: I0318 12:27:06.946307 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sltrw\" (UniqueName: \"kubernetes.io/projected/4bd5afa8-337e-4384-ae10-8689ce534039-kube-api-access-sltrw\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:06 crc kubenswrapper[4843]: I0318 12:27:06.946357 4843 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bd5afa8-337e-4384-ae10-8689ce534039-util\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:06 crc kubenswrapper[4843]: I0318 12:27:06.946372 4843 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bd5afa8-337e-4384-ae10-8689ce534039-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:07 crc kubenswrapper[4843]: I0318 12:27:07.438994 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5" event={"ID":"4bd5afa8-337e-4384-ae10-8689ce534039","Type":"ContainerDied","Data":"5f61ca3fe76c6c7ef4cf6c8d613d90bb9fbcaf164de2f9c12249c9e38d5cccc3"} Mar 18 12:27:07 crc kubenswrapper[4843]: I0318 12:27:07.439031 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f61ca3fe76c6c7ef4cf6c8d613d90bb9fbcaf164de2f9c12249c9e38d5cccc3" Mar 18 12:27:07 crc kubenswrapper[4843]: I0318 12:27:07.439078 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.508550 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-84c485cc8b-2wjb4"] Mar 18 12:27:15 crc kubenswrapper[4843]: E0318 12:27:15.509455 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd5afa8-337e-4384-ae10-8689ce534039" containerName="pull" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.509473 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd5afa8-337e-4384-ae10-8689ce534039" containerName="pull" Mar 18 12:27:15 crc kubenswrapper[4843]: E0318 12:27:15.509499 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd5afa8-337e-4384-ae10-8689ce534039" containerName="util" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.509507 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd5afa8-337e-4384-ae10-8689ce534039" containerName="util" Mar 18 12:27:15 crc kubenswrapper[4843]: E0318 12:27:15.509521 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b999a5c0-f4e8-499b-8f81-283c3a2cf495" containerName="console" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.509529 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="b999a5c0-f4e8-499b-8f81-283c3a2cf495" containerName="console" Mar 18 12:27:15 crc kubenswrapper[4843]: E0318 12:27:15.509545 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd5afa8-337e-4384-ae10-8689ce534039" containerName="extract" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.509553 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd5afa8-337e-4384-ae10-8689ce534039" containerName="extract" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.509703 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="b999a5c0-f4e8-499b-8f81-283c3a2cf495" 
containerName="console" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.509721 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bd5afa8-337e-4384-ae10-8689ce534039" containerName="extract" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.510353 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84c485cc8b-2wjb4" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.520070 4843 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-kv86w" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.521279 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.521492 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.521715 4843 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.526719 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84c485cc8b-2wjb4"] Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.528341 4843 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.673203 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eb52c05a-e620-4914-94e4-d485168aec35-webhook-cert\") pod \"metallb-operator-controller-manager-84c485cc8b-2wjb4\" (UID: \"eb52c05a-e620-4914-94e4-d485168aec35\") " pod="metallb-system/metallb-operator-controller-manager-84c485cc8b-2wjb4" Mar 18 
12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.673284 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eb52c05a-e620-4914-94e4-d485168aec35-apiservice-cert\") pod \"metallb-operator-controller-manager-84c485cc8b-2wjb4\" (UID: \"eb52c05a-e620-4914-94e4-d485168aec35\") " pod="metallb-system/metallb-operator-controller-manager-84c485cc8b-2wjb4" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.673318 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgljc\" (UniqueName: \"kubernetes.io/projected/eb52c05a-e620-4914-94e4-d485168aec35-kube-api-access-zgljc\") pod \"metallb-operator-controller-manager-84c485cc8b-2wjb4\" (UID: \"eb52c05a-e620-4914-94e4-d485168aec35\") " pod="metallb-system/metallb-operator-controller-manager-84c485cc8b-2wjb4" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.734621 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6c9ffdb45c-wwbs4"] Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.735312 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6c9ffdb45c-wwbs4" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.745311 4843 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-qhs67" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.746569 4843 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.748800 4843 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.772974 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6c9ffdb45c-wwbs4"] Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.774067 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab023c74-77a1-4169-8539-24dd12546dad-apiservice-cert\") pod \"metallb-operator-webhook-server-6c9ffdb45c-wwbs4\" (UID: \"ab023c74-77a1-4169-8539-24dd12546dad\") " pod="metallb-system/metallb-operator-webhook-server-6c9ffdb45c-wwbs4" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.774120 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65jgm\" (UniqueName: \"kubernetes.io/projected/ab023c74-77a1-4169-8539-24dd12546dad-kube-api-access-65jgm\") pod \"metallb-operator-webhook-server-6c9ffdb45c-wwbs4\" (UID: \"ab023c74-77a1-4169-8539-24dd12546dad\") " pod="metallb-system/metallb-operator-webhook-server-6c9ffdb45c-wwbs4" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.774154 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eb52c05a-e620-4914-94e4-d485168aec35-webhook-cert\") pod 
\"metallb-operator-controller-manager-84c485cc8b-2wjb4\" (UID: \"eb52c05a-e620-4914-94e4-d485168aec35\") " pod="metallb-system/metallb-operator-controller-manager-84c485cc8b-2wjb4" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.774173 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab023c74-77a1-4169-8539-24dd12546dad-webhook-cert\") pod \"metallb-operator-webhook-server-6c9ffdb45c-wwbs4\" (UID: \"ab023c74-77a1-4169-8539-24dd12546dad\") " pod="metallb-system/metallb-operator-webhook-server-6c9ffdb45c-wwbs4" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.774201 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eb52c05a-e620-4914-94e4-d485168aec35-apiservice-cert\") pod \"metallb-operator-controller-manager-84c485cc8b-2wjb4\" (UID: \"eb52c05a-e620-4914-94e4-d485168aec35\") " pod="metallb-system/metallb-operator-controller-manager-84c485cc8b-2wjb4" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.774220 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgljc\" (UniqueName: \"kubernetes.io/projected/eb52c05a-e620-4914-94e4-d485168aec35-kube-api-access-zgljc\") pod \"metallb-operator-controller-manager-84c485cc8b-2wjb4\" (UID: \"eb52c05a-e620-4914-94e4-d485168aec35\") " pod="metallb-system/metallb-operator-controller-manager-84c485cc8b-2wjb4" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.783395 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eb52c05a-e620-4914-94e4-d485168aec35-webhook-cert\") pod \"metallb-operator-controller-manager-84c485cc8b-2wjb4\" (UID: \"eb52c05a-e620-4914-94e4-d485168aec35\") " pod="metallb-system/metallb-operator-controller-manager-84c485cc8b-2wjb4" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 
12:27:15.788757 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eb52c05a-e620-4914-94e4-d485168aec35-apiservice-cert\") pod \"metallb-operator-controller-manager-84c485cc8b-2wjb4\" (UID: \"eb52c05a-e620-4914-94e4-d485168aec35\") " pod="metallb-system/metallb-operator-controller-manager-84c485cc8b-2wjb4" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.814337 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgljc\" (UniqueName: \"kubernetes.io/projected/eb52c05a-e620-4914-94e4-d485168aec35-kube-api-access-zgljc\") pod \"metallb-operator-controller-manager-84c485cc8b-2wjb4\" (UID: \"eb52c05a-e620-4914-94e4-d485168aec35\") " pod="metallb-system/metallb-operator-controller-manager-84c485cc8b-2wjb4" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.827952 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84c485cc8b-2wjb4" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.879360 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab023c74-77a1-4169-8539-24dd12546dad-apiservice-cert\") pod \"metallb-operator-webhook-server-6c9ffdb45c-wwbs4\" (UID: \"ab023c74-77a1-4169-8539-24dd12546dad\") " pod="metallb-system/metallb-operator-webhook-server-6c9ffdb45c-wwbs4" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.879433 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65jgm\" (UniqueName: \"kubernetes.io/projected/ab023c74-77a1-4169-8539-24dd12546dad-kube-api-access-65jgm\") pod \"metallb-operator-webhook-server-6c9ffdb45c-wwbs4\" (UID: \"ab023c74-77a1-4169-8539-24dd12546dad\") " pod="metallb-system/metallb-operator-webhook-server-6c9ffdb45c-wwbs4" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.879471 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab023c74-77a1-4169-8539-24dd12546dad-webhook-cert\") pod \"metallb-operator-webhook-server-6c9ffdb45c-wwbs4\" (UID: \"ab023c74-77a1-4169-8539-24dd12546dad\") " pod="metallb-system/metallb-operator-webhook-server-6c9ffdb45c-wwbs4" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.887390 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab023c74-77a1-4169-8539-24dd12546dad-apiservice-cert\") pod \"metallb-operator-webhook-server-6c9ffdb45c-wwbs4\" (UID: \"ab023c74-77a1-4169-8539-24dd12546dad\") " pod="metallb-system/metallb-operator-webhook-server-6c9ffdb45c-wwbs4" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.888051 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab023c74-77a1-4169-8539-24dd12546dad-webhook-cert\") pod \"metallb-operator-webhook-server-6c9ffdb45c-wwbs4\" (UID: \"ab023c74-77a1-4169-8539-24dd12546dad\") " pod="metallb-system/metallb-operator-webhook-server-6c9ffdb45c-wwbs4" Mar 18 12:27:15 crc kubenswrapper[4843]: I0318 12:27:15.905385 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65jgm\" (UniqueName: \"kubernetes.io/projected/ab023c74-77a1-4169-8539-24dd12546dad-kube-api-access-65jgm\") pod \"metallb-operator-webhook-server-6c9ffdb45c-wwbs4\" (UID: \"ab023c74-77a1-4169-8539-24dd12546dad\") " pod="metallb-system/metallb-operator-webhook-server-6c9ffdb45c-wwbs4" Mar 18 12:27:16 crc kubenswrapper[4843]: I0318 12:27:16.051373 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6c9ffdb45c-wwbs4" Mar 18 12:27:16 crc kubenswrapper[4843]: I0318 12:27:16.252435 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6c9ffdb45c-wwbs4"] Mar 18 12:27:16 crc kubenswrapper[4843]: W0318 12:27:16.264346 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab023c74_77a1_4169_8539_24dd12546dad.slice/crio-c8d283b8efad5ca4d97bda89396de62fe64a85b2855a2834ce9d8bc2ab0c4bbc WatchSource:0}: Error finding container c8d283b8efad5ca4d97bda89396de62fe64a85b2855a2834ce9d8bc2ab0c4bbc: Status 404 returned error can't find the container with id c8d283b8efad5ca4d97bda89396de62fe64a85b2855a2834ce9d8bc2ab0c4bbc Mar 18 12:27:16 crc kubenswrapper[4843]: I0318 12:27:16.291355 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84c485cc8b-2wjb4"] Mar 18 12:27:16 crc kubenswrapper[4843]: W0318 12:27:16.298811 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb52c05a_e620_4914_94e4_d485168aec35.slice/crio-93e407fabd7f04c784949ae20ecd38a4edf4376148fdc605dceecbae8757d24b WatchSource:0}: Error finding container 93e407fabd7f04c784949ae20ecd38a4edf4376148fdc605dceecbae8757d24b: Status 404 returned error can't find the container with id 93e407fabd7f04c784949ae20ecd38a4edf4376148fdc605dceecbae8757d24b Mar 18 12:27:16 crc kubenswrapper[4843]: I0318 12:27:16.494160 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6c9ffdb45c-wwbs4" event={"ID":"ab023c74-77a1-4169-8539-24dd12546dad","Type":"ContainerStarted","Data":"c8d283b8efad5ca4d97bda89396de62fe64a85b2855a2834ce9d8bc2ab0c4bbc"} Mar 18 12:27:16 crc kubenswrapper[4843]: I0318 12:27:16.495072 4843 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/metallb-operator-controller-manager-84c485cc8b-2wjb4" event={"ID":"eb52c05a-e620-4914-94e4-d485168aec35","Type":"ContainerStarted","Data":"93e407fabd7f04c784949ae20ecd38a4edf4376148fdc605dceecbae8757d24b"} Mar 18 12:27:19 crc kubenswrapper[4843]: I0318 12:27:19.519134 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84c485cc8b-2wjb4" event={"ID":"eb52c05a-e620-4914-94e4-d485168aec35","Type":"ContainerStarted","Data":"9f48f918d605c1643f2c29ebf7c903dce45c534e83097dc11a7da9bb99debffe"} Mar 18 12:27:19 crc kubenswrapper[4843]: I0318 12:27:19.550987 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-84c485cc8b-2wjb4" podStartSLOduration=1.761067454 podStartE2EDuration="4.55096332s" podCreationTimestamp="2026-03-18 12:27:15 +0000 UTC" firstStartedPulling="2026-03-18 12:27:16.300906787 +0000 UTC m=+1070.016732311" lastFinishedPulling="2026-03-18 12:27:19.090802643 +0000 UTC m=+1072.806628177" observedRunningTime="2026-03-18 12:27:19.538787664 +0000 UTC m=+1073.254613238" watchObservedRunningTime="2026-03-18 12:27:19.55096332 +0000 UTC m=+1073.266788844" Mar 18 12:27:20 crc kubenswrapper[4843]: I0318 12:27:20.035728 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:27:20 crc kubenswrapper[4843]: I0318 12:27:20.035819 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 
12:27:20 crc kubenswrapper[4843]: I0318 12:27:20.035892 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:27:20 crc kubenswrapper[4843]: I0318 12:27:20.037213 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"156751a099ebefa58e45dd19fa380fffeac977e92f6bd61d7c8b0b1be68aae80"} pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:27:20 crc kubenswrapper[4843]: I0318 12:27:20.037310 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" containerID="cri-o://156751a099ebefa58e45dd19fa380fffeac977e92f6bd61d7c8b0b1be68aae80" gracePeriod=600 Mar 18 12:27:20 crc kubenswrapper[4843]: I0318 12:27:20.526929 4843 generic.go:334] "Generic (PLEG): container finished" podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="156751a099ebefa58e45dd19fa380fffeac977e92f6bd61d7c8b0b1be68aae80" exitCode=0 Mar 18 12:27:20 crc kubenswrapper[4843]: I0318 12:27:20.527317 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"156751a099ebefa58e45dd19fa380fffeac977e92f6bd61d7c8b0b1be68aae80"} Mar 18 12:27:20 crc kubenswrapper[4843]: I0318 12:27:20.527475 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-84c485cc8b-2wjb4" Mar 18 12:27:20 crc kubenswrapper[4843]: I0318 12:27:20.527517 4843 scope.go:117] "RemoveContainer" 
containerID="900a6973f4fc33d51f048ffe75a20de2a300ad414b4d778db1160e26d9d43452" Mar 18 12:27:21 crc kubenswrapper[4843]: I0318 12:27:21.534461 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6c9ffdb45c-wwbs4" event={"ID":"ab023c74-77a1-4169-8539-24dd12546dad","Type":"ContainerStarted","Data":"0f9c0e16230e8ba7c12c44ae57cdf8ae345f04fd69b3a8e3bd9a950c6448a3ee"} Mar 18 12:27:21 crc kubenswrapper[4843]: I0318 12:27:21.535005 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6c9ffdb45c-wwbs4" Mar 18 12:27:21 crc kubenswrapper[4843]: I0318 12:27:21.536877 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"7fcc44fd473fc2d97d7be9aa8e61f5a92c58d2a0df082678596236e3adb17e3e"} Mar 18 12:27:21 crc kubenswrapper[4843]: I0318 12:27:21.551526 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6c9ffdb45c-wwbs4" podStartSLOduration=1.872812682 podStartE2EDuration="6.551504927s" podCreationTimestamp="2026-03-18 12:27:15 +0000 UTC" firstStartedPulling="2026-03-18 12:27:16.267329512 +0000 UTC m=+1069.983155036" lastFinishedPulling="2026-03-18 12:27:20.946021757 +0000 UTC m=+1074.661847281" observedRunningTime="2026-03-18 12:27:21.5512388 +0000 UTC m=+1075.267064324" watchObservedRunningTime="2026-03-18 12:27:21.551504927 +0000 UTC m=+1075.267330451" Mar 18 12:27:36 crc kubenswrapper[4843]: I0318 12:27:36.057838 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6c9ffdb45c-wwbs4" Mar 18 12:27:43 crc kubenswrapper[4843]: I0318 12:27:43.616598 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-btbr2"] Mar 18 12:27:43 
crc kubenswrapper[4843]: I0318 12:27:43.620917 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btbr2" Mar 18 12:27:43 crc kubenswrapper[4843]: I0318 12:27:43.650804 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-btbr2"] Mar 18 12:27:43 crc kubenswrapper[4843]: I0318 12:27:43.752875 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0-utilities\") pod \"redhat-marketplace-btbr2\" (UID: \"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0\") " pod="openshift-marketplace/redhat-marketplace-btbr2" Mar 18 12:27:43 crc kubenswrapper[4843]: I0318 12:27:43.752961 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0-catalog-content\") pod \"redhat-marketplace-btbr2\" (UID: \"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0\") " pod="openshift-marketplace/redhat-marketplace-btbr2" Mar 18 12:27:43 crc kubenswrapper[4843]: I0318 12:27:43.753436 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6dl9\" (UniqueName: \"kubernetes.io/projected/1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0-kube-api-access-n6dl9\") pod \"redhat-marketplace-btbr2\" (UID: \"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0\") " pod="openshift-marketplace/redhat-marketplace-btbr2" Mar 18 12:27:43 crc kubenswrapper[4843]: I0318 12:27:43.854693 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6dl9\" (UniqueName: \"kubernetes.io/projected/1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0-kube-api-access-n6dl9\") pod \"redhat-marketplace-btbr2\" (UID: \"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0\") " pod="openshift-marketplace/redhat-marketplace-btbr2" 
Mar 18 12:27:43 crc kubenswrapper[4843]: I0318 12:27:43.854743 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0-utilities\") pod \"redhat-marketplace-btbr2\" (UID: \"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0\") " pod="openshift-marketplace/redhat-marketplace-btbr2" Mar 18 12:27:43 crc kubenswrapper[4843]: I0318 12:27:43.854781 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0-catalog-content\") pod \"redhat-marketplace-btbr2\" (UID: \"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0\") " pod="openshift-marketplace/redhat-marketplace-btbr2" Mar 18 12:27:43 crc kubenswrapper[4843]: I0318 12:27:43.855312 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0-catalog-content\") pod \"redhat-marketplace-btbr2\" (UID: \"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0\") " pod="openshift-marketplace/redhat-marketplace-btbr2" Mar 18 12:27:43 crc kubenswrapper[4843]: I0318 12:27:43.855609 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0-utilities\") pod \"redhat-marketplace-btbr2\" (UID: \"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0\") " pod="openshift-marketplace/redhat-marketplace-btbr2" Mar 18 12:27:43 crc kubenswrapper[4843]: I0318 12:27:43.885166 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6dl9\" (UniqueName: \"kubernetes.io/projected/1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0-kube-api-access-n6dl9\") pod \"redhat-marketplace-btbr2\" (UID: \"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0\") " pod="openshift-marketplace/redhat-marketplace-btbr2" Mar 18 12:27:43 crc kubenswrapper[4843]: I0318 
12:27:43.948859 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btbr2" Mar 18 12:27:44 crc kubenswrapper[4843]: I0318 12:27:44.614676 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-btbr2"] Mar 18 12:27:44 crc kubenswrapper[4843]: W0318 12:27:44.621849 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ca2cc41_0eba_4d18_aa21_0143d2bbb5d0.slice/crio-1febb5266dc6bbdbbc3e07053fc9600b407e40f8fa743ae249a0f70cf297cd83 WatchSource:0}: Error finding container 1febb5266dc6bbdbbc3e07053fc9600b407e40f8fa743ae249a0f70cf297cd83: Status 404 returned error can't find the container with id 1febb5266dc6bbdbbc3e07053fc9600b407e40f8fa743ae249a0f70cf297cd83 Mar 18 12:27:44 crc kubenswrapper[4843]: I0318 12:27:44.800906 4843 generic.go:334] "Generic (PLEG): container finished" podID="1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0" containerID="19b16aa11ebaf8e249f740d2a9e8204cd81e85bbbfd4f1807d5ca6a14a08d7ae" exitCode=0 Mar 18 12:27:44 crc kubenswrapper[4843]: I0318 12:27:44.800950 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btbr2" event={"ID":"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0","Type":"ContainerDied","Data":"19b16aa11ebaf8e249f740d2a9e8204cd81e85bbbfd4f1807d5ca6a14a08d7ae"} Mar 18 12:27:44 crc kubenswrapper[4843]: I0318 12:27:44.800981 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btbr2" event={"ID":"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0","Type":"ContainerStarted","Data":"1febb5266dc6bbdbbc3e07053fc9600b407e40f8fa743ae249a0f70cf297cd83"} Mar 18 12:27:45 crc kubenswrapper[4843]: I0318 12:27:45.807996 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btbr2" 
event={"ID":"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0","Type":"ContainerStarted","Data":"6fbb898e7f5770536ef065cc9b5b9650437dac810b132efa41d6aa6dae2b3576"} Mar 18 12:27:46 crc kubenswrapper[4843]: I0318 12:27:46.819096 4843 generic.go:334] "Generic (PLEG): container finished" podID="1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0" containerID="6fbb898e7f5770536ef065cc9b5b9650437dac810b132efa41d6aa6dae2b3576" exitCode=0 Mar 18 12:27:46 crc kubenswrapper[4843]: I0318 12:27:46.819168 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btbr2" event={"ID":"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0","Type":"ContainerDied","Data":"6fbb898e7f5770536ef065cc9b5b9650437dac810b132efa41d6aa6dae2b3576"} Mar 18 12:27:47 crc kubenswrapper[4843]: I0318 12:27:47.830703 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btbr2" event={"ID":"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0","Type":"ContainerStarted","Data":"c0a23102cc424b0d493f96fd0ff7f7290109e27ec29f947d12144167fa1a3293"} Mar 18 12:27:47 crc kubenswrapper[4843]: I0318 12:27:47.857737 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-btbr2" podStartSLOduration=2.168385323 podStartE2EDuration="4.857718178s" podCreationTimestamp="2026-03-18 12:27:43 +0000 UTC" firstStartedPulling="2026-03-18 12:27:44.802442095 +0000 UTC m=+1098.518267639" lastFinishedPulling="2026-03-18 12:27:47.49177493 +0000 UTC m=+1101.207600494" observedRunningTime="2026-03-18 12:27:47.855222267 +0000 UTC m=+1101.571047801" watchObservedRunningTime="2026-03-18 12:27:47.857718178 +0000 UTC m=+1101.573543712" Mar 18 12:27:53 crc kubenswrapper[4843]: I0318 12:27:53.949741 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-btbr2" Mar 18 12:27:53 crc kubenswrapper[4843]: I0318 12:27:53.952073 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-btbr2" Mar 18 12:27:54 crc kubenswrapper[4843]: I0318 12:27:54.018381 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-btbr2" Mar 18 12:27:54 crc kubenswrapper[4843]: I0318 12:27:54.954345 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-btbr2" Mar 18 12:27:55 crc kubenswrapper[4843]: I0318 12:27:55.026882 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-btbr2"] Mar 18 12:27:55 crc kubenswrapper[4843]: I0318 12:27:55.832184 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-84c485cc8b-2wjb4" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.614888 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-x7lj9"] Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.636505 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-bn9hp"] Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.638591 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-bn9hp" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.638717 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.644478 4843 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.644809 4843 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.644818 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.645065 4843 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-nwr7v" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.658665 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-bn9hp"] Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.696908 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-k26nm"] Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.698226 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-k26nm" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.701580 4843 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.701695 4843 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.701820 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.701836 4843 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-htq9n" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.716236 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-54rq8"] Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.717118 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3dc20e76-907f-4852-91af-a114f966c97b-metrics\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.717172 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3dc20e76-907f-4852-91af-a114f966c97b-reloader\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.717205 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3dc20e76-907f-4852-91af-a114f966c97b-metrics-certs\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " 
pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.717231 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6925\" (UniqueName: \"kubernetes.io/projected/f29dedb4-cde5-4f7a-9710-9dd1de387482-kube-api-access-j6925\") pod \"frr-k8s-webhook-server-bcc4b6f68-bn9hp\" (UID: \"f29dedb4-cde5-4f7a-9710-9dd1de387482\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-bn9hp" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.717271 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vllf\" (UniqueName: \"kubernetes.io/projected/3dc20e76-907f-4852-91af-a114f966c97b-kube-api-access-8vllf\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.717291 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3dc20e76-907f-4852-91af-a114f966c97b-frr-conf\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.717316 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f29dedb4-cde5-4f7a-9710-9dd1de387482-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-bn9hp\" (UID: \"f29dedb4-cde5-4f7a-9710-9dd1de387482\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-bn9hp" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.717355 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3dc20e76-907f-4852-91af-a114f966c97b-frr-sockets\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") 
" pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.717378 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3dc20e76-907f-4852-91af-a114f966c97b-frr-startup\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.721466 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-54rq8" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.725001 4843 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.734802 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-54rq8"] Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.818267 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3dc20e76-907f-4852-91af-a114f966c97b-frr-sockets\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.818308 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3dc20e76-907f-4852-91af-a114f966c97b-frr-startup\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.818348 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e058656-5aa9-4672-97fd-fa0c64ff46b5-metrics-certs\") pod \"controller-7bb4cc7c98-54rq8\" (UID: 
\"7e058656-5aa9-4672-97fd-fa0c64ff46b5\") " pod="metallb-system/controller-7bb4cc7c98-54rq8" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.818375 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3dc20e76-907f-4852-91af-a114f966c97b-metrics\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.818400 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d989f9e2-655b-468f-9fbc-e65fecdd3303-memberlist\") pod \"speaker-k26nm\" (UID: \"d989f9e2-655b-468f-9fbc-e65fecdd3303\") " pod="metallb-system/speaker-k26nm" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.818417 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3dc20e76-907f-4852-91af-a114f966c97b-reloader\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.818613 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3dc20e76-907f-4852-91af-a114f966c97b-metrics-certs\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.818686 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6925\" (UniqueName: \"kubernetes.io/projected/f29dedb4-cde5-4f7a-9710-9dd1de387482-kube-api-access-j6925\") pod \"frr-k8s-webhook-server-bcc4b6f68-bn9hp\" (UID: \"f29dedb4-cde5-4f7a-9710-9dd1de387482\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-bn9hp" Mar 18 12:27:56 crc kubenswrapper[4843]: 
I0318 12:27:56.818726 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e058656-5aa9-4672-97fd-fa0c64ff46b5-cert\") pod \"controller-7bb4cc7c98-54rq8\" (UID: \"7e058656-5aa9-4672-97fd-fa0c64ff46b5\") " pod="metallb-system/controller-7bb4cc7c98-54rq8" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.818754 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d989f9e2-655b-468f-9fbc-e65fecdd3303-metallb-excludel2\") pod \"speaker-k26nm\" (UID: \"d989f9e2-655b-468f-9fbc-e65fecdd3303\") " pod="metallb-system/speaker-k26nm" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.818811 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vllf\" (UniqueName: \"kubernetes.io/projected/3dc20e76-907f-4852-91af-a114f966c97b-kube-api-access-8vllf\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.818849 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d989f9e2-655b-468f-9fbc-e65fecdd3303-metrics-certs\") pod \"speaker-k26nm\" (UID: \"d989f9e2-655b-468f-9fbc-e65fecdd3303\") " pod="metallb-system/speaker-k26nm" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.818823 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3dc20e76-907f-4852-91af-a114f966c97b-frr-sockets\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.818869 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-xglnb\" (UniqueName: \"kubernetes.io/projected/7e058656-5aa9-4672-97fd-fa0c64ff46b5-kube-api-access-xglnb\") pod \"controller-7bb4cc7c98-54rq8\" (UID: \"7e058656-5aa9-4672-97fd-fa0c64ff46b5\") " pod="metallb-system/controller-7bb4cc7c98-54rq8" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.818886 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3dc20e76-907f-4852-91af-a114f966c97b-frr-conf\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.818912 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3dc20e76-907f-4852-91af-a114f966c97b-metrics\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.818976 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f29dedb4-cde5-4f7a-9710-9dd1de387482-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-bn9hp\" (UID: \"f29dedb4-cde5-4f7a-9710-9dd1de387482\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-bn9hp" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.819012 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqzss\" (UniqueName: \"kubernetes.io/projected/d989f9e2-655b-468f-9fbc-e65fecdd3303-kube-api-access-tqzss\") pod \"speaker-k26nm\" (UID: \"d989f9e2-655b-468f-9fbc-e65fecdd3303\") " pod="metallb-system/speaker-k26nm" Mar 18 12:27:56 crc kubenswrapper[4843]: E0318 12:27:56.819127 4843 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 18 12:27:56 crc kubenswrapper[4843]: E0318 
12:27:56.819189 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f29dedb4-cde5-4f7a-9710-9dd1de387482-cert podName:f29dedb4-cde5-4f7a-9710-9dd1de387482 nodeName:}" failed. No retries permitted until 2026-03-18 12:27:57.319166926 +0000 UTC m=+1111.034992450 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f29dedb4-cde5-4f7a-9710-9dd1de387482-cert") pod "frr-k8s-webhook-server-bcc4b6f68-bn9hp" (UID: "f29dedb4-cde5-4f7a-9710-9dd1de387482") : secret "frr-k8s-webhook-server-cert" not found Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.819269 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3dc20e76-907f-4852-91af-a114f966c97b-frr-conf\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.819394 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3dc20e76-907f-4852-91af-a114f966c97b-reloader\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.819498 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3dc20e76-907f-4852-91af-a114f966c97b-frr-startup\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.828362 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3dc20e76-907f-4852-91af-a114f966c97b-metrics-certs\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc 
kubenswrapper[4843]: I0318 12:27:56.840397 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6925\" (UniqueName: \"kubernetes.io/projected/f29dedb4-cde5-4f7a-9710-9dd1de387482-kube-api-access-j6925\") pod \"frr-k8s-webhook-server-bcc4b6f68-bn9hp\" (UID: \"f29dedb4-cde5-4f7a-9710-9dd1de387482\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-bn9hp" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.843626 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vllf\" (UniqueName: \"kubernetes.io/projected/3dc20e76-907f-4852-91af-a114f966c97b-kube-api-access-8vllf\") pod \"frr-k8s-x7lj9\" (UID: \"3dc20e76-907f-4852-91af-a114f966c97b\") " pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.889815 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-btbr2" podUID="1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0" containerName="registry-server" containerID="cri-o://c0a23102cc424b0d493f96fd0ff7f7290109e27ec29f947d12144167fa1a3293" gracePeriod=2 Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.920488 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d989f9e2-655b-468f-9fbc-e65fecdd3303-metrics-certs\") pod \"speaker-k26nm\" (UID: \"d989f9e2-655b-468f-9fbc-e65fecdd3303\") " pod="metallb-system/speaker-k26nm" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.920534 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xglnb\" (UniqueName: \"kubernetes.io/projected/7e058656-5aa9-4672-97fd-fa0c64ff46b5-kube-api-access-xglnb\") pod \"controller-7bb4cc7c98-54rq8\" (UID: \"7e058656-5aa9-4672-97fd-fa0c64ff46b5\") " pod="metallb-system/controller-7bb4cc7c98-54rq8" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.920573 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tqzss\" (UniqueName: \"kubernetes.io/projected/d989f9e2-655b-468f-9fbc-e65fecdd3303-kube-api-access-tqzss\") pod \"speaker-k26nm\" (UID: \"d989f9e2-655b-468f-9fbc-e65fecdd3303\") " pod="metallb-system/speaker-k26nm" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.920625 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e058656-5aa9-4672-97fd-fa0c64ff46b5-metrics-certs\") pod \"controller-7bb4cc7c98-54rq8\" (UID: \"7e058656-5aa9-4672-97fd-fa0c64ff46b5\") " pod="metallb-system/controller-7bb4cc7c98-54rq8" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.920740 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d989f9e2-655b-468f-9fbc-e65fecdd3303-memberlist\") pod \"speaker-k26nm\" (UID: \"d989f9e2-655b-468f-9fbc-e65fecdd3303\") " pod="metallb-system/speaker-k26nm" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.920783 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e058656-5aa9-4672-97fd-fa0c64ff46b5-cert\") pod \"controller-7bb4cc7c98-54rq8\" (UID: \"7e058656-5aa9-4672-97fd-fa0c64ff46b5\") " pod="metallb-system/controller-7bb4cc7c98-54rq8" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.920804 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d989f9e2-655b-468f-9fbc-e65fecdd3303-metallb-excludel2\") pod \"speaker-k26nm\" (UID: \"d989f9e2-655b-468f-9fbc-e65fecdd3303\") " pod="metallb-system/speaker-k26nm" Mar 18 12:27:56 crc kubenswrapper[4843]: E0318 12:27:56.920836 4843 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 12:27:56 crc kubenswrapper[4843]: E0318 
12:27:56.920913 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d989f9e2-655b-468f-9fbc-e65fecdd3303-memberlist podName:d989f9e2-655b-468f-9fbc-e65fecdd3303 nodeName:}" failed. No retries permitted until 2026-03-18 12:27:57.420891211 +0000 UTC m=+1111.136716735 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d989f9e2-655b-468f-9fbc-e65fecdd3303-memberlist") pod "speaker-k26nm" (UID: "d989f9e2-655b-468f-9fbc-e65fecdd3303") : secret "metallb-memberlist" not found Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.921725 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d989f9e2-655b-468f-9fbc-e65fecdd3303-metallb-excludel2\") pod \"speaker-k26nm\" (UID: \"d989f9e2-655b-468f-9fbc-e65fecdd3303\") " pod="metallb-system/speaker-k26nm" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.925628 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d989f9e2-655b-468f-9fbc-e65fecdd3303-metrics-certs\") pod \"speaker-k26nm\" (UID: \"d989f9e2-655b-468f-9fbc-e65fecdd3303\") " pod="metallb-system/speaker-k26nm" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.927067 4843 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.927463 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e058656-5aa9-4672-97fd-fa0c64ff46b5-metrics-certs\") pod \"controller-7bb4cc7c98-54rq8\" (UID: \"7e058656-5aa9-4672-97fd-fa0c64ff46b5\") " pod="metallb-system/controller-7bb4cc7c98-54rq8" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.934147 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/7e058656-5aa9-4672-97fd-fa0c64ff46b5-cert\") pod \"controller-7bb4cc7c98-54rq8\" (UID: \"7e058656-5aa9-4672-97fd-fa0c64ff46b5\") " pod="metallb-system/controller-7bb4cc7c98-54rq8" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.941694 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqzss\" (UniqueName: \"kubernetes.io/projected/d989f9e2-655b-468f-9fbc-e65fecdd3303-kube-api-access-tqzss\") pod \"speaker-k26nm\" (UID: \"d989f9e2-655b-468f-9fbc-e65fecdd3303\") " pod="metallb-system/speaker-k26nm" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.943275 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xglnb\" (UniqueName: \"kubernetes.io/projected/7e058656-5aa9-4672-97fd-fa0c64ff46b5-kube-api-access-xglnb\") pod \"controller-7bb4cc7c98-54rq8\" (UID: \"7e058656-5aa9-4672-97fd-fa0c64ff46b5\") " pod="metallb-system/controller-7bb4cc7c98-54rq8" Mar 18 12:27:56 crc kubenswrapper[4843]: I0318 12:27:56.977266 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.097302 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-54rq8" Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.227977 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btbr2" Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.325843 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0-catalog-content\") pod \"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0\" (UID: \"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0\") " Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.326198 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6dl9\" (UniqueName: \"kubernetes.io/projected/1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0-kube-api-access-n6dl9\") pod \"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0\" (UID: \"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0\") " Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.326226 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0-utilities\") pod \"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0\" (UID: \"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0\") " Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.326517 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f29dedb4-cde5-4f7a-9710-9dd1de387482-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-bn9hp\" (UID: \"f29dedb4-cde5-4f7a-9710-9dd1de387482\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-bn9hp" Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.327071 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0-utilities" (OuterVolumeSpecName: "utilities") pod "1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0" (UID: "1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.332237 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0-kube-api-access-n6dl9" (OuterVolumeSpecName: "kube-api-access-n6dl9") pod "1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0" (UID: "1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0"). InnerVolumeSpecName "kube-api-access-n6dl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.333183 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f29dedb4-cde5-4f7a-9710-9dd1de387482-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-bn9hp\" (UID: \"f29dedb4-cde5-4f7a-9710-9dd1de387482\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-bn9hp" Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.367611 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0" (UID: "1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.428056 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d989f9e2-655b-468f-9fbc-e65fecdd3303-memberlist\") pod \"speaker-k26nm\" (UID: \"d989f9e2-655b-468f-9fbc-e65fecdd3303\") " pod="metallb-system/speaker-k26nm" Mar 18 12:27:57 crc kubenswrapper[4843]: E0318 12:27:57.428349 4843 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.428365 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6dl9\" (UniqueName: \"kubernetes.io/projected/1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0-kube-api-access-n6dl9\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:57 crc kubenswrapper[4843]: E0318 12:27:57.428472 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d989f9e2-655b-468f-9fbc-e65fecdd3303-memberlist podName:d989f9e2-655b-468f-9fbc-e65fecdd3303 nodeName:}" failed. No retries permitted until 2026-03-18 12:27:58.428443853 +0000 UTC m=+1112.144269407 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d989f9e2-655b-468f-9fbc-e65fecdd3303-memberlist") pod "speaker-k26nm" (UID: "d989f9e2-655b-468f-9fbc-e65fecdd3303") : secret "metallb-memberlist" not found Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.428515 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.428540 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.531571 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-54rq8"] Mar 18 12:27:57 crc kubenswrapper[4843]: W0318 12:27:57.543149 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e058656_5aa9_4672_97fd_fa0c64ff46b5.slice/crio-27d1cba948aba4a448d72c875675a89f05f6070884ef20a10ed1e9d739750276 WatchSource:0}: Error finding container 27d1cba948aba4a448d72c875675a89f05f6070884ef20a10ed1e9d739750276: Status 404 returned error can't find the container with id 27d1cba948aba4a448d72c875675a89f05f6070884ef20a10ed1e9d739750276 Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.567392 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-bn9hp" Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.942897 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x7lj9" event={"ID":"3dc20e76-907f-4852-91af-a114f966c97b","Type":"ContainerStarted","Data":"e84812c26f7099e3b6ded184ddb25b3863e05063b0e5eaf0b6f98da1ec3020c1"} Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.947663 4843 generic.go:334] "Generic (PLEG): container finished" podID="1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0" containerID="c0a23102cc424b0d493f96fd0ff7f7290109e27ec29f947d12144167fa1a3293" exitCode=0 Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.947750 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btbr2" event={"ID":"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0","Type":"ContainerDied","Data":"c0a23102cc424b0d493f96fd0ff7f7290109e27ec29f947d12144167fa1a3293"} Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.947763 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btbr2" Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.947784 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btbr2" event={"ID":"1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0","Type":"ContainerDied","Data":"1febb5266dc6bbdbbc3e07053fc9600b407e40f8fa743ae249a0f70cf297cd83"} Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.947811 4843 scope.go:117] "RemoveContainer" containerID="c0a23102cc424b0d493f96fd0ff7f7290109e27ec29f947d12144167fa1a3293" Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.954155 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-54rq8" event={"ID":"7e058656-5aa9-4672-97fd-fa0c64ff46b5","Type":"ContainerStarted","Data":"f4f210d3493086c11f28a1c92523d2fb5777a680ab3449d2044a494a05360e23"} Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.954183 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-54rq8" event={"ID":"7e058656-5aa9-4672-97fd-fa0c64ff46b5","Type":"ContainerStarted","Data":"27d1cba948aba4a448d72c875675a89f05f6070884ef20a10ed1e9d739750276"} Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.968120 4843 scope.go:117] "RemoveContainer" containerID="6fbb898e7f5770536ef065cc9b5b9650437dac810b132efa41d6aa6dae2b3576" Mar 18 12:27:57 crc kubenswrapper[4843]: I0318 12:27:57.996823 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-btbr2"] Mar 18 12:27:58 crc kubenswrapper[4843]: I0318 12:27:58.000416 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-btbr2"] Mar 18 12:27:58 crc kubenswrapper[4843]: I0318 12:27:58.021743 4843 scope.go:117] "RemoveContainer" containerID="19b16aa11ebaf8e249f740d2a9e8204cd81e85bbbfd4f1807d5ca6a14a08d7ae" Mar 18 12:27:58 crc kubenswrapper[4843]: I0318 12:27:58.075487 4843 scope.go:117] 
"RemoveContainer" containerID="c0a23102cc424b0d493f96fd0ff7f7290109e27ec29f947d12144167fa1a3293" Mar 18 12:27:58 crc kubenswrapper[4843]: E0318 12:27:58.075995 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0a23102cc424b0d493f96fd0ff7f7290109e27ec29f947d12144167fa1a3293\": container with ID starting with c0a23102cc424b0d493f96fd0ff7f7290109e27ec29f947d12144167fa1a3293 not found: ID does not exist" containerID="c0a23102cc424b0d493f96fd0ff7f7290109e27ec29f947d12144167fa1a3293" Mar 18 12:27:58 crc kubenswrapper[4843]: I0318 12:27:58.076112 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0a23102cc424b0d493f96fd0ff7f7290109e27ec29f947d12144167fa1a3293"} err="failed to get container status \"c0a23102cc424b0d493f96fd0ff7f7290109e27ec29f947d12144167fa1a3293\": rpc error: code = NotFound desc = could not find container \"c0a23102cc424b0d493f96fd0ff7f7290109e27ec29f947d12144167fa1a3293\": container with ID starting with c0a23102cc424b0d493f96fd0ff7f7290109e27ec29f947d12144167fa1a3293 not found: ID does not exist" Mar 18 12:27:58 crc kubenswrapper[4843]: I0318 12:27:58.076238 4843 scope.go:117] "RemoveContainer" containerID="6fbb898e7f5770536ef065cc9b5b9650437dac810b132efa41d6aa6dae2b3576" Mar 18 12:27:58 crc kubenswrapper[4843]: E0318 12:27:58.077942 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fbb898e7f5770536ef065cc9b5b9650437dac810b132efa41d6aa6dae2b3576\": container with ID starting with 6fbb898e7f5770536ef065cc9b5b9650437dac810b132efa41d6aa6dae2b3576 not found: ID does not exist" containerID="6fbb898e7f5770536ef065cc9b5b9650437dac810b132efa41d6aa6dae2b3576" Mar 18 12:27:58 crc kubenswrapper[4843]: I0318 12:27:58.077983 4843 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6fbb898e7f5770536ef065cc9b5b9650437dac810b132efa41d6aa6dae2b3576"} err="failed to get container status \"6fbb898e7f5770536ef065cc9b5b9650437dac810b132efa41d6aa6dae2b3576\": rpc error: code = NotFound desc = could not find container \"6fbb898e7f5770536ef065cc9b5b9650437dac810b132efa41d6aa6dae2b3576\": container with ID starting with 6fbb898e7f5770536ef065cc9b5b9650437dac810b132efa41d6aa6dae2b3576 not found: ID does not exist" Mar 18 12:27:58 crc kubenswrapper[4843]: I0318 12:27:58.078007 4843 scope.go:117] "RemoveContainer" containerID="19b16aa11ebaf8e249f740d2a9e8204cd81e85bbbfd4f1807d5ca6a14a08d7ae" Mar 18 12:27:58 crc kubenswrapper[4843]: E0318 12:27:58.081927 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19b16aa11ebaf8e249f740d2a9e8204cd81e85bbbfd4f1807d5ca6a14a08d7ae\": container with ID starting with 19b16aa11ebaf8e249f740d2a9e8204cd81e85bbbfd4f1807d5ca6a14a08d7ae not found: ID does not exist" containerID="19b16aa11ebaf8e249f740d2a9e8204cd81e85bbbfd4f1807d5ca6a14a08d7ae" Mar 18 12:27:58 crc kubenswrapper[4843]: I0318 12:27:58.081967 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19b16aa11ebaf8e249f740d2a9e8204cd81e85bbbfd4f1807d5ca6a14a08d7ae"} err="failed to get container status \"19b16aa11ebaf8e249f740d2a9e8204cd81e85bbbfd4f1807d5ca6a14a08d7ae\": rpc error: code = NotFound desc = could not find container \"19b16aa11ebaf8e249f740d2a9e8204cd81e85bbbfd4f1807d5ca6a14a08d7ae\": container with ID starting with 19b16aa11ebaf8e249f740d2a9e8204cd81e85bbbfd4f1807d5ca6a14a08d7ae not found: ID does not exist" Mar 18 12:27:58 crc kubenswrapper[4843]: I0318 12:27:58.166446 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-bn9hp"] Mar 18 12:27:58 crc kubenswrapper[4843]: I0318 12:27:58.446386 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d989f9e2-655b-468f-9fbc-e65fecdd3303-memberlist\") pod \"speaker-k26nm\" (UID: \"d989f9e2-655b-468f-9fbc-e65fecdd3303\") " pod="metallb-system/speaker-k26nm" Mar 18 12:27:58 crc kubenswrapper[4843]: I0318 12:27:58.451869 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d989f9e2-655b-468f-9fbc-e65fecdd3303-memberlist\") pod \"speaker-k26nm\" (UID: \"d989f9e2-655b-468f-9fbc-e65fecdd3303\") " pod="metallb-system/speaker-k26nm" Mar 18 12:27:58 crc kubenswrapper[4843]: I0318 12:27:58.512577 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-k26nm" Mar 18 12:27:58 crc kubenswrapper[4843]: W0318 12:27:58.544939 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd989f9e2_655b_468f_9fbc_e65fecdd3303.slice/crio-999794ca03642b885702125d08c834881051c02897f26277fa2db7b6228b3c1d WatchSource:0}: Error finding container 999794ca03642b885702125d08c834881051c02897f26277fa2db7b6228b3c1d: Status 404 returned error can't find the container with id 999794ca03642b885702125d08c834881051c02897f26277fa2db7b6228b3c1d Mar 18 12:27:58 crc kubenswrapper[4843]: I0318 12:27:58.967392 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-bn9hp" event={"ID":"f29dedb4-cde5-4f7a-9710-9dd1de387482","Type":"ContainerStarted","Data":"34f4e89fa1554375a2a5d1fd0b2133dd09914bdd6c456e9375a0589f8d76cb26"} Mar 18 12:27:58 crc kubenswrapper[4843]: I0318 12:27:58.973586 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-k26nm" event={"ID":"d989f9e2-655b-468f-9fbc-e65fecdd3303","Type":"ContainerStarted","Data":"14364c6d040c8ad190f41b5eb36cbf634ebea066da3cb3b4fb8d6843372133af"} Mar 18 12:27:58 crc kubenswrapper[4843]: I0318 
12:27:58.973634 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-k26nm" event={"ID":"d989f9e2-655b-468f-9fbc-e65fecdd3303","Type":"ContainerStarted","Data":"999794ca03642b885702125d08c834881051c02897f26277fa2db7b6228b3c1d"} Mar 18 12:27:58 crc kubenswrapper[4843]: I0318 12:27:58.976415 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-54rq8" event={"ID":"7e058656-5aa9-4672-97fd-fa0c64ff46b5","Type":"ContainerStarted","Data":"0b4e633961e13e2461df346f16103966bbba6ae07b475c828a8ce50f4653efc3"} Mar 18 12:27:58 crc kubenswrapper[4843]: I0318 12:27:58.976543 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-54rq8" Mar 18 12:27:59 crc kubenswrapper[4843]: I0318 12:27:59.008927 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-54rq8" podStartSLOduration=3.008912523 podStartE2EDuration="3.008912523s" podCreationTimestamp="2026-03-18 12:27:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:27:59.006162915 +0000 UTC m=+1112.721988439" watchObservedRunningTime="2026-03-18 12:27:59.008912523 +0000 UTC m=+1112.724738047" Mar 18 12:27:59 crc kubenswrapper[4843]: I0318 12:27:59.023362 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0" path="/var/lib/kubelet/pods/1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0/volumes" Mar 18 12:28:00 crc kubenswrapper[4843]: I0318 12:28:00.093665 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-k26nm" event={"ID":"d989f9e2-655b-468f-9fbc-e65fecdd3303","Type":"ContainerStarted","Data":"1104abe1ac5a01a8667d9c1664e5f72f1e330afbef8acc73c6d4b9557d477cb1"} Mar 18 12:28:00 crc kubenswrapper[4843]: I0318 12:28:00.120394 4843 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="metallb-system/speaker-k26nm" podStartSLOduration=4.1203756 podStartE2EDuration="4.1203756s" podCreationTimestamp="2026-03-18 12:27:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:28:00.119739742 +0000 UTC m=+1113.835565266" watchObservedRunningTime="2026-03-18 12:28:00.1203756 +0000 UTC m=+1113.836201124" Mar 18 12:28:00 crc kubenswrapper[4843]: I0318 12:28:00.131919 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563948-mh955"] Mar 18 12:28:00 crc kubenswrapper[4843]: E0318 12:28:00.132197 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0" containerName="extract-content" Mar 18 12:28:00 crc kubenswrapper[4843]: I0318 12:28:00.132221 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0" containerName="extract-content" Mar 18 12:28:00 crc kubenswrapper[4843]: E0318 12:28:00.132234 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0" containerName="registry-server" Mar 18 12:28:00 crc kubenswrapper[4843]: I0318 12:28:00.132242 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0" containerName="registry-server" Mar 18 12:28:00 crc kubenswrapper[4843]: E0318 12:28:00.132256 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0" containerName="extract-utilities" Mar 18 12:28:00 crc kubenswrapper[4843]: I0318 12:28:00.132264 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0" containerName="extract-utilities" Mar 18 12:28:00 crc kubenswrapper[4843]: I0318 12:28:00.132424 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca2cc41-0eba-4d18-aa21-0143d2bbb5d0" 
containerName="registry-server" Mar 18 12:28:00 crc kubenswrapper[4843]: I0318 12:28:00.132922 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563948-mh955" Mar 18 12:28:00 crc kubenswrapper[4843]: I0318 12:28:00.137056 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:28:00 crc kubenswrapper[4843]: I0318 12:28:00.137637 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:28:00 crc kubenswrapper[4843]: I0318 12:28:00.137899 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 12:28:00 crc kubenswrapper[4843]: I0318 12:28:00.149712 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563948-mh955"] Mar 18 12:28:00 crc kubenswrapper[4843]: I0318 12:28:00.174034 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlwns\" (UniqueName: \"kubernetes.io/projected/c6ff159c-ced7-4e9d-8aac-a6789348ca55-kube-api-access-xlwns\") pod \"auto-csr-approver-29563948-mh955\" (UID: \"c6ff159c-ced7-4e9d-8aac-a6789348ca55\") " pod="openshift-infra/auto-csr-approver-29563948-mh955" Mar 18 12:28:00 crc kubenswrapper[4843]: I0318 12:28:00.275596 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlwns\" (UniqueName: \"kubernetes.io/projected/c6ff159c-ced7-4e9d-8aac-a6789348ca55-kube-api-access-xlwns\") pod \"auto-csr-approver-29563948-mh955\" (UID: \"c6ff159c-ced7-4e9d-8aac-a6789348ca55\") " pod="openshift-infra/auto-csr-approver-29563948-mh955" Mar 18 12:28:00 crc kubenswrapper[4843]: I0318 12:28:00.304211 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlwns\" (UniqueName: 
\"kubernetes.io/projected/c6ff159c-ced7-4e9d-8aac-a6789348ca55-kube-api-access-xlwns\") pod \"auto-csr-approver-29563948-mh955\" (UID: \"c6ff159c-ced7-4e9d-8aac-a6789348ca55\") " pod="openshift-infra/auto-csr-approver-29563948-mh955" Mar 18 12:28:00 crc kubenswrapper[4843]: I0318 12:28:00.449625 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563948-mh955" Mar 18 12:28:00 crc kubenswrapper[4843]: I0318 12:28:00.731715 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563948-mh955"] Mar 18 12:28:01 crc kubenswrapper[4843]: I0318 12:28:01.101473 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563948-mh955" event={"ID":"c6ff159c-ced7-4e9d-8aac-a6789348ca55","Type":"ContainerStarted","Data":"aef2c42d25c559c61eea341b91303152dca5f5d62ad1dcf48adc0b8bb5b814fd"} Mar 18 12:28:01 crc kubenswrapper[4843]: I0318 12:28:01.101599 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-k26nm" Mar 18 12:28:03 crc kubenswrapper[4843]: I0318 12:28:03.120959 4843 generic.go:334] "Generic (PLEG): container finished" podID="c6ff159c-ced7-4e9d-8aac-a6789348ca55" containerID="659087b443798bf81f6607e29bfccfe43ff01fd2628d51de2aa86bb5afbc54f4" exitCode=0 Mar 18 12:28:03 crc kubenswrapper[4843]: I0318 12:28:03.121362 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563948-mh955" event={"ID":"c6ff159c-ced7-4e9d-8aac-a6789348ca55","Type":"ContainerDied","Data":"659087b443798bf81f6607e29bfccfe43ff01fd2628d51de2aa86bb5afbc54f4"} Mar 18 12:28:05 crc kubenswrapper[4843]: I0318 12:28:05.221916 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563948-mh955" Mar 18 12:28:05 crc kubenswrapper[4843]: I0318 12:28:05.275371 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlwns\" (UniqueName: \"kubernetes.io/projected/c6ff159c-ced7-4e9d-8aac-a6789348ca55-kube-api-access-xlwns\") pod \"c6ff159c-ced7-4e9d-8aac-a6789348ca55\" (UID: \"c6ff159c-ced7-4e9d-8aac-a6789348ca55\") " Mar 18 12:28:05 crc kubenswrapper[4843]: I0318 12:28:05.284763 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ff159c-ced7-4e9d-8aac-a6789348ca55-kube-api-access-xlwns" (OuterVolumeSpecName: "kube-api-access-xlwns") pod "c6ff159c-ced7-4e9d-8aac-a6789348ca55" (UID: "c6ff159c-ced7-4e9d-8aac-a6789348ca55"). InnerVolumeSpecName "kube-api-access-xlwns". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:28:05 crc kubenswrapper[4843]: I0318 12:28:05.377264 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlwns\" (UniqueName: \"kubernetes.io/projected/c6ff159c-ced7-4e9d-8aac-a6789348ca55-kube-api-access-xlwns\") on node \"crc\" DevicePath \"\"" Mar 18 12:28:06 crc kubenswrapper[4843]: I0318 12:28:06.190349 4843 generic.go:334] "Generic (PLEG): container finished" podID="3dc20e76-907f-4852-91af-a114f966c97b" containerID="2d3b7c0c108aa0c79ad66be59cc14aa17f6486ddb77af27495c3aecddf561196" exitCode=0 Mar 18 12:28:06 crc kubenswrapper[4843]: I0318 12:28:06.190710 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x7lj9" event={"ID":"3dc20e76-907f-4852-91af-a114f966c97b","Type":"ContainerDied","Data":"2d3b7c0c108aa0c79ad66be59cc14aa17f6486ddb77af27495c3aecddf561196"} Mar 18 12:28:06 crc kubenswrapper[4843]: I0318 12:28:06.192757 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563948-mh955" 
event={"ID":"c6ff159c-ced7-4e9d-8aac-a6789348ca55","Type":"ContainerDied","Data":"aef2c42d25c559c61eea341b91303152dca5f5d62ad1dcf48adc0b8bb5b814fd"} Mar 18 12:28:06 crc kubenswrapper[4843]: I0318 12:28:06.192821 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aef2c42d25c559c61eea341b91303152dca5f5d62ad1dcf48adc0b8bb5b814fd" Mar 18 12:28:06 crc kubenswrapper[4843]: I0318 12:28:06.192776 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563948-mh955" Mar 18 12:28:06 crc kubenswrapper[4843]: I0318 12:28:06.196515 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-bn9hp" event={"ID":"f29dedb4-cde5-4f7a-9710-9dd1de387482","Type":"ContainerStarted","Data":"f8e84658a0ff7b9fec02582300c2657f05aee90bce23190c0bbf3d5abf8d1717"} Mar 18 12:28:06 crc kubenswrapper[4843]: I0318 12:28:06.196912 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-bn9hp" Mar 18 12:28:06 crc kubenswrapper[4843]: I0318 12:28:06.234466 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-bn9hp" podStartSLOduration=3.122895787 podStartE2EDuration="10.234445682s" podCreationTimestamp="2026-03-18 12:27:56 +0000 UTC" firstStartedPulling="2026-03-18 12:27:58.175924041 +0000 UTC m=+1111.891749565" lastFinishedPulling="2026-03-18 12:28:05.287473936 +0000 UTC m=+1119.003299460" observedRunningTime="2026-03-18 12:28:06.232919008 +0000 UTC m=+1119.948744532" watchObservedRunningTime="2026-03-18 12:28:06.234445682 +0000 UTC m=+1119.950271206" Mar 18 12:28:06 crc kubenswrapper[4843]: I0318 12:28:06.274175 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563942-sgbqf"] Mar 18 12:28:06 crc kubenswrapper[4843]: I0318 12:28:06.278089 4843 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563942-sgbqf"] Mar 18 12:28:06 crc kubenswrapper[4843]: I0318 12:28:06.993615 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7e759d3-8475-46c1-b8ac-7cdca013d031" path="/var/lib/kubelet/pods/a7e759d3-8475-46c1-b8ac-7cdca013d031/volumes" Mar 18 12:28:07 crc kubenswrapper[4843]: I0318 12:28:07.103001 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-54rq8" Mar 18 12:28:07 crc kubenswrapper[4843]: I0318 12:28:07.207818 4843 generic.go:334] "Generic (PLEG): container finished" podID="3dc20e76-907f-4852-91af-a114f966c97b" containerID="416bede0f03e9a4eeb33b65283c9aebd323a981f1d1082b636ff325ae162428c" exitCode=0 Mar 18 12:28:07 crc kubenswrapper[4843]: I0318 12:28:07.208962 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x7lj9" event={"ID":"3dc20e76-907f-4852-91af-a114f966c97b","Type":"ContainerDied","Data":"416bede0f03e9a4eeb33b65283c9aebd323a981f1d1082b636ff325ae162428c"} Mar 18 12:28:08 crc kubenswrapper[4843]: I0318 12:28:08.220773 4843 generic.go:334] "Generic (PLEG): container finished" podID="3dc20e76-907f-4852-91af-a114f966c97b" containerID="27dfa1372a01ef53ef14fa06880511925ac5fc9fbb82cd2cdd60d786feb28227" exitCode=0 Mar 18 12:28:08 crc kubenswrapper[4843]: I0318 12:28:08.220982 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x7lj9" event={"ID":"3dc20e76-907f-4852-91af-a114f966c97b","Type":"ContainerDied","Data":"27dfa1372a01ef53ef14fa06880511925ac5fc9fbb82cd2cdd60d786feb28227"} Mar 18 12:28:08 crc kubenswrapper[4843]: I0318 12:28:08.518053 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-k26nm" Mar 18 12:28:09 crc kubenswrapper[4843]: I0318 12:28:09.260704 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x7lj9" 
event={"ID":"3dc20e76-907f-4852-91af-a114f966c97b","Type":"ContainerStarted","Data":"608261193a852522ff3c500a5559ee465cf64b6acb38aed61a42c425c3bfcafc"} Mar 18 12:28:09 crc kubenswrapper[4843]: I0318 12:28:09.260972 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x7lj9" event={"ID":"3dc20e76-907f-4852-91af-a114f966c97b","Type":"ContainerStarted","Data":"fdf9aae3ad72360eeebea0fd7932793b346f9ebc671e8ae6b9e35b10eeb0982a"} Mar 18 12:28:09 crc kubenswrapper[4843]: I0318 12:28:09.260982 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x7lj9" event={"ID":"3dc20e76-907f-4852-91af-a114f966c97b","Type":"ContainerStarted","Data":"0eb180533a6660ffc9ba6ea019dd058c772389826bbfebe806df7ecd2c5f6a18"} Mar 18 12:28:09 crc kubenswrapper[4843]: I0318 12:28:09.260990 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x7lj9" event={"ID":"3dc20e76-907f-4852-91af-a114f966c97b","Type":"ContainerStarted","Data":"5f594663cc4f9b25727d1fb0681f3638dbdb73f3a169012369868717451cc43b"} Mar 18 12:28:10 crc kubenswrapper[4843]: I0318 12:28:10.275605 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x7lj9" event={"ID":"3dc20e76-907f-4852-91af-a114f966c97b","Type":"ContainerStarted","Data":"18b6dac5b28647970129e36803eafe78e6fcdc90910b5895fb6a35acb73dc485"} Mar 18 12:28:11 crc kubenswrapper[4843]: I0318 12:28:11.290755 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x7lj9" event={"ID":"3dc20e76-907f-4852-91af-a114f966c97b","Type":"ContainerStarted","Data":"a4a6edd8f426d431cc3c079eb3e9e6d9d303f2e4dfb493dd7d5a33fdab468182"} Mar 18 12:28:11 crc kubenswrapper[4843]: I0318 12:28:11.290948 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:28:11 crc kubenswrapper[4843]: I0318 12:28:11.322022 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/frr-k8s-x7lj9" podStartSLOduration=7.238940827 podStartE2EDuration="15.322000395s" podCreationTimestamp="2026-03-18 12:27:56 +0000 UTC" firstStartedPulling="2026-03-18 12:27:57.150457403 +0000 UTC m=+1110.866282927" lastFinishedPulling="2026-03-18 12:28:05.233516971 +0000 UTC m=+1118.949342495" observedRunningTime="2026-03-18 12:28:11.319445872 +0000 UTC m=+1125.035271406" watchObservedRunningTime="2026-03-18 12:28:11.322000395 +0000 UTC m=+1125.037825929" Mar 18 12:28:11 crc kubenswrapper[4843]: I0318 12:28:11.559922 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-2nxpx"] Mar 18 12:28:11 crc kubenswrapper[4843]: E0318 12:28:11.560504 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ff159c-ced7-4e9d-8aac-a6789348ca55" containerName="oc" Mar 18 12:28:11 crc kubenswrapper[4843]: I0318 12:28:11.560529 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ff159c-ced7-4e9d-8aac-a6789348ca55" containerName="oc" Mar 18 12:28:11 crc kubenswrapper[4843]: I0318 12:28:11.560703 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6ff159c-ced7-4e9d-8aac-a6789348ca55" containerName="oc" Mar 18 12:28:11 crc kubenswrapper[4843]: I0318 12:28:11.561208 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2nxpx" Mar 18 12:28:11 crc kubenswrapper[4843]: I0318 12:28:11.563552 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 18 12:28:11 crc kubenswrapper[4843]: I0318 12:28:11.563590 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 18 12:28:11 crc kubenswrapper[4843]: I0318 12:28:11.563860 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-8bg2n" Mar 18 12:28:11 crc kubenswrapper[4843]: I0318 12:28:11.578605 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2nxpx"] Mar 18 12:28:11 crc kubenswrapper[4843]: I0318 12:28:11.671929 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvg7r\" (UniqueName: \"kubernetes.io/projected/c3a3a5a5-d99e-4c14-98b0-3e84695b91c9-kube-api-access-rvg7r\") pod \"openstack-operator-index-2nxpx\" (UID: \"c3a3a5a5-d99e-4c14-98b0-3e84695b91c9\") " pod="openstack-operators/openstack-operator-index-2nxpx" Mar 18 12:28:11 crc kubenswrapper[4843]: I0318 12:28:11.772840 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvg7r\" (UniqueName: \"kubernetes.io/projected/c3a3a5a5-d99e-4c14-98b0-3e84695b91c9-kube-api-access-rvg7r\") pod \"openstack-operator-index-2nxpx\" (UID: \"c3a3a5a5-d99e-4c14-98b0-3e84695b91c9\") " pod="openstack-operators/openstack-operator-index-2nxpx" Mar 18 12:28:11 crc kubenswrapper[4843]: I0318 12:28:11.791141 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvg7r\" (UniqueName: \"kubernetes.io/projected/c3a3a5a5-d99e-4c14-98b0-3e84695b91c9-kube-api-access-rvg7r\") pod \"openstack-operator-index-2nxpx\" (UID: 
\"c3a3a5a5-d99e-4c14-98b0-3e84695b91c9\") " pod="openstack-operators/openstack-operator-index-2nxpx" Mar 18 12:28:11 crc kubenswrapper[4843]: I0318 12:28:11.882922 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2nxpx" Mar 18 12:28:11 crc kubenswrapper[4843]: I0318 12:28:11.977950 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:28:12 crc kubenswrapper[4843]: I0318 12:28:12.016947 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:28:12 crc kubenswrapper[4843]: I0318 12:28:12.104498 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2nxpx"] Mar 18 12:28:12 crc kubenswrapper[4843]: I0318 12:28:12.304730 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2nxpx" event={"ID":"c3a3a5a5-d99e-4c14-98b0-3e84695b91c9","Type":"ContainerStarted","Data":"af38b9987fc6546b7bfbf3ab1ca637c4bb47aab49f16a852bc65a980488eafc9"} Mar 18 12:28:14 crc kubenswrapper[4843]: I0318 12:28:14.518735 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-2nxpx"] Mar 18 12:28:15 crc kubenswrapper[4843]: I0318 12:28:15.365265 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2nxpx" event={"ID":"c3a3a5a5-d99e-4c14-98b0-3e84695b91c9","Type":"ContainerStarted","Data":"c4b7d43b98a1d6b91bfc66fd8ffde4fcd5f3e6065c96343f9e4986e4b04d531c"} Mar 18 12:28:15 crc kubenswrapper[4843]: I0318 12:28:15.365489 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-2nxpx" podUID="c3a3a5a5-d99e-4c14-98b0-3e84695b91c9" containerName="registry-server" 
containerID="cri-o://c4b7d43b98a1d6b91bfc66fd8ffde4fcd5f3e6065c96343f9e4986e4b04d531c" gracePeriod=2 Mar 18 12:28:15 crc kubenswrapper[4843]: I0318 12:28:15.375769 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rtqgz"] Mar 18 12:28:15 crc kubenswrapper[4843]: I0318 12:28:15.377460 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rtqgz" Mar 18 12:28:15 crc kubenswrapper[4843]: I0318 12:28:15.387557 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-2nxpx" podStartSLOduration=1.914943741 podStartE2EDuration="4.387540158s" podCreationTimestamp="2026-03-18 12:28:11 +0000 UTC" firstStartedPulling="2026-03-18 12:28:12.10661855 +0000 UTC m=+1125.822444074" lastFinishedPulling="2026-03-18 12:28:14.579214967 +0000 UTC m=+1128.295040491" observedRunningTime="2026-03-18 12:28:15.384507581 +0000 UTC m=+1129.100333105" watchObservedRunningTime="2026-03-18 12:28:15.387540158 +0000 UTC m=+1129.103365682" Mar 18 12:28:15 crc kubenswrapper[4843]: I0318 12:28:15.390336 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rtqgz"] Mar 18 12:28:15 crc kubenswrapper[4843]: I0318 12:28:15.448753 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gljxv\" (UniqueName: \"kubernetes.io/projected/05c2657b-cf87-40d7-9078-432f19509383-kube-api-access-gljxv\") pod \"openstack-operator-index-rtqgz\" (UID: \"05c2657b-cf87-40d7-9078-432f19509383\") " pod="openstack-operators/openstack-operator-index-rtqgz" Mar 18 12:28:15 crc kubenswrapper[4843]: I0318 12:28:15.550111 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gljxv\" (UniqueName: \"kubernetes.io/projected/05c2657b-cf87-40d7-9078-432f19509383-kube-api-access-gljxv\") pod 
\"openstack-operator-index-rtqgz\" (UID: \"05c2657b-cf87-40d7-9078-432f19509383\") " pod="openstack-operators/openstack-operator-index-rtqgz" Mar 18 12:28:15 crc kubenswrapper[4843]: I0318 12:28:15.580349 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gljxv\" (UniqueName: \"kubernetes.io/projected/05c2657b-cf87-40d7-9078-432f19509383-kube-api-access-gljxv\") pod \"openstack-operator-index-rtqgz\" (UID: \"05c2657b-cf87-40d7-9078-432f19509383\") " pod="openstack-operators/openstack-operator-index-rtqgz" Mar 18 12:28:15 crc kubenswrapper[4843]: I0318 12:28:15.702833 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rtqgz" Mar 18 12:28:15 crc kubenswrapper[4843]: I0318 12:28:15.763647 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2nxpx" Mar 18 12:28:15 crc kubenswrapper[4843]: I0318 12:28:15.958803 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvg7r\" (UniqueName: \"kubernetes.io/projected/c3a3a5a5-d99e-4c14-98b0-3e84695b91c9-kube-api-access-rvg7r\") pod \"c3a3a5a5-d99e-4c14-98b0-3e84695b91c9\" (UID: \"c3a3a5a5-d99e-4c14-98b0-3e84695b91c9\") " Mar 18 12:28:15 crc kubenswrapper[4843]: I0318 12:28:15.968949 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3a3a5a5-d99e-4c14-98b0-3e84695b91c9-kube-api-access-rvg7r" (OuterVolumeSpecName: "kube-api-access-rvg7r") pod "c3a3a5a5-d99e-4c14-98b0-3e84695b91c9" (UID: "c3a3a5a5-d99e-4c14-98b0-3e84695b91c9"). InnerVolumeSpecName "kube-api-access-rvg7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:28:16 crc kubenswrapper[4843]: I0318 12:28:16.060131 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvg7r\" (UniqueName: \"kubernetes.io/projected/c3a3a5a5-d99e-4c14-98b0-3e84695b91c9-kube-api-access-rvg7r\") on node \"crc\" DevicePath \"\"" Mar 18 12:28:16 crc kubenswrapper[4843]: I0318 12:28:16.106737 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rtqgz"] Mar 18 12:28:16 crc kubenswrapper[4843]: W0318 12:28:16.111817 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05c2657b_cf87_40d7_9078_432f19509383.slice/crio-9a121dd0f3567f44d0e7c226cd5cdebd9bda74dda1f9c29a44b760210525aae3 WatchSource:0}: Error finding container 9a121dd0f3567f44d0e7c226cd5cdebd9bda74dda1f9c29a44b760210525aae3: Status 404 returned error can't find the container with id 9a121dd0f3567f44d0e7c226cd5cdebd9bda74dda1f9c29a44b760210525aae3 Mar 18 12:28:16 crc kubenswrapper[4843]: I0318 12:28:16.376396 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rtqgz" event={"ID":"05c2657b-cf87-40d7-9078-432f19509383","Type":"ContainerStarted","Data":"e634b6015969f77797a49577a8de061b76cc21beec5796c8ecac968456595a4a"} Mar 18 12:28:16 crc kubenswrapper[4843]: I0318 12:28:16.376618 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rtqgz" event={"ID":"05c2657b-cf87-40d7-9078-432f19509383","Type":"ContainerStarted","Data":"9a121dd0f3567f44d0e7c226cd5cdebd9bda74dda1f9c29a44b760210525aae3"} Mar 18 12:28:16 crc kubenswrapper[4843]: I0318 12:28:16.380189 4843 generic.go:334] "Generic (PLEG): container finished" podID="c3a3a5a5-d99e-4c14-98b0-3e84695b91c9" containerID="c4b7d43b98a1d6b91bfc66fd8ffde4fcd5f3e6065c96343f9e4986e4b04d531c" exitCode=0 Mar 18 12:28:16 crc 
kubenswrapper[4843]: I0318 12:28:16.380232 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2nxpx" event={"ID":"c3a3a5a5-d99e-4c14-98b0-3e84695b91c9","Type":"ContainerDied","Data":"c4b7d43b98a1d6b91bfc66fd8ffde4fcd5f3e6065c96343f9e4986e4b04d531c"} Mar 18 12:28:16 crc kubenswrapper[4843]: I0318 12:28:16.380269 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2nxpx" event={"ID":"c3a3a5a5-d99e-4c14-98b0-3e84695b91c9","Type":"ContainerDied","Data":"af38b9987fc6546b7bfbf3ab1ca637c4bb47aab49f16a852bc65a980488eafc9"} Mar 18 12:28:16 crc kubenswrapper[4843]: I0318 12:28:16.380290 4843 scope.go:117] "RemoveContainer" containerID="c4b7d43b98a1d6b91bfc66fd8ffde4fcd5f3e6065c96343f9e4986e4b04d531c" Mar 18 12:28:16 crc kubenswrapper[4843]: I0318 12:28:16.380322 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2nxpx" Mar 18 12:28:16 crc kubenswrapper[4843]: I0318 12:28:16.403599 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rtqgz" podStartSLOduration=1.355892751 podStartE2EDuration="1.403580358s" podCreationTimestamp="2026-03-18 12:28:15 +0000 UTC" firstStartedPulling="2026-03-18 12:28:16.167590013 +0000 UTC m=+1129.883415547" lastFinishedPulling="2026-03-18 12:28:16.21527761 +0000 UTC m=+1129.931103154" observedRunningTime="2026-03-18 12:28:16.395131797 +0000 UTC m=+1130.110957311" watchObservedRunningTime="2026-03-18 12:28:16.403580358 +0000 UTC m=+1130.119405882" Mar 18 12:28:16 crc kubenswrapper[4843]: I0318 12:28:16.417225 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-2nxpx"] Mar 18 12:28:16 crc kubenswrapper[4843]: I0318 12:28:16.425489 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-2nxpx"] 
Mar 18 12:28:16 crc kubenswrapper[4843]: I0318 12:28:16.431521 4843 scope.go:117] "RemoveContainer" containerID="c4b7d43b98a1d6b91bfc66fd8ffde4fcd5f3e6065c96343f9e4986e4b04d531c" Mar 18 12:28:16 crc kubenswrapper[4843]: E0318 12:28:16.432118 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4b7d43b98a1d6b91bfc66fd8ffde4fcd5f3e6065c96343f9e4986e4b04d531c\": container with ID starting with c4b7d43b98a1d6b91bfc66fd8ffde4fcd5f3e6065c96343f9e4986e4b04d531c not found: ID does not exist" containerID="c4b7d43b98a1d6b91bfc66fd8ffde4fcd5f3e6065c96343f9e4986e4b04d531c" Mar 18 12:28:16 crc kubenswrapper[4843]: I0318 12:28:16.432185 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4b7d43b98a1d6b91bfc66fd8ffde4fcd5f3e6065c96343f9e4986e4b04d531c"} err="failed to get container status \"c4b7d43b98a1d6b91bfc66fd8ffde4fcd5f3e6065c96343f9e4986e4b04d531c\": rpc error: code = NotFound desc = could not find container \"c4b7d43b98a1d6b91bfc66fd8ffde4fcd5f3e6065c96343f9e4986e4b04d531c\": container with ID starting with c4b7d43b98a1d6b91bfc66fd8ffde4fcd5f3e6065c96343f9e4986e4b04d531c not found: ID does not exist" Mar 18 12:28:16 crc kubenswrapper[4843]: I0318 12:28:16.996570 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3a3a5a5-d99e-4c14-98b0-3e84695b91c9" path="/var/lib/kubelet/pods/c3a3a5a5-d99e-4c14-98b0-3e84695b91c9/volumes" Mar 18 12:28:17 crc kubenswrapper[4843]: I0318 12:28:17.579751 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-bn9hp" Mar 18 12:28:25 crc kubenswrapper[4843]: I0318 12:28:25.703873 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-rtqgz" Mar 18 12:28:25 crc kubenswrapper[4843]: I0318 12:28:25.706231 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack-operators/openstack-operator-index-rtqgz" Mar 18 12:28:25 crc kubenswrapper[4843]: I0318 12:28:25.746944 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-rtqgz" Mar 18 12:28:26 crc kubenswrapper[4843]: I0318 12:28:26.522362 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-rtqgz" Mar 18 12:28:27 crc kubenswrapper[4843]: I0318 12:28:26.999986 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-x7lj9" Mar 18 12:28:31 crc kubenswrapper[4843]: I0318 12:28:31.534892 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb"] Mar 18 12:28:31 crc kubenswrapper[4843]: E0318 12:28:31.535603 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3a3a5a5-d99e-4c14-98b0-3e84695b91c9" containerName="registry-server" Mar 18 12:28:31 crc kubenswrapper[4843]: I0318 12:28:31.535625 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3a3a5a5-d99e-4c14-98b0-3e84695b91c9" containerName="registry-server" Mar 18 12:28:31 crc kubenswrapper[4843]: I0318 12:28:31.535863 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3a3a5a5-d99e-4c14-98b0-3e84695b91c9" containerName="registry-server" Mar 18 12:28:31 crc kubenswrapper[4843]: I0318 12:28:31.537864 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb" Mar 18 12:28:31 crc kubenswrapper[4843]: I0318 12:28:31.541982 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-7bkl9" Mar 18 12:28:31 crc kubenswrapper[4843]: I0318 12:28:31.547090 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb"] Mar 18 12:28:31 crc kubenswrapper[4843]: I0318 12:28:31.576140 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4c2f3d7-acdb-4867-88ea-4c27f043a32b-util\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb\" (UID: \"f4c2f3d7-acdb-4867-88ea-4c27f043a32b\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb" Mar 18 12:28:31 crc kubenswrapper[4843]: I0318 12:28:31.576503 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4c2f3d7-acdb-4867-88ea-4c27f043a32b-bundle\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb\" (UID: \"f4c2f3d7-acdb-4867-88ea-4c27f043a32b\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb" Mar 18 12:28:31 crc kubenswrapper[4843]: I0318 12:28:31.576747 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd8hp\" (UniqueName: \"kubernetes.io/projected/f4c2f3d7-acdb-4867-88ea-4c27f043a32b-kube-api-access-dd8hp\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb\" (UID: \"f4c2f3d7-acdb-4867-88ea-4c27f043a32b\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb" Mar 18 12:28:31 crc kubenswrapper[4843]: I0318 
12:28:31.678040 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4c2f3d7-acdb-4867-88ea-4c27f043a32b-util\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb\" (UID: \"f4c2f3d7-acdb-4867-88ea-4c27f043a32b\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb" Mar 18 12:28:31 crc kubenswrapper[4843]: I0318 12:28:31.678308 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4c2f3d7-acdb-4867-88ea-4c27f043a32b-bundle\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb\" (UID: \"f4c2f3d7-acdb-4867-88ea-4c27f043a32b\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb" Mar 18 12:28:31 crc kubenswrapper[4843]: I0318 12:28:31.678395 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd8hp\" (UniqueName: \"kubernetes.io/projected/f4c2f3d7-acdb-4867-88ea-4c27f043a32b-kube-api-access-dd8hp\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb\" (UID: \"f4c2f3d7-acdb-4867-88ea-4c27f043a32b\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb" Mar 18 12:28:31 crc kubenswrapper[4843]: I0318 12:28:31.679512 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4c2f3d7-acdb-4867-88ea-4c27f043a32b-bundle\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb\" (UID: \"f4c2f3d7-acdb-4867-88ea-4c27f043a32b\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb" Mar 18 12:28:31 crc kubenswrapper[4843]: I0318 12:28:31.680007 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f4c2f3d7-acdb-4867-88ea-4c27f043a32b-util\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb\" (UID: \"f4c2f3d7-acdb-4867-88ea-4c27f043a32b\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb" Mar 18 12:28:31 crc kubenswrapper[4843]: I0318 12:28:31.695943 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd8hp\" (UniqueName: \"kubernetes.io/projected/f4c2f3d7-acdb-4867-88ea-4c27f043a32b-kube-api-access-dd8hp\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb\" (UID: \"f4c2f3d7-acdb-4867-88ea-4c27f043a32b\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb" Mar 18 12:28:31 crc kubenswrapper[4843]: I0318 12:28:31.865960 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb" Mar 18 12:28:32 crc kubenswrapper[4843]: I0318 12:28:32.175469 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb"] Mar 18 12:28:32 crc kubenswrapper[4843]: W0318 12:28:32.180812 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4c2f3d7_acdb_4867_88ea_4c27f043a32b.slice/crio-dbf6c23473c9816eea6651b362c84da32661445725c01ce85bb675b3faeba86e WatchSource:0}: Error finding container dbf6c23473c9816eea6651b362c84da32661445725c01ce85bb675b3faeba86e: Status 404 returned error can't find the container with id dbf6c23473c9816eea6651b362c84da32661445725c01ce85bb675b3faeba86e Mar 18 12:28:32 crc kubenswrapper[4843]: I0318 12:28:32.544070 4843 generic.go:334] "Generic (PLEG): container finished" podID="f4c2f3d7-acdb-4867-88ea-4c27f043a32b" containerID="49246457c1d4445dfa68b9215bbc7f6ef042d4424031c6a72fe8bb30afe1e7df" exitCode=0 Mar 18 
12:28:32 crc kubenswrapper[4843]: I0318 12:28:32.544130 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb" event={"ID":"f4c2f3d7-acdb-4867-88ea-4c27f043a32b","Type":"ContainerDied","Data":"49246457c1d4445dfa68b9215bbc7f6ef042d4424031c6a72fe8bb30afe1e7df"}
Mar 18 12:28:32 crc kubenswrapper[4843]: I0318 12:28:32.544823 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb" event={"ID":"f4c2f3d7-acdb-4867-88ea-4c27f043a32b","Type":"ContainerStarted","Data":"dbf6c23473c9816eea6651b362c84da32661445725c01ce85bb675b3faeba86e"}
Mar 18 12:28:33 crc kubenswrapper[4843]: I0318 12:28:33.553504 4843 generic.go:334] "Generic (PLEG): container finished" podID="f4c2f3d7-acdb-4867-88ea-4c27f043a32b" containerID="07f34cffc60f66edacd5c66cb53fe2d55a1c4a273c55a5a5a20b3dfabdc1753b" exitCode=0
Mar 18 12:28:33 crc kubenswrapper[4843]: I0318 12:28:33.553599 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb" event={"ID":"f4c2f3d7-acdb-4867-88ea-4c27f043a32b","Type":"ContainerDied","Data":"07f34cffc60f66edacd5c66cb53fe2d55a1c4a273c55a5a5a20b3dfabdc1753b"}
Mar 18 12:28:34 crc kubenswrapper[4843]: I0318 12:28:34.563837 4843 generic.go:334] "Generic (PLEG): container finished" podID="f4c2f3d7-acdb-4867-88ea-4c27f043a32b" containerID="f2108a38ac85f6a9d465d33f7391c3bd1a3df478ac46204dfffe22360d67fc9f" exitCode=0
Mar 18 12:28:34 crc kubenswrapper[4843]: I0318 12:28:34.563884 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb" event={"ID":"f4c2f3d7-acdb-4867-88ea-4c27f043a32b","Type":"ContainerDied","Data":"f2108a38ac85f6a9d465d33f7391c3bd1a3df478ac46204dfffe22360d67fc9f"}
Mar 18 12:28:35 crc kubenswrapper[4843]: I0318 12:28:35.882189 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb"
Mar 18 12:28:36 crc kubenswrapper[4843]: I0318 12:28:36.037279 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4c2f3d7-acdb-4867-88ea-4c27f043a32b-util\") pod \"f4c2f3d7-acdb-4867-88ea-4c27f043a32b\" (UID: \"f4c2f3d7-acdb-4867-88ea-4c27f043a32b\") "
Mar 18 12:28:36 crc kubenswrapper[4843]: I0318 12:28:36.037358 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd8hp\" (UniqueName: \"kubernetes.io/projected/f4c2f3d7-acdb-4867-88ea-4c27f043a32b-kube-api-access-dd8hp\") pod \"f4c2f3d7-acdb-4867-88ea-4c27f043a32b\" (UID: \"f4c2f3d7-acdb-4867-88ea-4c27f043a32b\") "
Mar 18 12:28:36 crc kubenswrapper[4843]: I0318 12:28:36.037412 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4c2f3d7-acdb-4867-88ea-4c27f043a32b-bundle\") pod \"f4c2f3d7-acdb-4867-88ea-4c27f043a32b\" (UID: \"f4c2f3d7-acdb-4867-88ea-4c27f043a32b\") "
Mar 18 12:28:36 crc kubenswrapper[4843]: I0318 12:28:36.039354 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4c2f3d7-acdb-4867-88ea-4c27f043a32b-bundle" (OuterVolumeSpecName: "bundle") pod "f4c2f3d7-acdb-4867-88ea-4c27f043a32b" (UID: "f4c2f3d7-acdb-4867-88ea-4c27f043a32b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:28:36 crc kubenswrapper[4843]: I0318 12:28:36.045401 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4c2f3d7-acdb-4867-88ea-4c27f043a32b-kube-api-access-dd8hp" (OuterVolumeSpecName: "kube-api-access-dd8hp") pod "f4c2f3d7-acdb-4867-88ea-4c27f043a32b" (UID: "f4c2f3d7-acdb-4867-88ea-4c27f043a32b"). InnerVolumeSpecName "kube-api-access-dd8hp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:28:36 crc kubenswrapper[4843]: I0318 12:28:36.053350 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4c2f3d7-acdb-4867-88ea-4c27f043a32b-util" (OuterVolumeSpecName: "util") pod "f4c2f3d7-acdb-4867-88ea-4c27f043a32b" (UID: "f4c2f3d7-acdb-4867-88ea-4c27f043a32b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:28:36 crc kubenswrapper[4843]: I0318 12:28:36.139789 4843 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4c2f3d7-acdb-4867-88ea-4c27f043a32b-util\") on node \"crc\" DevicePath \"\""
Mar 18 12:28:36 crc kubenswrapper[4843]: I0318 12:28:36.139841 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd8hp\" (UniqueName: \"kubernetes.io/projected/f4c2f3d7-acdb-4867-88ea-4c27f043a32b-kube-api-access-dd8hp\") on node \"crc\" DevicePath \"\""
Mar 18 12:28:36 crc kubenswrapper[4843]: I0318 12:28:36.139859 4843 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4c2f3d7-acdb-4867-88ea-4c27f043a32b-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:28:36 crc kubenswrapper[4843]: I0318 12:28:36.593757 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb" event={"ID":"f4c2f3d7-acdb-4867-88ea-4c27f043a32b","Type":"ContainerDied","Data":"dbf6c23473c9816eea6651b362c84da32661445725c01ce85bb675b3faeba86e"}
Mar 18 12:28:36 crc kubenswrapper[4843]: I0318 12:28:36.593845 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbf6c23473c9816eea6651b362c84da32661445725c01ce85bb675b3faeba86e"
Mar 18 12:28:36 crc kubenswrapper[4843]: I0318 12:28:36.593858 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb"
Mar 18 12:28:38 crc kubenswrapper[4843]: I0318 12:28:38.621951 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-68ccf9867-wdzw9"]
Mar 18 12:28:38 crc kubenswrapper[4843]: E0318 12:28:38.622696 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c2f3d7-acdb-4867-88ea-4c27f043a32b" containerName="util"
Mar 18 12:28:38 crc kubenswrapper[4843]: I0318 12:28:38.622730 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c2f3d7-acdb-4867-88ea-4c27f043a32b" containerName="util"
Mar 18 12:28:38 crc kubenswrapper[4843]: E0318 12:28:38.622804 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c2f3d7-acdb-4867-88ea-4c27f043a32b" containerName="extract"
Mar 18 12:28:38 crc kubenswrapper[4843]: I0318 12:28:38.622825 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c2f3d7-acdb-4867-88ea-4c27f043a32b" containerName="extract"
Mar 18 12:28:38 crc kubenswrapper[4843]: E0318 12:28:38.622850 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c2f3d7-acdb-4867-88ea-4c27f043a32b" containerName="pull"
Mar 18 12:28:38 crc kubenswrapper[4843]: I0318 12:28:38.622863 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c2f3d7-acdb-4867-88ea-4c27f043a32b" containerName="pull"
Mar 18 12:28:38 crc kubenswrapper[4843]: I0318 12:28:38.623104 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4c2f3d7-acdb-4867-88ea-4c27f043a32b" containerName="extract"
Mar 18 12:28:38 crc kubenswrapper[4843]: I0318 12:28:38.623935 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-wdzw9"
Mar 18 12:28:38 crc kubenswrapper[4843]: I0318 12:28:38.628158 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-pvpwd"
Mar 18 12:28:38 crc kubenswrapper[4843]: I0318 12:28:38.653475 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-68ccf9867-wdzw9"]
Mar 18 12:28:38 crc kubenswrapper[4843]: I0318 12:28:38.780784 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t8rg\" (UniqueName: \"kubernetes.io/projected/95fc7104-316c-4699-98b1-3ff394a0c609-kube-api-access-5t8rg\") pod \"openstack-operator-controller-init-68ccf9867-wdzw9\" (UID: \"95fc7104-316c-4699-98b1-3ff394a0c609\") " pod="openstack-operators/openstack-operator-controller-init-68ccf9867-wdzw9"
Mar 18 12:28:38 crc kubenswrapper[4843]: I0318 12:28:38.882072 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t8rg\" (UniqueName: \"kubernetes.io/projected/95fc7104-316c-4699-98b1-3ff394a0c609-kube-api-access-5t8rg\") pod \"openstack-operator-controller-init-68ccf9867-wdzw9\" (UID: \"95fc7104-316c-4699-98b1-3ff394a0c609\") " pod="openstack-operators/openstack-operator-controller-init-68ccf9867-wdzw9"
Mar 18 12:28:38 crc kubenswrapper[4843]: I0318 12:28:38.918457 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t8rg\" (UniqueName: \"kubernetes.io/projected/95fc7104-316c-4699-98b1-3ff394a0c609-kube-api-access-5t8rg\") pod \"openstack-operator-controller-init-68ccf9867-wdzw9\" (UID: \"95fc7104-316c-4699-98b1-3ff394a0c609\") " pod="openstack-operators/openstack-operator-controller-init-68ccf9867-wdzw9"
Mar 18 12:28:38 crc kubenswrapper[4843]: I0318 12:28:38.944763 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-wdzw9"
Mar 18 12:28:39 crc kubenswrapper[4843]: I0318 12:28:39.169282 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-68ccf9867-wdzw9"]
Mar 18 12:28:39 crc kubenswrapper[4843]: I0318 12:28:39.754211 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-wdzw9" event={"ID":"95fc7104-316c-4699-98b1-3ff394a0c609","Type":"ContainerStarted","Data":"421405a94a3a489372dee0f2da419cdaf6e5dcc950ef3cfb416b7729971274ef"}
Mar 18 12:28:43 crc kubenswrapper[4843]: I0318 12:28:43.789094 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-wdzw9" event={"ID":"95fc7104-316c-4699-98b1-3ff394a0c609","Type":"ContainerStarted","Data":"62e6016201e2b42def18dde263d67e18b6d09b1ace8db6f0339db34db7d4942b"}
Mar 18 12:28:43 crc kubenswrapper[4843]: I0318 12:28:43.789582 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-wdzw9"
Mar 18 12:28:43 crc kubenswrapper[4843]: I0318 12:28:43.818540 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-wdzw9" podStartSLOduration=1.434836306 podStartE2EDuration="5.818520901s" podCreationTimestamp="2026-03-18 12:28:38 +0000 UTC" firstStartedPulling="2026-03-18 12:28:39.181776545 +0000 UTC m=+1152.897602069" lastFinishedPulling="2026-03-18 12:28:43.56546114 +0000 UTC m=+1157.281286664" observedRunningTime="2026-03-18 12:28:43.812840089 +0000 UTC m=+1157.528665633" watchObservedRunningTime="2026-03-18 12:28:43.818520901 +0000 UTC m=+1157.534346425"
Mar 18 12:28:48 crc kubenswrapper[4843]: I0318 12:28:48.948573 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-wdzw9"
Mar 18 12:28:56 crc kubenswrapper[4843]: I0318 12:28:56.934999 4843 scope.go:117] "RemoveContainer" containerID="8abbda86c72abc940167f7d62e51c0a7bbfb31a957120d6605251ddb64ca44d8"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.116906 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-27wfc"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.118021 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-27wfc"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.120589 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-czlg7"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.124530 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-mr8bb"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.125632 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mr8bb"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.127590 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-kvr8r"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.129562 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-27wfc"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.148447 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-mr8bb"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.171755 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnnmx\" (UniqueName: \"kubernetes.io/projected/e8181c6a-2135-4961-8b68-410a69e807f2-kube-api-access-dnnmx\") pod \"barbican-operator-controller-manager-59bc569d95-27wfc\" (UID: \"e8181c6a-2135-4961-8b68-410a69e807f2\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-27wfc"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.172119 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wjvn\" (UniqueName: \"kubernetes.io/projected/5d1a187a-26d1-4e3f-84b2-54e7c5c596f9-kube-api-access-5wjvn\") pod \"cinder-operator-controller-manager-8d58dc466-mr8bb\" (UID: \"5d1a187a-26d1-4e3f-84b2-54e7c5c596f9\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mr8bb"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.179294 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-qfshp"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.180572 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-qfshp"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.185208 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-f274r"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.185478 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-m49jb"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.186534 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m49jb"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.190399 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-5ncsp"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.197420 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-m49jb"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.206720 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-hdzfh"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.207759 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hdzfh"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.211384 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-b4nbx"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.225580 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9q56"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.226900 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9q56"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.231595 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-fn99x"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.234853 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9q56"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.258720 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.259835 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.269454 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.269745 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-c7sfc"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.276291 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-qfshp"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.283113 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wjvn\" (UniqueName: \"kubernetes.io/projected/5d1a187a-26d1-4e3f-84b2-54e7c5c596f9-kube-api-access-5wjvn\") pod \"cinder-operator-controller-manager-8d58dc466-mr8bb\" (UID: \"5d1a187a-26d1-4e3f-84b2-54e7c5c596f9\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mr8bb"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.283185 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz6wx\" (UniqueName: \"kubernetes.io/projected/78bea798-f276-4bec-9b9f-32148b813f3e-kube-api-access-rz6wx\") pod \"infra-operator-controller-manager-7b9c774f96-92l6c\" (UID: \"78bea798-f276-4bec-9b9f-32148b813f3e\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.283227 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48qk4\" (UniqueName: \"kubernetes.io/projected/d375d53d-379d-43a3-ae17-3b4853b69f08-kube-api-access-48qk4\") pod \"designate-operator-controller-manager-588d4d986b-hdzfh\" (UID: \"d375d53d-379d-43a3-ae17-3b4853b69f08\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hdzfh"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.283260 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert\") pod \"infra-operator-controller-manager-7b9c774f96-92l6c\" (UID: \"78bea798-f276-4bec-9b9f-32148b813f3e\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.283326 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnxhz\" (UniqueName: \"kubernetes.io/projected/65ec012e-03c7-44d2-a280-4349c82db2b9-kube-api-access-gnxhz\") pod \"horizon-operator-controller-manager-8464cc45fb-z9q56\" (UID: \"65ec012e-03c7-44d2-a280-4349c82db2b9\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9q56"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.283556 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-754hd\" (UniqueName: \"kubernetes.io/projected/e55d0291-8b77-48ba-921f-ae8f2b3f7289-kube-api-access-754hd\") pod \"glance-operator-controller-manager-79df6bcc97-qfshp\" (UID: \"e55d0291-8b77-48ba-921f-ae8f2b3f7289\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-qfshp"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.283714 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh22l\" (UniqueName: \"kubernetes.io/projected/7676b45d-b133-49b7-80ce-b8cfd4bb27bc-kube-api-access-qh22l\") pod \"heat-operator-controller-manager-67dd5f86f5-m49jb\" (UID: \"7676b45d-b133-49b7-80ce-b8cfd4bb27bc\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m49jb"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.283858 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnnmx\" (UniqueName: \"kubernetes.io/projected/e8181c6a-2135-4961-8b68-410a69e807f2-kube-api-access-dnnmx\") pod \"barbican-operator-controller-manager-59bc569d95-27wfc\" (UID: \"e8181c6a-2135-4961-8b68-410a69e807f2\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-27wfc"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.328956 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-bs4lw"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.329902 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-bs4lw"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.339824 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.346619 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnnmx\" (UniqueName: \"kubernetes.io/projected/e8181c6a-2135-4961-8b68-410a69e807f2-kube-api-access-dnnmx\") pod \"barbican-operator-controller-manager-59bc569d95-27wfc\" (UID: \"e8181c6a-2135-4961-8b68-410a69e807f2\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-27wfc"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.347440 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-jcprb"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.353210 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wjvn\" (UniqueName: \"kubernetes.io/projected/5d1a187a-26d1-4e3f-84b2-54e7c5c596f9-kube-api-access-5wjvn\") pod \"cinder-operator-controller-manager-8d58dc466-mr8bb\" (UID: \"5d1a187a-26d1-4e3f-84b2-54e7c5c596f9\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mr8bb"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.356624 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-vrstp"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.357693 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-vrstp"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.366028 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-hdzfh"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.367972 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-vm6fb"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.371331 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-bs4lw"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.379459 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-vrstp"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.384834 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5tqz\" (UniqueName: \"kubernetes.io/projected/36803f94-ff88-4cf3-ad7b-6fd7e6222a96-kube-api-access-g5tqz\") pod \"ironic-operator-controller-manager-6f787dddc9-vrstp\" (UID: \"36803f94-ff88-4cf3-ad7b-6fd7e6222a96\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-vrstp"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.384880 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48qk4\" (UniqueName: \"kubernetes.io/projected/d375d53d-379d-43a3-ae17-3b4853b69f08-kube-api-access-48qk4\") pod \"designate-operator-controller-manager-588d4d986b-hdzfh\" (UID: \"d375d53d-379d-43a3-ae17-3b4853b69f08\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hdzfh"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.384899 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz6wx\" (UniqueName: \"kubernetes.io/projected/78bea798-f276-4bec-9b9f-32148b813f3e-kube-api-access-rz6wx\") pod \"infra-operator-controller-manager-7b9c774f96-92l6c\" (UID: \"78bea798-f276-4bec-9b9f-32148b813f3e\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.384923 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert\") pod \"infra-operator-controller-manager-7b9c774f96-92l6c\" (UID: \"78bea798-f276-4bec-9b9f-32148b813f3e\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.384940 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mjkb\" (UniqueName: \"kubernetes.io/projected/3d98bd17-e9d9-414f-a4b6-95975ab5ba2f-kube-api-access-8mjkb\") pod \"keystone-operator-controller-manager-768b96df4c-bs4lw\" (UID: \"3d98bd17-e9d9-414f-a4b6-95975ab5ba2f\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-bs4lw"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.384966 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnxhz\" (UniqueName: \"kubernetes.io/projected/65ec012e-03c7-44d2-a280-4349c82db2b9-kube-api-access-gnxhz\") pod \"horizon-operator-controller-manager-8464cc45fb-z9q56\" (UID: \"65ec012e-03c7-44d2-a280-4349c82db2b9\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9q56"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.384999 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-754hd\" (UniqueName: \"kubernetes.io/projected/e55d0291-8b77-48ba-921f-ae8f2b3f7289-kube-api-access-754hd\") pod \"glance-operator-controller-manager-79df6bcc97-qfshp\" (UID: \"e55d0291-8b77-48ba-921f-ae8f2b3f7289\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-qfshp"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.385047 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh22l\" (UniqueName: \"kubernetes.io/projected/7676b45d-b133-49b7-80ce-b8cfd4bb27bc-kube-api-access-qh22l\") pod \"heat-operator-controller-manager-67dd5f86f5-m49jb\" (UID: \"7676b45d-b133-49b7-80ce-b8cfd4bb27bc\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m49jb"
Mar 18 12:29:09 crc kubenswrapper[4843]: E0318 12:29:09.385546 4843 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 18 12:29:09 crc kubenswrapper[4843]: E0318 12:29:09.385589 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert podName:78bea798-f276-4bec-9b9f-32148b813f3e nodeName:}" failed. No retries permitted until 2026-03-18 12:29:09.885576105 +0000 UTC m=+1183.601401629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert") pod "infra-operator-controller-manager-7b9c774f96-92l6c" (UID: "78bea798-f276-4bec-9b9f-32148b813f3e") : secret "infra-operator-webhook-server-cert" not found
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.412458 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48qk4\" (UniqueName: \"kubernetes.io/projected/d375d53d-379d-43a3-ae17-3b4853b69f08-kube-api-access-48qk4\") pod \"designate-operator-controller-manager-588d4d986b-hdzfh\" (UID: \"d375d53d-379d-43a3-ae17-3b4853b69f08\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hdzfh"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.414329 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz6wx\" (UniqueName: \"kubernetes.io/projected/78bea798-f276-4bec-9b9f-32148b813f3e-kube-api-access-rz6wx\") pod \"infra-operator-controller-manager-7b9c774f96-92l6c\" (UID: \"78bea798-f276-4bec-9b9f-32148b813f3e\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.421256 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-754hd\" (UniqueName: \"kubernetes.io/projected/e55d0291-8b77-48ba-921f-ae8f2b3f7289-kube-api-access-754hd\") pod \"glance-operator-controller-manager-79df6bcc97-qfshp\" (UID: \"e55d0291-8b77-48ba-921f-ae8f2b3f7289\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-qfshp"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.421325 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-sg29f"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.422219 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-sg29f"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.425519 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnxhz\" (UniqueName: \"kubernetes.io/projected/65ec012e-03c7-44d2-a280-4349c82db2b9-kube-api-access-gnxhz\") pod \"horizon-operator-controller-manager-8464cc45fb-z9q56\" (UID: \"65ec012e-03c7-44d2-a280-4349c82db2b9\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9q56"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.427223 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh22l\" (UniqueName: \"kubernetes.io/projected/7676b45d-b133-49b7-80ce-b8cfd4bb27bc-kube-api-access-qh22l\") pod \"heat-operator-controller-manager-67dd5f86f5-m49jb\" (UID: \"7676b45d-b133-49b7-80ce-b8cfd4bb27bc\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m49jb"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.432242 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-lnbwv"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.441676 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-27wfc"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.442730 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lg5fn"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.443517 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lg5fn"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.455486 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mr8bb"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.456968 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-sg29f"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.457477 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-hhzfw"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.471701 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lg5fn"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.488116 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrvfj\" (UniqueName: \"kubernetes.io/projected/c4ef047f-6e31-4338-8710-556d04c03f41-kube-api-access-wrvfj\") pod \"manila-operator-controller-manager-55f864c847-sg29f\" (UID: \"c4ef047f-6e31-4338-8710-556d04c03f41\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-sg29f"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.488169 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5tqz\" (UniqueName: \"kubernetes.io/projected/36803f94-ff88-4cf3-ad7b-6fd7e6222a96-kube-api-access-g5tqz\") pod \"ironic-operator-controller-manager-6f787dddc9-vrstp\" (UID: \"36803f94-ff88-4cf3-ad7b-6fd7e6222a96\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-vrstp"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.488194 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g78g\" (UniqueName: \"kubernetes.io/projected/c827df2a-19e1-4a20-bf4e-dffab3abe636-kube-api-access-6g78g\") pod \"mariadb-operator-controller-manager-67ccfc9778-lg5fn\" (UID: \"c827df2a-19e1-4a20-bf4e-dffab3abe636\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lg5fn"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.488233 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mjkb\" (UniqueName: \"kubernetes.io/projected/3d98bd17-e9d9-414f-a4b6-95975ab5ba2f-kube-api-access-8mjkb\") pod \"keystone-operator-controller-manager-768b96df4c-bs4lw\" (UID: \"3d98bd17-e9d9-414f-a4b6-95975ab5ba2f\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-bs4lw"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.491206 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-xkf8s"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.492065 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xkf8s"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.499464 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-wwrw4"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.504570 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-qfshp"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.518349 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m49jb"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.518776 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-xkf8s"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.525598 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5tqz\" (UniqueName: \"kubernetes.io/projected/36803f94-ff88-4cf3-ad7b-6fd7e6222a96-kube-api-access-g5tqz\") pod \"ironic-operator-controller-manager-6f787dddc9-vrstp\" (UID: \"36803f94-ff88-4cf3-ad7b-6fd7e6222a96\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-vrstp"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.529916 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mjkb\" (UniqueName: \"kubernetes.io/projected/3d98bd17-e9d9-414f-a4b6-95975ab5ba2f-kube-api-access-8mjkb\") pod \"keystone-operator-controller-manager-768b96df4c-bs4lw\" (UID: \"3d98bd17-e9d9-414f-a4b6-95975ab5ba2f\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-bs4lw"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.534863 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hdzfh"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.550762 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-vzr9g"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.551560 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vzr9g"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.552285 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9q56"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.556592 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-zgwk5"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.558968 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-rt7zn"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.560096 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-rt7zn"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.562355 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-qrrjf"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.565845 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-vzr9g"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.574191 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-rt7zn"]
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.589878 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g78g\" (UniqueName: \"kubernetes.io/projected/c827df2a-19e1-4a20-bf4e-dffab3abe636-kube-api-access-6g78g\") pod \"mariadb-operator-controller-manager-67ccfc9778-lg5fn\" (UID: \"c827df2a-19e1-4a20-bf4e-dffab3abe636\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lg5fn"
Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.589941 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzbhg\" (UniqueName:
\"kubernetes.io/projected/548351fd-949d-4cb3-be2d-b8a5f07d1c49-kube-api-access-fzbhg\") pod \"octavia-operator-controller-manager-5b9f45d989-rt7zn\" (UID: \"548351fd-949d-4cb3-be2d-b8a5f07d1c49\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-rt7zn" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.589976 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w72p7\" (UniqueName: \"kubernetes.io/projected/20256137-950a-4141-b73a-5993a2bc30d8-kube-api-access-w72p7\") pod \"neutron-operator-controller-manager-767865f676-xkf8s\" (UID: \"20256137-950a-4141-b73a-5993a2bc30d8\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-xkf8s" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.590022 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cr4t\" (UniqueName: \"kubernetes.io/projected/61be3a4c-f87e-4be8-a440-4359a84464c9-kube-api-access-4cr4t\") pod \"nova-operator-controller-manager-5d488d59fb-vzr9g\" (UID: \"61be3a4c-f87e-4be8-a440-4359a84464c9\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vzr9g" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.590130 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrvfj\" (UniqueName: \"kubernetes.io/projected/c4ef047f-6e31-4338-8710-556d04c03f41-kube-api-access-wrvfj\") pod \"manila-operator-controller-manager-55f864c847-sg29f\" (UID: \"c4ef047f-6e31-4338-8710-556d04c03f41\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-sg29f" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.604041 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-j7b9v"] Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.604954 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-j7b9v" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.609596 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54"] Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.610614 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.614216 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-t7th8" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.614444 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-jdgmv" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.616518 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.617120 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-j7b9v"] Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.627696 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54"] Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.638534 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g78g\" (UniqueName: \"kubernetes.io/projected/c827df2a-19e1-4a20-bf4e-dffab3abe636-kube-api-access-6g78g\") pod \"mariadb-operator-controller-manager-67ccfc9778-lg5fn\" (UID: \"c827df2a-19e1-4a20-bf4e-dffab3abe636\") " 
pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lg5fn" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.643131 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrvfj\" (UniqueName: \"kubernetes.io/projected/c4ef047f-6e31-4338-8710-556d04c03f41-kube-api-access-wrvfj\") pod \"manila-operator-controller-manager-55f864c847-sg29f\" (UID: \"c4ef047f-6e31-4338-8710-556d04c03f41\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-sg29f" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.660854 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-hfp4q"] Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.661956 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hfp4q" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.665190 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-8hm8d" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.690721 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7rjd\" (UniqueName: \"kubernetes.io/projected/61ad1951-c777-4392-be9f-e968600ccfc2-kube-api-access-m7rjd\") pod \"ovn-operator-controller-manager-884679f54-j7b9v\" (UID: \"61ad1951-c777-4392-be9f-e968600ccfc2\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-j7b9v" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.690765 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz4c9\" (UniqueName: \"kubernetes.io/projected/510ea8f8-3bdc-4db8-b4bf-accdfc6fe8e1-kube-api-access-kz4c9\") pod \"placement-operator-controller-manager-5784578c99-hfp4q\" (UID: 
\"510ea8f8-3bdc-4db8-b4bf-accdfc6fe8e1\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-hfp4q" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.690802 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzbhg\" (UniqueName: \"kubernetes.io/projected/548351fd-949d-4cb3-be2d-b8a5f07d1c49-kube-api-access-fzbhg\") pod \"octavia-operator-controller-manager-5b9f45d989-rt7zn\" (UID: \"548351fd-949d-4cb3-be2d-b8a5f07d1c49\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-rt7zn" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.690824 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w72p7\" (UniqueName: \"kubernetes.io/projected/20256137-950a-4141-b73a-5993a2bc30d8-kube-api-access-w72p7\") pod \"neutron-operator-controller-manager-767865f676-xkf8s\" (UID: \"20256137-950a-4141-b73a-5993a2bc30d8\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-xkf8s" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.690849 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-6lz54\" (UID: \"503b0a02-e1c5-4375-94f7-466bb80e97c8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.690878 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cr4t\" (UniqueName: \"kubernetes.io/projected/61be3a4c-f87e-4be8-a440-4359a84464c9-kube-api-access-4cr4t\") pod \"nova-operator-controller-manager-5d488d59fb-vzr9g\" (UID: \"61be3a4c-f87e-4be8-a440-4359a84464c9\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vzr9g" Mar 18 12:29:09 crc 
kubenswrapper[4843]: I0318 12:29:09.690899 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdplb\" (UniqueName: \"kubernetes.io/projected/503b0a02-e1c5-4375-94f7-466bb80e97c8-kube-api-access-rdplb\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-6lz54\" (UID: \"503b0a02-e1c5-4375-94f7-466bb80e97c8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.697763 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-bs4lw" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.714762 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzbhg\" (UniqueName: \"kubernetes.io/projected/548351fd-949d-4cb3-be2d-b8a5f07d1c49-kube-api-access-fzbhg\") pod \"octavia-operator-controller-manager-5b9f45d989-rt7zn\" (UID: \"548351fd-949d-4cb3-be2d-b8a5f07d1c49\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-rt7zn" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.719393 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cr4t\" (UniqueName: \"kubernetes.io/projected/61be3a4c-f87e-4be8-a440-4359a84464c9-kube-api-access-4cr4t\") pod \"nova-operator-controller-manager-5d488d59fb-vzr9g\" (UID: \"61be3a4c-f87e-4be8-a440-4359a84464c9\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vzr9g" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.721827 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w72p7\" (UniqueName: \"kubernetes.io/projected/20256137-950a-4141-b73a-5993a2bc30d8-kube-api-access-w72p7\") pod \"neutron-operator-controller-manager-767865f676-xkf8s\" (UID: \"20256137-950a-4141-b73a-5993a2bc30d8\") " 
pod="openstack-operators/neutron-operator-controller-manager-767865f676-xkf8s" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.734019 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-hfp4q"] Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.738908 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-vrstp" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.745685 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-r8fkk"] Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.746584 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-r8fkk" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.753196 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-f6tcl" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.763851 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-r8fkk"] Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.781547 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-wg5ch"] Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.782453 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wg5ch" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.786715 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-vrdzj" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.791923 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7rjd\" (UniqueName: \"kubernetes.io/projected/61ad1951-c777-4392-be9f-e968600ccfc2-kube-api-access-m7rjd\") pod \"ovn-operator-controller-manager-884679f54-j7b9v\" (UID: \"61ad1951-c777-4392-be9f-e968600ccfc2\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-j7b9v" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.791973 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz4c9\" (UniqueName: \"kubernetes.io/projected/510ea8f8-3bdc-4db8-b4bf-accdfc6fe8e1-kube-api-access-kz4c9\") pod \"placement-operator-controller-manager-5784578c99-hfp4q\" (UID: \"510ea8f8-3bdc-4db8-b4bf-accdfc6fe8e1\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-hfp4q" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.792038 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-6lz54\" (UID: \"503b0a02-e1c5-4375-94f7-466bb80e97c8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.793393 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdplb\" (UniqueName: \"kubernetes.io/projected/503b0a02-e1c5-4375-94f7-466bb80e97c8-kube-api-access-rdplb\") pod 
\"openstack-baremetal-operator-controller-manager-89d64c458-6lz54\" (UID: \"503b0a02-e1c5-4375-94f7-466bb80e97c8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54" Mar 18 12:29:09 crc kubenswrapper[4843]: E0318 12:29:09.793801 4843 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:29:09 crc kubenswrapper[4843]: E0318 12:29:09.793871 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert podName:503b0a02-e1c5-4375-94f7-466bb80e97c8 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:10.293855362 +0000 UTC m=+1184.009680886 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-6lz54" (UID: "503b0a02-e1c5-4375-94f7-466bb80e97c8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.807138 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-wg5ch"] Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.825751 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7rjd\" (UniqueName: \"kubernetes.io/projected/61ad1951-c777-4392-be9f-e968600ccfc2-kube-api-access-m7rjd\") pod \"ovn-operator-controller-manager-884679f54-j7b9v\" (UID: \"61ad1951-c777-4392-be9f-e968600ccfc2\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-j7b9v" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.827470 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdplb\" (UniqueName: 
\"kubernetes.io/projected/503b0a02-e1c5-4375-94f7-466bb80e97c8-kube-api-access-rdplb\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-6lz54\" (UID: \"503b0a02-e1c5-4375-94f7-466bb80e97c8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.832412 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz4c9\" (UniqueName: \"kubernetes.io/projected/510ea8f8-3bdc-4db8-b4bf-accdfc6fe8e1-kube-api-access-kz4c9\") pod \"placement-operator-controller-manager-5784578c99-hfp4q\" (UID: \"510ea8f8-3bdc-4db8-b4bf-accdfc6fe8e1\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-hfp4q" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.842762 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-sg29f" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.858416 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lg5fn" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.867817 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfxn4"] Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.869170 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfxn4" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.874213 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-6vrbk" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.881153 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xkf8s" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.893432 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vzr9g" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.894937 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert\") pod \"infra-operator-controller-manager-7b9c774f96-92l6c\" (UID: \"78bea798-f276-4bec-9b9f-32148b813f3e\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.895052 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8jf6\" (UniqueName: \"kubernetes.io/projected/e210a78c-667d-46a0-a0bc-7c9371ecf962-kube-api-access-d8jf6\") pod \"telemetry-operator-controller-manager-d6b694c5-wg5ch\" (UID: \"e210a78c-667d-46a0-a0bc-7c9371ecf962\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wg5ch" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.895153 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj69q\" (UniqueName: \"kubernetes.io/projected/c4e6b3ff-9964-4ef4-b0bf-2e1573be138c-kube-api-access-cj69q\") pod \"swift-operator-controller-manager-c674c5965-r8fkk\" (UID: \"c4e6b3ff-9964-4ef4-b0bf-2e1573be138c\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-r8fkk" Mar 18 12:29:09 crc kubenswrapper[4843]: E0318 12:29:09.895369 4843 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 12:29:09 crc kubenswrapper[4843]: E0318 12:29:09.895473 4843 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert podName:78bea798-f276-4bec-9b9f-32148b813f3e nodeName:}" failed. No retries permitted until 2026-03-18 12:29:10.895456893 +0000 UTC m=+1184.611282417 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert") pod "infra-operator-controller-manager-7b9c774f96-92l6c" (UID: "78bea798-f276-4bec-9b9f-32148b813f3e") : secret "infra-operator-webhook-server-cert" not found Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.926565 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfxn4"] Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.949820 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-rt7zn" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.960585 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w4wgm"] Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.961539 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w4wgm" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.963873 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-z6rkx" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.971911 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w4wgm"] Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.985301 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-j7b9v" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.990735 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9"] Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.991944 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.997319 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.997435 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9"] Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.997529 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.997538 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-77b2w" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.998200 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzwnf\" (UniqueName: \"kubernetes.io/projected/6372bc8a-9914-4ecf-bca4-f6998231babb-kube-api-access-fzwnf\") pod \"test-operator-controller-manager-5c5cb9c4d7-cfxn4\" (UID: \"6372bc8a-9914-4ecf-bca4-f6998231babb\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfxn4" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.998312 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8jf6\" (UniqueName: 
\"kubernetes.io/projected/e210a78c-667d-46a0-a0bc-7c9371ecf962-kube-api-access-d8jf6\") pod \"telemetry-operator-controller-manager-d6b694c5-wg5ch\" (UID: \"e210a78c-667d-46a0-a0bc-7c9371ecf962\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wg5ch" Mar 18 12:29:09 crc kubenswrapper[4843]: I0318 12:29:09.999609 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj69q\" (UniqueName: \"kubernetes.io/projected/c4e6b3ff-9964-4ef4-b0bf-2e1573be138c-kube-api-access-cj69q\") pod \"swift-operator-controller-manager-c674c5965-r8fkk\" (UID: \"c4e6b3ff-9964-4ef4-b0bf-2e1573be138c\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-r8fkk" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.005808 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5zfhk"] Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.006885 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5zfhk" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.015772 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5zfhk"] Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.016896 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-xm7qm" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.021444 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hfp4q" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.021774 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj69q\" (UniqueName: \"kubernetes.io/projected/c4e6b3ff-9964-4ef4-b0bf-2e1573be138c-kube-api-access-cj69q\") pod \"swift-operator-controller-manager-c674c5965-r8fkk\" (UID: \"c4e6b3ff-9964-4ef4-b0bf-2e1573be138c\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-r8fkk" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.025073 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8jf6\" (UniqueName: \"kubernetes.io/projected/e210a78c-667d-46a0-a0bc-7c9371ecf962-kube-api-access-d8jf6\") pod \"telemetry-operator-controller-manager-d6b694c5-wg5ch\" (UID: \"e210a78c-667d-46a0-a0bc-7c9371ecf962\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wg5ch" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.145819 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2k8d\" (UniqueName: \"kubernetes.io/projected/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-kube-api-access-m2k8d\") pod \"openstack-operator-controller-manager-76c5949666-hxxp9\" (UID: \"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.145866 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzwnf\" (UniqueName: \"kubernetes.io/projected/6372bc8a-9914-4ecf-bca4-f6998231babb-kube-api-access-fzwnf\") pod \"test-operator-controller-manager-5c5cb9c4d7-cfxn4\" (UID: \"6372bc8a-9914-4ecf-bca4-f6998231babb\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfxn4" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 
12:29:10.145961 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvxnt\" (UniqueName: \"kubernetes.io/projected/c01f9d62-3fbf-4eae-a515-6a8e8c41897c-kube-api-access-bvxnt\") pod \"watcher-operator-controller-manager-6c4d75f7f9-w4wgm\" (UID: \"c01f9d62-3fbf-4eae-a515-6a8e8c41897c\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w4wgm" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.146019 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-hxxp9\" (UID: \"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.146040 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-hxxp9\" (UID: \"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.147739 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-r8fkk" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.148677 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wg5ch" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.185599 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzwnf\" (UniqueName: \"kubernetes.io/projected/6372bc8a-9914-4ecf-bca4-f6998231babb-kube-api-access-fzwnf\") pod \"test-operator-controller-manager-5c5cb9c4d7-cfxn4\" (UID: \"6372bc8a-9914-4ecf-bca4-f6998231babb\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfxn4" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.213138 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-m49jb"] Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.213492 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfxn4" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.249112 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2k8d\" (UniqueName: \"kubernetes.io/projected/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-kube-api-access-m2k8d\") pod \"openstack-operator-controller-manager-76c5949666-hxxp9\" (UID: \"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.249156 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvxnt\" (UniqueName: \"kubernetes.io/projected/c01f9d62-3fbf-4eae-a515-6a8e8c41897c-kube-api-access-bvxnt\") pod \"watcher-operator-controller-manager-6c4d75f7f9-w4wgm\" (UID: \"c01f9d62-3fbf-4eae-a515-6a8e8c41897c\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w4wgm" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.249213 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdlgg\" (UniqueName: \"kubernetes.io/projected/c7dae85f-590a-445f-ad19-dcd447f77980-kube-api-access-jdlgg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5zfhk\" (UID: \"c7dae85f-590a-445f-ad19-dcd447f77980\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5zfhk" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.249236 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-hxxp9\" (UID: \"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.249258 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-hxxp9\" (UID: \"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" Mar 18 12:29:10 crc kubenswrapper[4843]: E0318 12:29:10.249436 4843 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 12:29:10 crc kubenswrapper[4843]: E0318 12:29:10.249481 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-metrics-certs podName:f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:10.749466416 +0000 UTC m=+1184.465291940 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-metrics-certs") pod "openstack-operator-controller-manager-76c5949666-hxxp9" (UID: "f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79") : secret "metrics-server-cert" not found Mar 18 12:29:10 crc kubenswrapper[4843]: E0318 12:29:10.250218 4843 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 12:29:10 crc kubenswrapper[4843]: E0318 12:29:10.250248 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-webhook-certs podName:f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:10.750241078 +0000 UTC m=+1184.466066602 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-webhook-certs") pod "openstack-operator-controller-manager-76c5949666-hxxp9" (UID: "f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79") : secret "webhook-server-cert" not found Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.280992 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvxnt\" (UniqueName: \"kubernetes.io/projected/c01f9d62-3fbf-4eae-a515-6a8e8c41897c-kube-api-access-bvxnt\") pod \"watcher-operator-controller-manager-6c4d75f7f9-w4wgm\" (UID: \"c01f9d62-3fbf-4eae-a515-6a8e8c41897c\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w4wgm" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.285933 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w4wgm" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.293724 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-mr8bb"] Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.297328 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2k8d\" (UniqueName: \"kubernetes.io/projected/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-kube-api-access-m2k8d\") pod \"openstack-operator-controller-manager-76c5949666-hxxp9\" (UID: \"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.302769 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-27wfc"] Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.356455 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-6lz54\" (UID: \"503b0a02-e1c5-4375-94f7-466bb80e97c8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.356552 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdlgg\" (UniqueName: \"kubernetes.io/projected/c7dae85f-590a-445f-ad19-dcd447f77980-kube-api-access-jdlgg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5zfhk\" (UID: \"c7dae85f-590a-445f-ad19-dcd447f77980\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5zfhk" Mar 18 12:29:10 crc kubenswrapper[4843]: E0318 12:29:10.356923 4843 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:29:10 crc kubenswrapper[4843]: E0318 12:29:10.356963 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert podName:503b0a02-e1c5-4375-94f7-466bb80e97c8 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:11.356948665 +0000 UTC m=+1185.072774189 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-6lz54" (UID: "503b0a02-e1c5-4375-94f7-466bb80e97c8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.387422 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdlgg\" (UniqueName: \"kubernetes.io/projected/c7dae85f-590a-445f-ad19-dcd447f77980-kube-api-access-jdlgg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5zfhk\" (UID: \"c7dae85f-590a-445f-ad19-dcd447f77980\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5zfhk" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.444064 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-qfshp"] Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.448489 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5zfhk" Mar 18 12:29:10 crc kubenswrapper[4843]: W0318 12:29:10.468617 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8181c6a_2135_4961_8b68_410a69e807f2.slice/crio-ef98d321e8020c8d9f499edc248a8fc76e0e7f8774e6bed8a8712acb80581fc2 WatchSource:0}: Error finding container ef98d321e8020c8d9f499edc248a8fc76e0e7f8774e6bed8a8712acb80581fc2: Status 404 returned error can't find the container with id ef98d321e8020c8d9f499edc248a8fc76e0e7f8774e6bed8a8712acb80581fc2 Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.580318 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-hdzfh"] Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.604981 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-vrstp"] Mar 18 12:29:10 crc kubenswrapper[4843]: W0318 12:29:10.627275 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd375d53d_379d_43a3_ae17_3b4853b69f08.slice/crio-efb40065d2f65f1e90fbe89fed7c85c6fb28204b1275e777fb91b59806cbf140 WatchSource:0}: Error finding container efb40065d2f65f1e90fbe89fed7c85c6fb28204b1275e777fb91b59806cbf140: Status 404 returned error can't find the container with id efb40065d2f65f1e90fbe89fed7c85c6fb28204b1275e777fb91b59806cbf140 Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.652129 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-bs4lw"] Mar 18 12:29:10 crc kubenswrapper[4843]: W0318 12:29:10.659483 4843 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36803f94_ff88_4cf3_ad7b_6fd7e6222a96.slice/crio-787e17df2b5a33a19bfeaba103f4b34eb6cd1c05558907c763439a5334aaf237 WatchSource:0}: Error finding container 787e17df2b5a33a19bfeaba103f4b34eb6cd1c05558907c763439a5334aaf237: Status 404 returned error can't find the container with id 787e17df2b5a33a19bfeaba103f4b34eb6cd1c05558907c763439a5334aaf237 Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.684582 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9q56"] Mar 18 12:29:10 crc kubenswrapper[4843]: W0318 12:29:10.698043 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65ec012e_03c7_44d2_a280_4349c82db2b9.slice/crio-92442b2ce9dd99535021bbe257540ee5e77f6f6dfe18d5224912bb0fa762b95d WatchSource:0}: Error finding container 92442b2ce9dd99535021bbe257540ee5e77f6f6dfe18d5224912bb0fa762b95d: Status 404 returned error can't find the container with id 92442b2ce9dd99535021bbe257540ee5e77f6f6dfe18d5224912bb0fa762b95d Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.761812 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-hxxp9\" (UID: \"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.762084 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-hxxp9\" (UID: \"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79\") " 
pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" Mar 18 12:29:10 crc kubenswrapper[4843]: E0318 12:29:10.762023 4843 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 12:29:10 crc kubenswrapper[4843]: E0318 12:29:10.762206 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-webhook-certs podName:f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:11.762180995 +0000 UTC m=+1185.478006519 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-webhook-certs") pod "openstack-operator-controller-manager-76c5949666-hxxp9" (UID: "f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79") : secret "webhook-server-cert" not found Mar 18 12:29:10 crc kubenswrapper[4843]: E0318 12:29:10.762264 4843 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 12:29:10 crc kubenswrapper[4843]: E0318 12:29:10.762319 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-metrics-certs podName:f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:11.762305269 +0000 UTC m=+1185.478130793 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-metrics-certs") pod "openstack-operator-controller-manager-76c5949666-hxxp9" (UID: "f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79") : secret "metrics-server-cert" not found Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.898723 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-xkf8s"] Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.912333 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-rt7zn"] Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.920906 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-sg29f"] Mar 18 12:29:10 crc kubenswrapper[4843]: W0318 12:29:10.922768 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20256137_950a_4141_b73a_5993a2bc30d8.slice/crio-e5b28009a57da387bec912dd31e1e7073c4f024cfdad0d09880012c6c1bd3b41 WatchSource:0}: Error finding container e5b28009a57da387bec912dd31e1e7073c4f024cfdad0d09880012c6c1bd3b41: Status 404 returned error can't find the container with id e5b28009a57da387bec912dd31e1e7073c4f024cfdad0d09880012c6c1bd3b41 Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.928628 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lg5fn"] Mar 18 12:29:10 crc kubenswrapper[4843]: I0318 12:29:10.967940 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert\") pod \"infra-operator-controller-manager-7b9c774f96-92l6c\" (UID: \"78bea798-f276-4bec-9b9f-32148b813f3e\") " 
pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c" Mar 18 12:29:10 crc kubenswrapper[4843]: E0318 12:29:10.968120 4843 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 12:29:10 crc kubenswrapper[4843]: E0318 12:29:10.968190 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert podName:78bea798-f276-4bec-9b9f-32148b813f3e nodeName:}" failed. No retries permitted until 2026-03-18 12:29:12.968173397 +0000 UTC m=+1186.683998921 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert") pod "infra-operator-controller-manager-7b9c774f96-92l6c" (UID: "78bea798-f276-4bec-9b9f-32148b813f3e") : secret "infra-operator-webhook-server-cert" not found Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.023274 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-hfp4q"] Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.027460 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-vzr9g"] Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.065902 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4cr4t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-vzr9g_openstack-operators(61be3a4c-f87e-4be8-a440-4359a84464c9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.067108 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vzr9g" podUID="61be3a4c-f87e-4be8-a440-4359a84464c9" Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.128555 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5zfhk"] Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.139682 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-j7b9v"] Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.142043 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jdlgg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-5zfhk_openstack-operators(c7dae85f-590a-445f-ad19-dcd447f77980): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.143505 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5zfhk" podUID="c7dae85f-590a-445f-ad19-dcd447f77980" Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.148616 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w4wgm"] Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.149127 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m7rjd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-j7b9v_openstack-operators(61ad1951-c777-4392-be9f-e968600ccfc2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.150282 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-j7b9v" podUID="61ad1951-c777-4392-be9f-e968600ccfc2" Mar 18 12:29:11 crc kubenswrapper[4843]: W0318 12:29:11.156741 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc01f9d62_3fbf_4eae_a515_6a8e8c41897c.slice/crio-a3d7d036a39eb7a382f2d54475455e64d79aae84ca92f2a77d6224d79cc16848 WatchSource:0}: Error finding container 
a3d7d036a39eb7a382f2d54475455e64d79aae84ca92f2a77d6224d79cc16848: Status 404 returned error can't find the container with id a3d7d036a39eb7a382f2d54475455e64d79aae84ca92f2a77d6224d79cc16848 Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.157997 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fzwnf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-cfxn4_openstack-operators(6372bc8a-9914-4ecf-bca4-f6998231babb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.167729 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-wg5ch"] Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.167836 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfxn4" podUID="6372bc8a-9914-4ecf-bca4-f6998231babb" Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.174698 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfxn4"] Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.186502 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d8jf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-wg5ch_openstack-operators(e210a78c-667d-46a0-a0bc-7c9371ecf962): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.187909 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wg5ch" podUID="e210a78c-667d-46a0-a0bc-7c9371ecf962"
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.191176 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5zfhk" event={"ID":"c7dae85f-590a-445f-ad19-dcd447f77980","Type":"ContainerStarted","Data":"50eafcd79849f024259fd28044e9707737defeaaa79d7fbb19a1acb2801c1048"}
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.193440 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-vrstp" event={"ID":"36803f94-ff88-4cf3-ad7b-6fd7e6222a96","Type":"ContainerStarted","Data":"787e17df2b5a33a19bfeaba103f4b34eb6cd1c05558907c763439a5334aaf237"}
Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.201801 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5zfhk" podUID="c7dae85f-590a-445f-ad19-dcd447f77980"
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.206658 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-27wfc" event={"ID":"e8181c6a-2135-4961-8b68-410a69e807f2","Type":"ContainerStarted","Data":"ef98d321e8020c8d9f499edc248a8fc76e0e7f8774e6bed8a8712acb80581fc2"}
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.207974 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-rt7zn" event={"ID":"548351fd-949d-4cb3-be2d-b8a5f07d1c49","Type":"ContainerStarted","Data":"9683c684f4e432d6969bf93fbc2b778777df77181a2d4dad61bf7558b6f2adaf"}
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.209318 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m49jb" event={"ID":"7676b45d-b133-49b7-80ce-b8cfd4bb27bc","Type":"ContainerStarted","Data":"08714d8c0e8044052fe29ff4c4e3940ccff121c721d8790f5e41ea53b07aecb4"}
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.223972 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-sg29f" event={"ID":"c4ef047f-6e31-4338-8710-556d04c03f41","Type":"ContainerStarted","Data":"6842e4c77b7b9d3d87760f481e46fe157c34d8063ad91438cdc105ef3cf1ed87"}
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.234803 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mr8bb" event={"ID":"5d1a187a-26d1-4e3f-84b2-54e7c5c596f9","Type":"ContainerStarted","Data":"82131e6d48fc8bbae3f2ccc4d57858b9334de2876b36f55de840365b6f386e09"}
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.249126 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w4wgm" event={"ID":"c01f9d62-3fbf-4eae-a515-6a8e8c41897c","Type":"ContainerStarted","Data":"a3d7d036a39eb7a382f2d54475455e64d79aae84ca92f2a77d6224d79cc16848"}
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.253317 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfxn4" event={"ID":"6372bc8a-9914-4ecf-bca4-f6998231babb","Type":"ContainerStarted","Data":"d7c562bddc1cb1b23237025bbb4e6998ea593ef74299de9ffb664014e0ad4224"}
Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.254855 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfxn4" podUID="6372bc8a-9914-4ecf-bca4-f6998231babb"
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.273126 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xkf8s" event={"ID":"20256137-950a-4141-b73a-5993a2bc30d8","Type":"ContainerStarted","Data":"e5b28009a57da387bec912dd31e1e7073c4f024cfdad0d09880012c6c1bd3b41"}
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.281977 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-r8fkk"]
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.298446 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-bs4lw" event={"ID":"3d98bd17-e9d9-414f-a4b6-95975ab5ba2f","Type":"ContainerStarted","Data":"bef2286564d5fd03498d72e930e2309dae33af1188f91dcb7a0e8027a65a502f"}
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.300301 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hfp4q" event={"ID":"510ea8f8-3bdc-4db8-b4bf-accdfc6fe8e1","Type":"ContainerStarted","Data":"760922021be83b5aa2dc71c8d022d27393226b5b93aabcceeaaf016e61e674d4"}
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.301892 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vzr9g" event={"ID":"61be3a4c-f87e-4be8-a440-4359a84464c9","Type":"ContainerStarted","Data":"84613417924a24ac252ebdf9ccc99b3bbe98d6bdc367a24638ea5192749ef318"}
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.303429 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-j7b9v" event={"ID":"61ad1951-c777-4392-be9f-e968600ccfc2","Type":"ContainerStarted","Data":"0e19a5f47474ad27ae5b8e8d381e71036b51f943e7f2ce58e2e46484d363f813"}
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.304854 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9q56" event={"ID":"65ec012e-03c7-44d2-a280-4349c82db2b9","Type":"ContainerStarted","Data":"92442b2ce9dd99535021bbe257540ee5e77f6f6dfe18d5224912bb0fa762b95d"}
Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.305789 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-j7b9v" podUID="61ad1951-c777-4392-be9f-e968600ccfc2"
Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.306572 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vzr9g" podUID="61be3a4c-f87e-4be8-a440-4359a84464c9"
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.311319 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-qfshp" event={"ID":"e55d0291-8b77-48ba-921f-ae8f2b3f7289","Type":"ContainerStarted","Data":"a01468a1f84d4d1e4491b01d311ac9fc742a02bbe998de8c35b395506ab9ffdd"}
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.312724 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hdzfh" event={"ID":"d375d53d-379d-43a3-ae17-3b4853b69f08","Type":"ContainerStarted","Data":"efb40065d2f65f1e90fbe89fed7c85c6fb28204b1275e777fb91b59806cbf140"}
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.318916 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lg5fn" event={"ID":"c827df2a-19e1-4a20-bf4e-dffab3abe636","Type":"ContainerStarted","Data":"367477284b77c984c595c07db53a35180e60a575ccac8466bf9b36490d6918e2"}
Mar 18 12:29:11 crc kubenswrapper[4843]: W0318 12:29:11.318984 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4e6b3ff_9964_4ef4_b0bf_2e1573be138c.slice/crio-2853b2103ee858d2d2837b091249e0dc795881e4b3287edada19869668571283 WatchSource:0}: Error finding container 2853b2103ee858d2d2837b091249e0dc795881e4b3287edada19869668571283: Status 404 returned error can't find the container with id 2853b2103ee858d2d2837b091249e0dc795881e4b3287edada19869668571283
Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.320513 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cj69q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-r8fkk_openstack-operators(c4e6b3ff-9964-4ef4-b0bf-2e1573be138c): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.321663 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-r8fkk" podUID="c4e6b3ff-9964-4ef4-b0bf-2e1573be138c"
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.381137 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-6lz54\" (UID: \"503b0a02-e1c5-4375-94f7-466bb80e97c8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54"
Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.381645 4843 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.381938 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert podName:503b0a02-e1c5-4375-94f7-466bb80e97c8 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:13.38192155 +0000 UTC m=+1187.097747074 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-6lz54" (UID: "503b0a02-e1c5-4375-94f7-466bb80e97c8") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.834586 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-hxxp9\" (UID: \"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9"
Mar 18 12:29:11 crc kubenswrapper[4843]: I0318 12:29:11.834637 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-hxxp9\" (UID: \"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9"
Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.834762 4843 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.834853 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-webhook-certs podName:f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:13.834833917 +0000 UTC m=+1187.550659451 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-webhook-certs") pod "openstack-operator-controller-manager-76c5949666-hxxp9" (UID: "f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79") : secret "webhook-server-cert" not found
Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.834777 4843 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 18 12:29:11 crc kubenswrapper[4843]: E0318 12:29:11.834906 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-metrics-certs podName:f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:13.834894679 +0000 UTC m=+1187.550720203 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-metrics-certs") pod "openstack-operator-controller-manager-76c5949666-hxxp9" (UID: "f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79") : secret "metrics-server-cert" not found
Mar 18 12:29:12 crc kubenswrapper[4843]: I0318 12:29:12.347273 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wg5ch" event={"ID":"e210a78c-667d-46a0-a0bc-7c9371ecf962","Type":"ContainerStarted","Data":"a8717332e72410e7dfa9db12f237c623086f9dea77c58437df9f6053f3f73cbe"}
Mar 18 12:29:12 crc kubenswrapper[4843]: E0318 12:29:12.350327 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wg5ch" podUID="e210a78c-667d-46a0-a0bc-7c9371ecf962"
Mar 18 12:29:12 crc kubenswrapper[4843]: I0318 12:29:12.379205 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-r8fkk" event={"ID":"c4e6b3ff-9964-4ef4-b0bf-2e1573be138c","Type":"ContainerStarted","Data":"2853b2103ee858d2d2837b091249e0dc795881e4b3287edada19869668571283"}
Mar 18 12:29:12 crc kubenswrapper[4843]: E0318 12:29:12.381630 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-r8fkk" podUID="c4e6b3ff-9964-4ef4-b0bf-2e1573be138c"
Mar 18 12:29:12 crc kubenswrapper[4843]: E0318 12:29:12.382497 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfxn4" podUID="6372bc8a-9914-4ecf-bca4-f6998231babb"
Mar 18 12:29:12 crc kubenswrapper[4843]: E0318 12:29:12.382739 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-j7b9v" podUID="61ad1951-c777-4392-be9f-e968600ccfc2"
Mar 18 12:29:12 crc kubenswrapper[4843]: E0318 12:29:12.384280 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5zfhk" podUID="c7dae85f-590a-445f-ad19-dcd447f77980"
Mar 18 12:29:12 crc kubenswrapper[4843]: E0318 12:29:12.389324 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vzr9g" podUID="61be3a4c-f87e-4be8-a440-4359a84464c9"
Mar 18 12:29:13 crc kubenswrapper[4843]: I0318 12:29:13.077284 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert\") pod \"infra-operator-controller-manager-7b9c774f96-92l6c\" (UID: \"78bea798-f276-4bec-9b9f-32148b813f3e\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c"
Mar 18 12:29:13 crc kubenswrapper[4843]: E0318 12:29:13.077824 4843 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 18 12:29:13 crc kubenswrapper[4843]: E0318 12:29:13.077897 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert podName:78bea798-f276-4bec-9b9f-32148b813f3e nodeName:}" failed. No retries permitted until 2026-03-18 12:29:17.077881607 +0000 UTC m=+1190.793707131 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert") pod "infra-operator-controller-manager-7b9c774f96-92l6c" (UID: "78bea798-f276-4bec-9b9f-32148b813f3e") : secret "infra-operator-webhook-server-cert" not found
Mar 18 12:29:13 crc kubenswrapper[4843]: I0318 12:29:13.385135 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-6lz54\" (UID: \"503b0a02-e1c5-4375-94f7-466bb80e97c8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54"
Mar 18 12:29:13 crc kubenswrapper[4843]: E0318 12:29:13.385319 4843 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 18 12:29:13 crc kubenswrapper[4843]: E0318 12:29:13.385364 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert podName:503b0a02-e1c5-4375-94f7-466bb80e97c8 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:17.385349915 +0000 UTC m=+1191.101175439 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-6lz54" (UID: "503b0a02-e1c5-4375-94f7-466bb80e97c8") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 18 12:29:13 crc kubenswrapper[4843]: E0318 12:29:13.388919 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wg5ch" podUID="e210a78c-667d-46a0-a0bc-7c9371ecf962"
Mar 18 12:29:13 crc kubenswrapper[4843]: E0318 12:29:13.391376 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-r8fkk" podUID="c4e6b3ff-9964-4ef4-b0bf-2e1573be138c"
Mar 18 12:29:13 crc kubenswrapper[4843]: I0318 12:29:13.837180 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-hxxp9\" (UID: \"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9"
Mar 18 12:29:13 crc kubenswrapper[4843]: I0318 12:29:13.837244 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-hxxp9\" (UID: \"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9"
Mar 18 12:29:13 crc kubenswrapper[4843]: E0318 12:29:13.837437 4843 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 18 12:29:13 crc kubenswrapper[4843]: E0318 12:29:13.837497 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-metrics-certs podName:f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:17.837477081 +0000 UTC m=+1191.553302605 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-metrics-certs") pod "openstack-operator-controller-manager-76c5949666-hxxp9" (UID: "f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79") : secret "metrics-server-cert" not found
Mar 18 12:29:13 crc kubenswrapper[4843]: E0318 12:29:13.837918 4843 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 18 12:29:13 crc kubenswrapper[4843]: E0318 12:29:13.837956 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-webhook-certs podName:f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:17.837944364 +0000 UTC m=+1191.553769888 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-webhook-certs") pod "openstack-operator-controller-manager-76c5949666-hxxp9" (UID: "f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79") : secret "webhook-server-cert" not found
Mar 18 12:29:17 crc kubenswrapper[4843]: I0318 12:29:17.178506 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert\") pod \"infra-operator-controller-manager-7b9c774f96-92l6c\" (UID: \"78bea798-f276-4bec-9b9f-32148b813f3e\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c"
Mar 18 12:29:17 crc kubenswrapper[4843]: E0318 12:29:17.178919 4843 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 18 12:29:17 crc kubenswrapper[4843]: E0318 12:29:17.178988 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert podName:78bea798-f276-4bec-9b9f-32148b813f3e nodeName:}" failed. No retries permitted until 2026-03-18 12:29:25.178971491 +0000 UTC m=+1198.894797015 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert") pod "infra-operator-controller-manager-7b9c774f96-92l6c" (UID: "78bea798-f276-4bec-9b9f-32148b813f3e") : secret "infra-operator-webhook-server-cert" not found
Mar 18 12:29:17 crc kubenswrapper[4843]: I0318 12:29:17.482216 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-6lz54\" (UID: \"503b0a02-e1c5-4375-94f7-466bb80e97c8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54"
Mar 18 12:29:17 crc kubenswrapper[4843]: E0318 12:29:17.482458 4843 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 18 12:29:17 crc kubenswrapper[4843]: E0318 12:29:17.482550 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert podName:503b0a02-e1c5-4375-94f7-466bb80e97c8 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:25.482525168 +0000 UTC m=+1199.198350702 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-6lz54" (UID: "503b0a02-e1c5-4375-94f7-466bb80e97c8") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 18 12:29:17 crc kubenswrapper[4843]: I0318 12:29:17.888121 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-hxxp9\" (UID: \"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9"
Mar 18 12:29:17 crc kubenswrapper[4843]: I0318 12:29:17.888568 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-hxxp9\" (UID: \"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9"
Mar 18 12:29:17 crc kubenswrapper[4843]: E0318 12:29:17.888510 4843 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 18 12:29:17 crc kubenswrapper[4843]: E0318 12:29:17.888678 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-webhook-certs podName:f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:25.888646134 +0000 UTC m=+1199.604471648 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-webhook-certs") pod "openstack-operator-controller-manager-76c5949666-hxxp9" (UID: "f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79") : secret "webhook-server-cert" not found
Mar 18 12:29:17 crc kubenswrapper[4843]: E0318 12:29:17.888870 4843 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 18 12:29:17 crc kubenswrapper[4843]: E0318 12:29:17.888949 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-metrics-certs podName:f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:25.888931622 +0000 UTC m=+1199.604757146 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-metrics-certs") pod "openstack-operator-controller-manager-76c5949666-hxxp9" (UID: "f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79") : secret "metrics-server-cert" not found
Mar 18 12:29:24 crc kubenswrapper[4843]: E0318 12:29:24.764365 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807"
Mar 18 12:29:24 crc kubenswrapper[4843]: E0318 12:29:24.765136 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bvxnt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-w4wgm_openstack-operators(c01f9d62-3fbf-4eae-a515-6a8e8c41897c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 18 12:29:24 crc kubenswrapper[4843]: E0318 12:29:24.766366 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w4wgm" podUID="c01f9d62-3fbf-4eae-a515-6a8e8c41897c"
Mar 18 12:29:25 crc kubenswrapper[4843]: I0318 12:29:25.235016 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert\") pod \"infra-operator-controller-manager-7b9c774f96-92l6c\" (UID: \"78bea798-f276-4bec-9b9f-32148b813f3e\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c"
Mar 18 12:29:25 crc kubenswrapper[4843]: E0318 12:29:25.235265 4843 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 18 12:29:25 crc kubenswrapper[4843]: E0318 12:29:25.235382 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert podName:78bea798-f276-4bec-9b9f-32148b813f3e nodeName:}" failed. No retries permitted until 2026-03-18 12:29:41.235357881 +0000 UTC m=+1214.951183405 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert") pod "infra-operator-controller-manager-7b9c774f96-92l6c" (UID: "78bea798-f276-4bec-9b9f-32148b813f3e") : secret "infra-operator-webhook-server-cert" not found
Mar 18 12:29:25 crc kubenswrapper[4843]: E0318 12:29:25.265228 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8"
Mar 18 12:29:25 crc kubenswrapper[4843]: E0318 12:29:25.265478 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {}
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g5tqz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6f787dddc9-vrstp_openstack-operators(36803f94-ff88-4cf3-ad7b-6fd7e6222a96): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:29:25 crc kubenswrapper[4843]: E0318 12:29:25.266812 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-vrstp" podUID="36803f94-ff88-4cf3-ad7b-6fd7e6222a96" Mar 18 12:29:25 crc kubenswrapper[4843]: I0318 12:29:25.538950 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-6lz54\" (UID: \"503b0a02-e1c5-4375-94f7-466bb80e97c8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54" Mar 18 12:29:25 crc kubenswrapper[4843]: E0318 12:29:25.539144 4843 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:29:25 crc kubenswrapper[4843]: E0318 12:29:25.539218 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert podName:503b0a02-e1c5-4375-94f7-466bb80e97c8 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:41.539201397 +0000 UTC m=+1215.255026911 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-6lz54" (UID: "503b0a02-e1c5-4375-94f7-466bb80e97c8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:29:25 crc kubenswrapper[4843]: E0318 12:29:25.599774 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-vrstp" podUID="36803f94-ff88-4cf3-ad7b-6fd7e6222a96" Mar 18 12:29:25 crc kubenswrapper[4843]: E0318 12:29:25.604058 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w4wgm" podUID="c01f9d62-3fbf-4eae-a515-6a8e8c41897c" Mar 18 12:29:25 crc kubenswrapper[4843]: E0318 12:29:25.808916 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 18 12:29:25 crc kubenswrapper[4843]: E0318 12:29:25.809104 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8mjkb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-bs4lw_openstack-operators(3d98bd17-e9d9-414f-a4b6-95975ab5ba2f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:29:25 crc kubenswrapper[4843]: E0318 12:29:25.810400 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-bs4lw" podUID="3d98bd17-e9d9-414f-a4b6-95975ab5ba2f" Mar 18 12:29:25 crc kubenswrapper[4843]: I0318 12:29:25.944705 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-hxxp9\" (UID: \"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" Mar 18 12:29:25 crc kubenswrapper[4843]: I0318 12:29:25.944765 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-hxxp9\" (UID: \"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" Mar 18 12:29:25 crc kubenswrapper[4843]: I0318 12:29:25.953598 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-hxxp9\" (UID: \"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" Mar 18 12:29:25 crc kubenswrapper[4843]: I0318 12:29:25.964058 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-hxxp9\" (UID: \"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" Mar 18 12:29:26 crc kubenswrapper[4843]: I0318 12:29:26.225817 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" Mar 18 12:29:26 crc kubenswrapper[4843]: E0318 12:29:26.605198 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-bs4lw" podUID="3d98bd17-e9d9-414f-a4b6-95975ab5ba2f" Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.326002 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9"] Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.623212 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xkf8s" event={"ID":"20256137-950a-4141-b73a-5993a2bc30d8","Type":"ContainerStarted","Data":"267cc0e8d297047967dfe92118abebedd3340e2361396a37bfec9fa47d1ef3fd"} Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.624257 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xkf8s" Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.641707 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-qfshp" event={"ID":"e55d0291-8b77-48ba-921f-ae8f2b3f7289","Type":"ContainerStarted","Data":"65bd953e3c8a4e96deae7db7a601a44a3226579de24ef485563863b50a7c86a0"} Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.642569 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-qfshp" Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.653187 4843 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" event={"ID":"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79","Type":"ContainerStarted","Data":"595779ebd6e5e00e8bf960068de57abd7709ce03858fc34a3210af44d3d03ec4"} Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.653240 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" event={"ID":"f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79","Type":"ContainerStarted","Data":"6266f22bd88f85cd19e895f228bb4b5c6ebf4192f717a41f47645d4b722d74db"} Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.653947 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.655442 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lg5fn" event={"ID":"c827df2a-19e1-4a20-bf4e-dffab3abe636","Type":"ContainerStarted","Data":"2b6f3ca3408976ec5ba649820c376fc6e71fdd3934ec4bbb9dc0a4d1fdc2ae4e"} Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.655857 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lg5fn" Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.656945 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-27wfc" event={"ID":"e8181c6a-2135-4961-8b68-410a69e807f2","Type":"ContainerStarted","Data":"0aac01a9d7f6979692613c5d97f376356cecb830d672fc76cf647e4f617f32d9"} Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.657319 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-27wfc" Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 
12:29:27.661800 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hdzfh" event={"ID":"d375d53d-379d-43a3-ae17-3b4853b69f08","Type":"ContainerStarted","Data":"6c1c7a2eee3bbdf116791b4b40427872474b6156d74c56db2444978de6d99ff2"} Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.662347 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hdzfh" Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.669935 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m49jb" event={"ID":"7676b45d-b133-49b7-80ce-b8cfd4bb27bc","Type":"ContainerStarted","Data":"83be02782061d21cde4d51e54b5b265b6879c70f5626f0a534113604806cd2dd"} Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.670596 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m49jb" Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.678567 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mr8bb" event={"ID":"5d1a187a-26d1-4e3f-84b2-54e7c5c596f9","Type":"ContainerStarted","Data":"a7fea44bb9ec1725bd87cd440ceea2358d9d34cd1d959986406ceb107b07b487"} Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.679248 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mr8bb" Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.687615 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xkf8s" podStartSLOduration=3.31993129 podStartE2EDuration="18.687593068s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 
12:29:10.933757517 +0000 UTC m=+1184.649583031" lastFinishedPulling="2026-03-18 12:29:26.301419285 +0000 UTC m=+1200.017244809" observedRunningTime="2026-03-18 12:29:27.671786478 +0000 UTC m=+1201.387612002" watchObservedRunningTime="2026-03-18 12:29:27.687593068 +0000 UTC m=+1201.403418592" Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.702225 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-j7b9v" event={"ID":"61ad1951-c777-4392-be9f-e968600ccfc2","Type":"ContainerStarted","Data":"978a5370c9147d3c10bbfb2d0545c674312c5a252bc12370a43e7fd306289915"} Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.703000 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-j7b9v" Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.705133 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-sg29f" event={"ID":"c4ef047f-6e31-4338-8710-556d04c03f41","Type":"ContainerStarted","Data":"11d15ad4a5b0c6c8f0eaf9b90c8b920c6cab6acef210692690b926af5b137740"} Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.705562 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-sg29f" Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.711886 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vzr9g" event={"ID":"61be3a4c-f87e-4be8-a440-4359a84464c9","Type":"ContainerStarted","Data":"5baadd548ce4b51f004788de48f087f65576e423dfb41c0192dc2f42dccf3a0a"} Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.712144 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vzr9g" Mar 18 12:29:27 crc kubenswrapper[4843]: 
I0318 12:29:27.729049 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hfp4q" event={"ID":"510ea8f8-3bdc-4db8-b4bf-accdfc6fe8e1","Type":"ContainerStarted","Data":"f023d9ed266d67abd6781b5db8cd3369f823570f75c14880a6be8af6c6fa2826"} Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.729743 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hfp4q" Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.742268 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9q56" event={"ID":"65ec012e-03c7-44d2-a280-4349c82db2b9","Type":"ContainerStarted","Data":"07e8170ec06134da39b5a8342f97afe5e1b6140e65b37c2b4d6c8c8ba312b7b3"} Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.743055 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9q56" Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.785595 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-rt7zn" event={"ID":"548351fd-949d-4cb3-be2d-b8a5f07d1c49","Type":"ContainerStarted","Data":"a976878dac4f137c4ba97323d8c39fad4885e22311c90d05d87913b02e23f264"} Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.786306 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-rt7zn" Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.954550 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m49jb" podStartSLOduration=3.027051016 podStartE2EDuration="18.954533523s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" 
firstStartedPulling="2026-03-18 12:29:10.372566789 +0000 UTC m=+1184.088392303" lastFinishedPulling="2026-03-18 12:29:26.300049266 +0000 UTC m=+1200.015874810" observedRunningTime="2026-03-18 12:29:27.785616967 +0000 UTC m=+1201.501442491" watchObservedRunningTime="2026-03-18 12:29:27.954533523 +0000 UTC m=+1201.670359047" Mar 18 12:29:27 crc kubenswrapper[4843]: I0318 12:29:27.956166 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-qfshp" podStartSLOduration=3.139393432 podStartE2EDuration="18.956160209s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:10.485024819 +0000 UTC m=+1184.200850343" lastFinishedPulling="2026-03-18 12:29:26.301791576 +0000 UTC m=+1200.017617120" observedRunningTime="2026-03-18 12:29:27.948529132 +0000 UTC m=+1201.664354656" watchObservedRunningTime="2026-03-18 12:29:27.956160209 +0000 UTC m=+1201.671985733" Mar 18 12:29:28 crc kubenswrapper[4843]: I0318 12:29:28.133780 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" podStartSLOduration=19.133760823 podStartE2EDuration="19.133760823s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:29:28.107757813 +0000 UTC m=+1201.823583337" watchObservedRunningTime="2026-03-18 12:29:28.133760823 +0000 UTC m=+1201.849586347" Mar 18 12:29:28 crc kubenswrapper[4843]: I0318 12:29:28.154324 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lg5fn" podStartSLOduration=3.217954259 podStartE2EDuration="19.154305808s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:10.937455163 +0000 UTC 
m=+1184.653280687" lastFinishedPulling="2026-03-18 12:29:26.873806712 +0000 UTC m=+1200.589632236" observedRunningTime="2026-03-18 12:29:28.146812454 +0000 UTC m=+1201.862637968" watchObservedRunningTime="2026-03-18 12:29:28.154305808 +0000 UTC m=+1201.870131332" Mar 18 12:29:28 crc kubenswrapper[4843]: I0318 12:29:28.185537 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hdzfh" podStartSLOduration=3.5156154280000003 podStartE2EDuration="19.185521876s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:10.631528568 +0000 UTC m=+1184.347354092" lastFinishedPulling="2026-03-18 12:29:26.301434996 +0000 UTC m=+1200.017260540" observedRunningTime="2026-03-18 12:29:28.179195246 +0000 UTC m=+1201.895020770" watchObservedRunningTime="2026-03-18 12:29:28.185521876 +0000 UTC m=+1201.901347400" Mar 18 12:29:28 crc kubenswrapper[4843]: I0318 12:29:28.269908 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-27wfc" podStartSLOduration=2.9042905230000002 podStartE2EDuration="19.269890267s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:10.484241747 +0000 UTC m=+1184.200067271" lastFinishedPulling="2026-03-18 12:29:26.849841501 +0000 UTC m=+1200.565667015" observedRunningTime="2026-03-18 12:29:28.218908826 +0000 UTC m=+1201.934734350" watchObservedRunningTime="2026-03-18 12:29:28.269890267 +0000 UTC m=+1201.985715781" Mar 18 12:29:28 crc kubenswrapper[4843]: I0318 12:29:28.271343 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hfp4q" podStartSLOduration=4.006716532 podStartE2EDuration="19.271336078s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:11.036803009 +0000 
UTC m=+1184.752628533" lastFinishedPulling="2026-03-18 12:29:26.301422555 +0000 UTC m=+1200.017248079" observedRunningTime="2026-03-18 12:29:28.264557845 +0000 UTC m=+1201.980383369" watchObservedRunningTime="2026-03-18 12:29:28.271336078 +0000 UTC m=+1201.987161602" Mar 18 12:29:28 crc kubenswrapper[4843]: I0318 12:29:28.298505 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-j7b9v" podStartSLOduration=3.498954002 podStartE2EDuration="19.29848787s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:11.148988091 +0000 UTC m=+1184.864813615" lastFinishedPulling="2026-03-18 12:29:26.948521959 +0000 UTC m=+1200.664347483" observedRunningTime="2026-03-18 12:29:28.298236263 +0000 UTC m=+1202.014061787" watchObservedRunningTime="2026-03-18 12:29:28.29848787 +0000 UTC m=+1202.014313394" Mar 18 12:29:28 crc kubenswrapper[4843]: I0318 12:29:28.332672 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mr8bb" podStartSLOduration=3.493383525 podStartE2EDuration="19.332643932s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:10.462958821 +0000 UTC m=+1184.178784345" lastFinishedPulling="2026-03-18 12:29:26.302219228 +0000 UTC m=+1200.018044752" observedRunningTime="2026-03-18 12:29:28.329554324 +0000 UTC m=+1202.045379848" watchObservedRunningTime="2026-03-18 12:29:28.332643932 +0000 UTC m=+1202.048469456" Mar 18 12:29:28 crc kubenswrapper[4843]: I0318 12:29:28.357959 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-sg29f" podStartSLOduration=3.992128076 podStartE2EDuration="19.357942382s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:10.935696052 +0000 UTC m=+1184.651521576" 
lastFinishedPulling="2026-03-18 12:29:26.301510358 +0000 UTC m=+1200.017335882" observedRunningTime="2026-03-18 12:29:28.357361766 +0000 UTC m=+1202.073187290" watchObservedRunningTime="2026-03-18 12:29:28.357942382 +0000 UTC m=+1202.073767906" Mar 18 12:29:28 crc kubenswrapper[4843]: I0318 12:29:28.385905 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9q56" podStartSLOduration=3.272141299 podStartE2EDuration="19.385886827s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:10.708121337 +0000 UTC m=+1184.423946861" lastFinishedPulling="2026-03-18 12:29:26.821866865 +0000 UTC m=+1200.537692389" observedRunningTime="2026-03-18 12:29:28.378792665 +0000 UTC m=+1202.094618189" watchObservedRunningTime="2026-03-18 12:29:28.385886827 +0000 UTC m=+1202.101712351" Mar 18 12:29:28 crc kubenswrapper[4843]: I0318 12:29:28.422095 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vzr9g" podStartSLOduration=3.597729674 podStartE2EDuration="19.422078437s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:11.065711472 +0000 UTC m=+1184.781537006" lastFinishedPulling="2026-03-18 12:29:26.890060235 +0000 UTC m=+1200.605885769" observedRunningTime="2026-03-18 12:29:28.420057119 +0000 UTC m=+1202.135882663" watchObservedRunningTime="2026-03-18 12:29:28.422078437 +0000 UTC m=+1202.137903961" Mar 18 12:29:28 crc kubenswrapper[4843]: I0318 12:29:28.438211 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-rt7zn" podStartSLOduration=4.041606194 podStartE2EDuration="19.438191345s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:10.905946526 +0000 UTC m=+1184.621772050" 
lastFinishedPulling="2026-03-18 12:29:26.302531677 +0000 UTC m=+1200.018357201" observedRunningTime="2026-03-18 12:29:28.433399159 +0000 UTC m=+1202.149224683" watchObservedRunningTime="2026-03-18 12:29:28.438191345 +0000 UTC m=+1202.154016869" Mar 18 12:29:32 crc kubenswrapper[4843]: I0318 12:29:32.830556 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfxn4" event={"ID":"6372bc8a-9914-4ecf-bca4-f6998231babb","Type":"ContainerStarted","Data":"0c27a7068583be2dd63cedec8d2495fe60bafc3f97cab247a39fc5d1214d417d"} Mar 18 12:29:32 crc kubenswrapper[4843]: I0318 12:29:32.831320 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfxn4" Mar 18 12:29:32 crc kubenswrapper[4843]: I0318 12:29:32.834109 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-r8fkk" event={"ID":"c4e6b3ff-9964-4ef4-b0bf-2e1573be138c","Type":"ContainerStarted","Data":"caea2eb9260d5a53f8821d58ac9b3e3978e07de4352b7a2a9927dfa4992b27a7"} Mar 18 12:29:32 crc kubenswrapper[4843]: I0318 12:29:32.834456 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-r8fkk" Mar 18 12:29:32 crc kubenswrapper[4843]: I0318 12:29:32.835906 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wg5ch" event={"ID":"e210a78c-667d-46a0-a0bc-7c9371ecf962","Type":"ContainerStarted","Data":"dc116f1578e89135c3d66b4299215d5efcdd74d1796e14ba1fafc41493fee47b"} Mar 18 12:29:32 crc kubenswrapper[4843]: I0318 12:29:32.836222 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wg5ch" Mar 18 12:29:32 crc kubenswrapper[4843]: I0318 12:29:32.856823 4843 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfxn4" podStartSLOduration=2.549894789 podStartE2EDuration="23.856802185s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:11.157843983 +0000 UTC m=+1184.873669507" lastFinishedPulling="2026-03-18 12:29:32.464751379 +0000 UTC m=+1206.180576903" observedRunningTime="2026-03-18 12:29:32.848077426 +0000 UTC m=+1206.563902950" watchObservedRunningTime="2026-03-18 12:29:32.856802185 +0000 UTC m=+1206.572627709" Mar 18 12:29:32 crc kubenswrapper[4843]: I0318 12:29:32.867422 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-r8fkk" podStartSLOduration=2.722545092 podStartE2EDuration="23.867402726s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:11.320391359 +0000 UTC m=+1185.036216883" lastFinishedPulling="2026-03-18 12:29:32.465248993 +0000 UTC m=+1206.181074517" observedRunningTime="2026-03-18 12:29:32.864908315 +0000 UTC m=+1206.580733849" watchObservedRunningTime="2026-03-18 12:29:32.867402726 +0000 UTC m=+1206.583228250" Mar 18 12:29:32 crc kubenswrapper[4843]: I0318 12:29:32.880956 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wg5ch" podStartSLOduration=2.6174623820000003 podStartE2EDuration="23.880945562s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:11.186358865 +0000 UTC m=+1184.902184389" lastFinishedPulling="2026-03-18 12:29:32.449842045 +0000 UTC m=+1206.165667569" observedRunningTime="2026-03-18 12:29:32.876800734 +0000 UTC m=+1206.592626258" watchObservedRunningTime="2026-03-18 12:29:32.880945562 +0000 UTC m=+1206.596771086" Mar 18 12:29:33 crc kubenswrapper[4843]: I0318 12:29:33.848856 4843 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5zfhk" event={"ID":"c7dae85f-590a-445f-ad19-dcd447f77980","Type":"ContainerStarted","Data":"19b9fbb176e7784136fcb77c48f593bc0d881f348a428bdd8f846e311d3400c2"} Mar 18 12:29:33 crc kubenswrapper[4843]: I0318 12:29:33.873135 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5zfhk" podStartSLOduration=3.5495663950000003 podStartE2EDuration="24.873115794s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:11.14191264 +0000 UTC m=+1184.857738164" lastFinishedPulling="2026-03-18 12:29:32.465461989 +0000 UTC m=+1206.181287563" observedRunningTime="2026-03-18 12:29:33.870801768 +0000 UTC m=+1207.586627372" watchObservedRunningTime="2026-03-18 12:29:33.873115794 +0000 UTC m=+1207.588941328" Mar 18 12:29:36 crc kubenswrapper[4843]: I0318 12:29:36.237281 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" Mar 18 12:29:38 crc kubenswrapper[4843]: I0318 12:29:38.895062 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w4wgm" event={"ID":"c01f9d62-3fbf-4eae-a515-6a8e8c41897c","Type":"ContainerStarted","Data":"b897593e534394f7c9580bf3096d93f2059aa3943b868698cd9ebe25c28b30b2"} Mar 18 12:29:38 crc kubenswrapper[4843]: I0318 12:29:38.895740 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w4wgm" Mar 18 12:29:38 crc kubenswrapper[4843]: I0318 12:29:38.931478 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w4wgm" podStartSLOduration=2.5696815920000002 
podStartE2EDuration="29.931451925s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:11.178609124 +0000 UTC m=+1184.894434648" lastFinishedPulling="2026-03-18 12:29:38.540379447 +0000 UTC m=+1212.256204981" observedRunningTime="2026-03-18 12:29:38.922737276 +0000 UTC m=+1212.638562840" watchObservedRunningTime="2026-03-18 12:29:38.931451925 +0000 UTC m=+1212.647277479" Mar 18 12:29:39 crc kubenswrapper[4843]: I0318 12:29:39.446349 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-27wfc" Mar 18 12:29:39 crc kubenswrapper[4843]: I0318 12:29:39.460934 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-mr8bb" Mar 18 12:29:39 crc kubenswrapper[4843]: I0318 12:29:39.507338 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-qfshp" Mar 18 12:29:39 crc kubenswrapper[4843]: I0318 12:29:39.524911 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m49jb" Mar 18 12:29:39 crc kubenswrapper[4843]: I0318 12:29:39.542066 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hdzfh" Mar 18 12:29:39 crc kubenswrapper[4843]: I0318 12:29:39.568587 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9q56" Mar 18 12:29:39 crc kubenswrapper[4843]: I0318 12:29:39.846685 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-sg29f" Mar 18 12:29:39 crc kubenswrapper[4843]: I0318 12:29:39.862229 4843 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lg5fn" Mar 18 12:29:39 crc kubenswrapper[4843]: I0318 12:29:39.886849 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xkf8s" Mar 18 12:29:39 crc kubenswrapper[4843]: I0318 12:29:39.898133 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vzr9g" Mar 18 12:29:39 crc kubenswrapper[4843]: I0318 12:29:39.908631 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-vrstp" event={"ID":"36803f94-ff88-4cf3-ad7b-6fd7e6222a96","Type":"ContainerStarted","Data":"43458bec31aac0b3c33be3afef9033014a9ccf8f208ecfa6ee51b352411a4ba5"} Mar 18 12:29:39 crc kubenswrapper[4843]: I0318 12:29:39.963699 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-vrstp" podStartSLOduration=2.112009519 podStartE2EDuration="30.963681886s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:10.672423041 +0000 UTC m=+1184.388248555" lastFinishedPulling="2026-03-18 12:29:39.524095408 +0000 UTC m=+1213.239920922" observedRunningTime="2026-03-18 12:29:39.962568255 +0000 UTC m=+1213.678393779" watchObservedRunningTime="2026-03-18 12:29:39.963681886 +0000 UTC m=+1213.679507410" Mar 18 12:29:39 crc kubenswrapper[4843]: I0318 12:29:39.969833 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-rt7zn" Mar 18 12:29:39 crc kubenswrapper[4843]: I0318 12:29:39.992296 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ovn-operator-controller-manager-884679f54-j7b9v" Mar 18 12:29:40 crc kubenswrapper[4843]: I0318 12:29:40.027303 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hfp4q" Mar 18 12:29:40 crc kubenswrapper[4843]: I0318 12:29:40.151333 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-r8fkk" Mar 18 12:29:40 crc kubenswrapper[4843]: I0318 12:29:40.152960 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wg5ch" Mar 18 12:29:40 crc kubenswrapper[4843]: I0318 12:29:40.217049 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfxn4" Mar 18 12:29:41 crc kubenswrapper[4843]: I0318 12:29:41.269006 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert\") pod \"infra-operator-controller-manager-7b9c774f96-92l6c\" (UID: \"78bea798-f276-4bec-9b9f-32148b813f3e\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c" Mar 18 12:29:41 crc kubenswrapper[4843]: I0318 12:29:41.274127 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78bea798-f276-4bec-9b9f-32148b813f3e-cert\") pod \"infra-operator-controller-manager-7b9c774f96-92l6c\" (UID: \"78bea798-f276-4bec-9b9f-32148b813f3e\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c" Mar 18 12:29:41 crc kubenswrapper[4843]: I0318 12:29:41.435829 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-c7sfc" Mar 18 12:29:41 crc 
kubenswrapper[4843]: I0318 12:29:41.445248 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c" Mar 18 12:29:41 crc kubenswrapper[4843]: I0318 12:29:41.578250 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-6lz54\" (UID: \"503b0a02-e1c5-4375-94f7-466bb80e97c8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54" Mar 18 12:29:41 crc kubenswrapper[4843]: I0318 12:29:41.586102 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/503b0a02-e1c5-4375-94f7-466bb80e97c8-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-6lz54\" (UID: \"503b0a02-e1c5-4375-94f7-466bb80e97c8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54" Mar 18 12:29:41 crc kubenswrapper[4843]: I0318 12:29:41.653418 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c"] Mar 18 12:29:41 crc kubenswrapper[4843]: I0318 12:29:41.763490 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-jdgmv" Mar 18 12:29:41 crc kubenswrapper[4843]: I0318 12:29:41.772785 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54" Mar 18 12:29:41 crc kubenswrapper[4843]: I0318 12:29:41.934605 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c" event={"ID":"78bea798-f276-4bec-9b9f-32148b813f3e","Type":"ContainerStarted","Data":"8a0b0f689f32c4b16e10b161ee3de0d44815a3c7e8c94e6c9bedd7c9c9d8c326"} Mar 18 12:29:42 crc kubenswrapper[4843]: I0318 12:29:42.086151 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54"] Mar 18 12:29:42 crc kubenswrapper[4843]: I0318 12:29:42.942681 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54" event={"ID":"503b0a02-e1c5-4375-94f7-466bb80e97c8","Type":"ContainerStarted","Data":"ae3e592026599998c0177fcfb121c80dfc6279d7f7ce4d1aa7dd99eb0d1e8358"} Mar 18 12:29:42 crc kubenswrapper[4843]: I0318 12:29:42.943862 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-bs4lw" event={"ID":"3d98bd17-e9d9-414f-a4b6-95975ab5ba2f","Type":"ContainerStarted","Data":"84b793bf822938de4af402a270c10eb23e9b66b692b4b5a7e09a6954f044eb6d"} Mar 18 12:29:42 crc kubenswrapper[4843]: I0318 12:29:42.944216 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-bs4lw" Mar 18 12:29:42 crc kubenswrapper[4843]: I0318 12:29:42.966360 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-bs4lw" podStartSLOduration=2.822599209 podStartE2EDuration="33.966336225s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:10.67238242 +0000 UTC 
m=+1184.388207944" lastFinishedPulling="2026-03-18 12:29:41.816119436 +0000 UTC m=+1215.531944960" observedRunningTime="2026-03-18 12:29:42.958477511 +0000 UTC m=+1216.674303055" watchObservedRunningTime="2026-03-18 12:29:42.966336225 +0000 UTC m=+1216.682161789" Mar 18 12:29:49 crc kubenswrapper[4843]: I0318 12:29:49.702277 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-bs4lw" Mar 18 12:29:49 crc kubenswrapper[4843]: I0318 12:29:49.739557 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-vrstp" Mar 18 12:29:49 crc kubenswrapper[4843]: I0318 12:29:49.750054 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-vrstp" Mar 18 12:29:50 crc kubenswrapper[4843]: I0318 12:29:50.034640 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:29:50 crc kubenswrapper[4843]: I0318 12:29:50.034712 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:29:50 crc kubenswrapper[4843]: I0318 12:29:50.290972 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w4wgm" Mar 18 12:29:52 crc kubenswrapper[4843]: E0318 12:29:52.418569 4843 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:a4cb438fef247332815b032c8a248bc65b873274aaac92478a22aa2f915798db" Mar 18 12:29:52 crc kubenswrapper[4843]: E0318 12:29:52.418837 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:a4cb438fef247332815b032c8a248bc65b873274aaac92478a22aa2f915798db,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rz6wx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-7b9c774f96-92l6c_openstack-operators(78bea798-f276-4bec-9b9f-32148b813f3e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:29:52 crc kubenswrapper[4843]: E0318 12:29:52.420937 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c" podUID="78bea798-f276-4bec-9b9f-32148b813f3e" Mar 18 12:29:53 crc kubenswrapper[4843]: E0318 12:29:53.066104 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:a4cb438fef247332815b032c8a248bc65b873274aaac92478a22aa2f915798db\\\"\"" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c" podUID="78bea798-f276-4bec-9b9f-32148b813f3e" Mar 18 12:29:54 crc kubenswrapper[4843]: I0318 12:29:54.076105 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54" event={"ID":"503b0a02-e1c5-4375-94f7-466bb80e97c8","Type":"ContainerStarted","Data":"9de5e3a47073afb1f51cebb17f254fbb6597827e783a746605a49e92b788f7d0"} Mar 18 12:29:54 crc kubenswrapper[4843]: I0318 12:29:54.079024 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54" Mar 18 12:29:54 crc kubenswrapper[4843]: I0318 12:29:54.131473 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54" podStartSLOduration=34.086905546 podStartE2EDuration="45.131436065s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:42.093468458 +0000 UTC m=+1215.809293982" lastFinishedPulling="2026-03-18 12:29:53.137998967 +0000 UTC m=+1226.853824501" observedRunningTime="2026-03-18 12:29:54.116613063 +0000 UTC m=+1227.832438617" watchObservedRunningTime="2026-03-18 12:29:54.131436065 +0000 UTC m=+1227.847261629" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.153786 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563950-8lgnx"] Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.155694 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563950-8lgnx" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.160710 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.162712 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl"] Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.163265 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.163431 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.163817 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.166115 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.166308 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.173595 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563950-8lgnx"] Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.179833 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl"] Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.310713 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e18e83a1-efe3-4695-9897-f0ca13d4bedf-secret-volume\") pod \"collect-profiles-29563950-nqssl\" (UID: \"e18e83a1-efe3-4695-9897-f0ca13d4bedf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.311032 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbvgq\" (UniqueName: \"kubernetes.io/projected/514b942b-9005-40b3-8671-8a765b1844d0-kube-api-access-xbvgq\") pod \"auto-csr-approver-29563950-8lgnx\" (UID: \"514b942b-9005-40b3-8671-8a765b1844d0\") " pod="openshift-infra/auto-csr-approver-29563950-8lgnx" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.311356 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljsxl\" (UniqueName: \"kubernetes.io/projected/e18e83a1-efe3-4695-9897-f0ca13d4bedf-kube-api-access-ljsxl\") pod \"collect-profiles-29563950-nqssl\" (UID: \"e18e83a1-efe3-4695-9897-f0ca13d4bedf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.311578 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e18e83a1-efe3-4695-9897-f0ca13d4bedf-config-volume\") pod \"collect-profiles-29563950-nqssl\" (UID: \"e18e83a1-efe3-4695-9897-f0ca13d4bedf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.412606 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljsxl\" (UniqueName: \"kubernetes.io/projected/e18e83a1-efe3-4695-9897-f0ca13d4bedf-kube-api-access-ljsxl\") pod \"collect-profiles-29563950-nqssl\" (UID: \"e18e83a1-efe3-4695-9897-f0ca13d4bedf\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.413022 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e18e83a1-efe3-4695-9897-f0ca13d4bedf-config-volume\") pod \"collect-profiles-29563950-nqssl\" (UID: \"e18e83a1-efe3-4695-9897-f0ca13d4bedf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.413163 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e18e83a1-efe3-4695-9897-f0ca13d4bedf-secret-volume\") pod \"collect-profiles-29563950-nqssl\" (UID: \"e18e83a1-efe3-4695-9897-f0ca13d4bedf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.413266 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbvgq\" (UniqueName: \"kubernetes.io/projected/514b942b-9005-40b3-8671-8a765b1844d0-kube-api-access-xbvgq\") pod \"auto-csr-approver-29563950-8lgnx\" (UID: \"514b942b-9005-40b3-8671-8a765b1844d0\") " pod="openshift-infra/auto-csr-approver-29563950-8lgnx" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.414177 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e18e83a1-efe3-4695-9897-f0ca13d4bedf-config-volume\") pod \"collect-profiles-29563950-nqssl\" (UID: \"e18e83a1-efe3-4695-9897-f0ca13d4bedf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.427412 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e18e83a1-efe3-4695-9897-f0ca13d4bedf-secret-volume\") pod 
\"collect-profiles-29563950-nqssl\" (UID: \"e18e83a1-efe3-4695-9897-f0ca13d4bedf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.431904 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbvgq\" (UniqueName: \"kubernetes.io/projected/514b942b-9005-40b3-8671-8a765b1844d0-kube-api-access-xbvgq\") pod \"auto-csr-approver-29563950-8lgnx\" (UID: \"514b942b-9005-40b3-8671-8a765b1844d0\") " pod="openshift-infra/auto-csr-approver-29563950-8lgnx" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.435080 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljsxl\" (UniqueName: \"kubernetes.io/projected/e18e83a1-efe3-4695-9897-f0ca13d4bedf-kube-api-access-ljsxl\") pod \"collect-profiles-29563950-nqssl\" (UID: \"e18e83a1-efe3-4695-9897-f0ca13d4bedf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.526817 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563950-8lgnx" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.534224 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl" Mar 18 12:30:00 crc kubenswrapper[4843]: I0318 12:30:00.775685 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563950-8lgnx"] Mar 18 12:30:01 crc kubenswrapper[4843]: I0318 12:30:01.031778 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl"] Mar 18 12:30:01 crc kubenswrapper[4843]: W0318 12:30:01.036889 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode18e83a1_efe3_4695_9897_f0ca13d4bedf.slice/crio-6769a5c56b66f217efce5ca0a7a039998d4de8a31ab42ae8f415a6dae810fdef WatchSource:0}: Error finding container 6769a5c56b66f217efce5ca0a7a039998d4de8a31ab42ae8f415a6dae810fdef: Status 404 returned error can't find the container with id 6769a5c56b66f217efce5ca0a7a039998d4de8a31ab42ae8f415a6dae810fdef Mar 18 12:30:01 crc kubenswrapper[4843]: I0318 12:30:01.137085 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl" event={"ID":"e18e83a1-efe3-4695-9897-f0ca13d4bedf","Type":"ContainerStarted","Data":"6769a5c56b66f217efce5ca0a7a039998d4de8a31ab42ae8f415a6dae810fdef"} Mar 18 12:30:01 crc kubenswrapper[4843]: I0318 12:30:01.138445 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563950-8lgnx" event={"ID":"514b942b-9005-40b3-8671-8a765b1844d0","Type":"ContainerStarted","Data":"ce1ccc4f511f06f2b51294416c161abf95faf408fa522ce6952d7e256121b381"} Mar 18 12:30:01 crc kubenswrapper[4843]: I0318 12:30:01.781850 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-6lz54" Mar 18 12:30:02 crc kubenswrapper[4843]: I0318 12:30:02.146992 4843 generic.go:334] "Generic 
(PLEG): container finished" podID="e18e83a1-efe3-4695-9897-f0ca13d4bedf" containerID="29e56a3dc8eada61b02d762e9ae7cda6f7f5ecd86146748f0c2dd10f62883a64" exitCode=0 Mar 18 12:30:02 crc kubenswrapper[4843]: I0318 12:30:02.147052 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl" event={"ID":"e18e83a1-efe3-4695-9897-f0ca13d4bedf","Type":"ContainerDied","Data":"29e56a3dc8eada61b02d762e9ae7cda6f7f5ecd86146748f0c2dd10f62883a64"} Mar 18 12:30:03 crc kubenswrapper[4843]: I0318 12:30:03.154052 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563950-8lgnx" event={"ID":"514b942b-9005-40b3-8671-8a765b1844d0","Type":"ContainerStarted","Data":"165a6da61c0dd31b49a64173d4036ea27179b8d3410a706091c805452b80b33d"} Mar 18 12:30:03 crc kubenswrapper[4843]: I0318 12:30:03.179628 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563950-8lgnx" podStartSLOduration=1.38750503 podStartE2EDuration="3.179607054s" podCreationTimestamp="2026-03-18 12:30:00 +0000 UTC" firstStartedPulling="2026-03-18 12:30:00.788036243 +0000 UTC m=+1234.503861767" lastFinishedPulling="2026-03-18 12:30:02.580138267 +0000 UTC m=+1236.295963791" observedRunningTime="2026-03-18 12:30:03.172264445 +0000 UTC m=+1236.888089969" watchObservedRunningTime="2026-03-18 12:30:03.179607054 +0000 UTC m=+1236.895432598" Mar 18 12:30:03 crc kubenswrapper[4843]: I0318 12:30:03.429724 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl" Mar 18 12:30:03 crc kubenswrapper[4843]: E0318 12:30:03.433888 4843 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod514b942b_9005_40b3_8671_8a765b1844d0.slice/crio-165a6da61c0dd31b49a64173d4036ea27179b8d3410a706091c805452b80b33d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod514b942b_9005_40b3_8671_8a765b1844d0.slice/crio-conmon-165a6da61c0dd31b49a64173d4036ea27179b8d3410a706091c805452b80b33d.scope\": RecentStats: unable to find data in memory cache]" Mar 18 12:30:03 crc kubenswrapper[4843]: I0318 12:30:03.472384 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljsxl\" (UniqueName: \"kubernetes.io/projected/e18e83a1-efe3-4695-9897-f0ca13d4bedf-kube-api-access-ljsxl\") pod \"e18e83a1-efe3-4695-9897-f0ca13d4bedf\" (UID: \"e18e83a1-efe3-4695-9897-f0ca13d4bedf\") " Mar 18 12:30:03 crc kubenswrapper[4843]: I0318 12:30:03.472462 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e18e83a1-efe3-4695-9897-f0ca13d4bedf-secret-volume\") pod \"e18e83a1-efe3-4695-9897-f0ca13d4bedf\" (UID: \"e18e83a1-efe3-4695-9897-f0ca13d4bedf\") " Mar 18 12:30:03 crc kubenswrapper[4843]: I0318 12:30:03.472493 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e18e83a1-efe3-4695-9897-f0ca13d4bedf-config-volume\") pod \"e18e83a1-efe3-4695-9897-f0ca13d4bedf\" (UID: \"e18e83a1-efe3-4695-9897-f0ca13d4bedf\") " Mar 18 12:30:03 crc kubenswrapper[4843]: I0318 12:30:03.473129 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e18e83a1-efe3-4695-9897-f0ca13d4bedf-config-volume" (OuterVolumeSpecName: "config-volume") pod "e18e83a1-efe3-4695-9897-f0ca13d4bedf" (UID: "e18e83a1-efe3-4695-9897-f0ca13d4bedf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:03 crc kubenswrapper[4843]: I0318 12:30:03.478540 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e18e83a1-efe3-4695-9897-f0ca13d4bedf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e18e83a1-efe3-4695-9897-f0ca13d4bedf" (UID: "e18e83a1-efe3-4695-9897-f0ca13d4bedf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:30:03 crc kubenswrapper[4843]: I0318 12:30:03.480352 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e18e83a1-efe3-4695-9897-f0ca13d4bedf-kube-api-access-ljsxl" (OuterVolumeSpecName: "kube-api-access-ljsxl") pod "e18e83a1-efe3-4695-9897-f0ca13d4bedf" (UID: "e18e83a1-efe3-4695-9897-f0ca13d4bedf"). InnerVolumeSpecName "kube-api-access-ljsxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:03 crc kubenswrapper[4843]: I0318 12:30:03.574395 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljsxl\" (UniqueName: \"kubernetes.io/projected/e18e83a1-efe3-4695-9897-f0ca13d4bedf-kube-api-access-ljsxl\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:03 crc kubenswrapper[4843]: I0318 12:30:03.574442 4843 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e18e83a1-efe3-4695-9897-f0ca13d4bedf-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:03 crc kubenswrapper[4843]: I0318 12:30:03.574461 4843 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e18e83a1-efe3-4695-9897-f0ca13d4bedf-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:04 crc kubenswrapper[4843]: I0318 12:30:04.173520 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl" Mar 18 12:30:04 crc kubenswrapper[4843]: I0318 12:30:04.173523 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl" event={"ID":"e18e83a1-efe3-4695-9897-f0ca13d4bedf","Type":"ContainerDied","Data":"6769a5c56b66f217efce5ca0a7a039998d4de8a31ab42ae8f415a6dae810fdef"} Mar 18 12:30:04 crc kubenswrapper[4843]: I0318 12:30:04.174529 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6769a5c56b66f217efce5ca0a7a039998d4de8a31ab42ae8f415a6dae810fdef" Mar 18 12:30:04 crc kubenswrapper[4843]: I0318 12:30:04.176506 4843 generic.go:334] "Generic (PLEG): container finished" podID="514b942b-9005-40b3-8671-8a765b1844d0" containerID="165a6da61c0dd31b49a64173d4036ea27179b8d3410a706091c805452b80b33d" exitCode=0 Mar 18 12:30:04 crc kubenswrapper[4843]: I0318 12:30:04.176581 4843 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563950-8lgnx" event={"ID":"514b942b-9005-40b3-8671-8a765b1844d0","Type":"ContainerDied","Data":"165a6da61c0dd31b49a64173d4036ea27179b8d3410a706091c805452b80b33d"} Mar 18 12:30:05 crc kubenswrapper[4843]: I0318 12:30:05.184765 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c" event={"ID":"78bea798-f276-4bec-9b9f-32148b813f3e","Type":"ContainerStarted","Data":"76cd0c7f8e62f219e4d454c70c46b064fbfec960d2e7e54a29130b7473ac3463"} Mar 18 12:30:05 crc kubenswrapper[4843]: I0318 12:30:05.184962 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c" Mar 18 12:30:05 crc kubenswrapper[4843]: I0318 12:30:05.205613 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c" podStartSLOduration=33.173318732 podStartE2EDuration="56.205595509s" podCreationTimestamp="2026-03-18 12:29:09 +0000 UTC" firstStartedPulling="2026-03-18 12:29:41.661763014 +0000 UTC m=+1215.377588538" lastFinishedPulling="2026-03-18 12:30:04.694039751 +0000 UTC m=+1238.409865315" observedRunningTime="2026-03-18 12:30:05.201439421 +0000 UTC m=+1238.917264945" watchObservedRunningTime="2026-03-18 12:30:05.205595509 +0000 UTC m=+1238.921421043" Mar 18 12:30:05 crc kubenswrapper[4843]: I0318 12:30:05.437815 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563950-8lgnx" Mar 18 12:30:05 crc kubenswrapper[4843]: I0318 12:30:05.610286 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbvgq\" (UniqueName: \"kubernetes.io/projected/514b942b-9005-40b3-8671-8a765b1844d0-kube-api-access-xbvgq\") pod \"514b942b-9005-40b3-8671-8a765b1844d0\" (UID: \"514b942b-9005-40b3-8671-8a765b1844d0\") " Mar 18 12:30:05 crc kubenswrapper[4843]: I0318 12:30:05.617976 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/514b942b-9005-40b3-8671-8a765b1844d0-kube-api-access-xbvgq" (OuterVolumeSpecName: "kube-api-access-xbvgq") pod "514b942b-9005-40b3-8671-8a765b1844d0" (UID: "514b942b-9005-40b3-8671-8a765b1844d0"). InnerVolumeSpecName "kube-api-access-xbvgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:05 crc kubenswrapper[4843]: I0318 12:30:05.712292 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbvgq\" (UniqueName: \"kubernetes.io/projected/514b942b-9005-40b3-8671-8a765b1844d0-kube-api-access-xbvgq\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:06 crc kubenswrapper[4843]: I0318 12:30:06.197280 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563950-8lgnx" event={"ID":"514b942b-9005-40b3-8671-8a765b1844d0","Type":"ContainerDied","Data":"ce1ccc4f511f06f2b51294416c161abf95faf408fa522ce6952d7e256121b381"} Mar 18 12:30:06 crc kubenswrapper[4843]: I0318 12:30:06.197328 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563950-8lgnx" Mar 18 12:30:06 crc kubenswrapper[4843]: I0318 12:30:06.197348 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce1ccc4f511f06f2b51294416c161abf95faf408fa522ce6952d7e256121b381" Mar 18 12:30:06 crc kubenswrapper[4843]: I0318 12:30:06.288937 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563944-69jz4"] Mar 18 12:30:06 crc kubenswrapper[4843]: I0318 12:30:06.296017 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563944-69jz4"] Mar 18 12:30:07 crc kubenswrapper[4843]: I0318 12:30:06.999780 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d999a45-20dd-4cc6-820a-1d54ca5f0fe2" path="/var/lib/kubelet/pods/0d999a45-20dd-4cc6-820a-1d54ca5f0fe2/volumes" Mar 18 12:30:11 crc kubenswrapper[4843]: I0318 12:30:11.454716 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-92l6c" Mar 18 12:30:20 crc kubenswrapper[4843]: I0318 12:30:20.034940 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:30:20 crc kubenswrapper[4843]: I0318 12:30:20.035642 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.363704 4843 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-675f4bcbfc-ndhdd"] Mar 18 12:30:32 crc kubenswrapper[4843]: E0318 12:30:32.364600 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514b942b-9005-40b3-8671-8a765b1844d0" containerName="oc" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.364615 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="514b942b-9005-40b3-8671-8a765b1844d0" containerName="oc" Mar 18 12:30:32 crc kubenswrapper[4843]: E0318 12:30:32.364680 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e18e83a1-efe3-4695-9897-f0ca13d4bedf" containerName="collect-profiles" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.364692 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18e83a1-efe3-4695-9897-f0ca13d4bedf" containerName="collect-profiles" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.364913 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="e18e83a1-efe3-4695-9897-f0ca13d4bedf" containerName="collect-profiles" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.364949 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="514b942b-9005-40b3-8671-8a765b1844d0" containerName="oc" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.366147 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ndhdd" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.368766 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.369392 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-lff6d" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.369622 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.369772 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.371615 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ndhdd"] Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.429475 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-52dv9"] Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.432058 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-52dv9" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.440739 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-52dv9"] Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.447632 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.481170 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a1a107-f4a2-4d98-9152-33fd271206bc-config\") pod \"dnsmasq-dns-675f4bcbfc-ndhdd\" (UID: \"22a1a107-f4a2-4d98-9152-33fd271206bc\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ndhdd" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.481231 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2rvt\" (UniqueName: \"kubernetes.io/projected/22a1a107-f4a2-4d98-9152-33fd271206bc-kube-api-access-j2rvt\") pod \"dnsmasq-dns-675f4bcbfc-ndhdd\" (UID: \"22a1a107-f4a2-4d98-9152-33fd271206bc\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ndhdd" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.582774 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a1a107-f4a2-4d98-9152-33fd271206bc-config\") pod \"dnsmasq-dns-675f4bcbfc-ndhdd\" (UID: \"22a1a107-f4a2-4d98-9152-33fd271206bc\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ndhdd" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.582890 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/414f50e8-c87d-4ef6-8cd2-40dcf12f5ade-config\") pod \"dnsmasq-dns-78dd6ddcc-52dv9\" (UID: \"414f50e8-c87d-4ef6-8cd2-40dcf12f5ade\") " pod="openstack/dnsmasq-dns-78dd6ddcc-52dv9" Mar 18 12:30:32 crc 
kubenswrapper[4843]: I0318 12:30:32.582940 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2rvt\" (UniqueName: \"kubernetes.io/projected/22a1a107-f4a2-4d98-9152-33fd271206bc-kube-api-access-j2rvt\") pod \"dnsmasq-dns-675f4bcbfc-ndhdd\" (UID: \"22a1a107-f4a2-4d98-9152-33fd271206bc\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ndhdd" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.583014 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/414f50e8-c87d-4ef6-8cd2-40dcf12f5ade-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-52dv9\" (UID: \"414f50e8-c87d-4ef6-8cd2-40dcf12f5ade\") " pod="openstack/dnsmasq-dns-78dd6ddcc-52dv9" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.583100 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv6xr\" (UniqueName: \"kubernetes.io/projected/414f50e8-c87d-4ef6-8cd2-40dcf12f5ade-kube-api-access-hv6xr\") pod \"dnsmasq-dns-78dd6ddcc-52dv9\" (UID: \"414f50e8-c87d-4ef6-8cd2-40dcf12f5ade\") " pod="openstack/dnsmasq-dns-78dd6ddcc-52dv9" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.585438 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a1a107-f4a2-4d98-9152-33fd271206bc-config\") pod \"dnsmasq-dns-675f4bcbfc-ndhdd\" (UID: \"22a1a107-f4a2-4d98-9152-33fd271206bc\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ndhdd" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.608582 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2rvt\" (UniqueName: \"kubernetes.io/projected/22a1a107-f4a2-4d98-9152-33fd271206bc-kube-api-access-j2rvt\") pod \"dnsmasq-dns-675f4bcbfc-ndhdd\" (UID: \"22a1a107-f4a2-4d98-9152-33fd271206bc\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ndhdd" Mar 18 12:30:32 crc kubenswrapper[4843]: 
I0318 12:30:32.684556 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ndhdd" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.684920 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/414f50e8-c87d-4ef6-8cd2-40dcf12f5ade-config\") pod \"dnsmasq-dns-78dd6ddcc-52dv9\" (UID: \"414f50e8-c87d-4ef6-8cd2-40dcf12f5ade\") " pod="openstack/dnsmasq-dns-78dd6ddcc-52dv9" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.684986 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/414f50e8-c87d-4ef6-8cd2-40dcf12f5ade-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-52dv9\" (UID: \"414f50e8-c87d-4ef6-8cd2-40dcf12f5ade\") " pod="openstack/dnsmasq-dns-78dd6ddcc-52dv9" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.685026 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv6xr\" (UniqueName: \"kubernetes.io/projected/414f50e8-c87d-4ef6-8cd2-40dcf12f5ade-kube-api-access-hv6xr\") pod \"dnsmasq-dns-78dd6ddcc-52dv9\" (UID: \"414f50e8-c87d-4ef6-8cd2-40dcf12f5ade\") " pod="openstack/dnsmasq-dns-78dd6ddcc-52dv9" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.686012 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/414f50e8-c87d-4ef6-8cd2-40dcf12f5ade-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-52dv9\" (UID: \"414f50e8-c87d-4ef6-8cd2-40dcf12f5ade\") " pod="openstack/dnsmasq-dns-78dd6ddcc-52dv9" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.686816 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/414f50e8-c87d-4ef6-8cd2-40dcf12f5ade-config\") pod \"dnsmasq-dns-78dd6ddcc-52dv9\" (UID: \"414f50e8-c87d-4ef6-8cd2-40dcf12f5ade\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-52dv9" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.704382 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv6xr\" (UniqueName: \"kubernetes.io/projected/414f50e8-c87d-4ef6-8cd2-40dcf12f5ade-kube-api-access-hv6xr\") pod \"dnsmasq-dns-78dd6ddcc-52dv9\" (UID: \"414f50e8-c87d-4ef6-8cd2-40dcf12f5ade\") " pod="openstack/dnsmasq-dns-78dd6ddcc-52dv9" Mar 18 12:30:32 crc kubenswrapper[4843]: I0318 12:30:32.756314 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-52dv9" Mar 18 12:30:33 crc kubenswrapper[4843]: I0318 12:30:33.168041 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ndhdd"] Mar 18 12:30:33 crc kubenswrapper[4843]: I0318 12:30:33.179885 4843 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:30:33 crc kubenswrapper[4843]: I0318 12:30:33.229922 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-52dv9"] Mar 18 12:30:33 crc kubenswrapper[4843]: W0318 12:30:33.233356 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod414f50e8_c87d_4ef6_8cd2_40dcf12f5ade.slice/crio-d73b8b829b45ae69bd69676148279a9b98e575cc1a085003382b5ed7464bd65b WatchSource:0}: Error finding container d73b8b829b45ae69bd69676148279a9b98e575cc1a085003382b5ed7464bd65b: Status 404 returned error can't find the container with id d73b8b829b45ae69bd69676148279a9b98e575cc1a085003382b5ed7464bd65b Mar 18 12:30:33 crc kubenswrapper[4843]: I0318 12:30:33.456913 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-52dv9" event={"ID":"414f50e8-c87d-4ef6-8cd2-40dcf12f5ade","Type":"ContainerStarted","Data":"d73b8b829b45ae69bd69676148279a9b98e575cc1a085003382b5ed7464bd65b"} Mar 18 12:30:33 crc 
kubenswrapper[4843]: I0318 12:30:33.458317 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-ndhdd" event={"ID":"22a1a107-f4a2-4d98-9152-33fd271206bc","Type":"ContainerStarted","Data":"2313f7aaa9f79f1ea627a9b4b50bef5a1cc8707402a5af583bdcd3659e0216bd"} Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.174222 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ndhdd"] Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.205576 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-v8m9t"] Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.206772 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.242959 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-v8m9t"] Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.331597 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcc3bbed-56b9-41a8-baaa-b79ff13848ed-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-v8m9t\" (UID: \"fcc3bbed-56b9-41a8-baaa-b79ff13848ed\") " pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.331697 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcc3bbed-56b9-41a8-baaa-b79ff13848ed-config\") pod \"dnsmasq-dns-5ccc8479f9-v8m9t\" (UID: \"fcc3bbed-56b9-41a8-baaa-b79ff13848ed\") " pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.331732 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lngcf\" (UniqueName: 
\"kubernetes.io/projected/fcc3bbed-56b9-41a8-baaa-b79ff13848ed-kube-api-access-lngcf\") pod \"dnsmasq-dns-5ccc8479f9-v8m9t\" (UID: \"fcc3bbed-56b9-41a8-baaa-b79ff13848ed\") " pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.433605 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcc3bbed-56b9-41a8-baaa-b79ff13848ed-config\") pod \"dnsmasq-dns-5ccc8479f9-v8m9t\" (UID: \"fcc3bbed-56b9-41a8-baaa-b79ff13848ed\") " pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.433930 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lngcf\" (UniqueName: \"kubernetes.io/projected/fcc3bbed-56b9-41a8-baaa-b79ff13848ed-kube-api-access-lngcf\") pod \"dnsmasq-dns-5ccc8479f9-v8m9t\" (UID: \"fcc3bbed-56b9-41a8-baaa-b79ff13848ed\") " pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.434021 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcc3bbed-56b9-41a8-baaa-b79ff13848ed-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-v8m9t\" (UID: \"fcc3bbed-56b9-41a8-baaa-b79ff13848ed\") " pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.434837 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcc3bbed-56b9-41a8-baaa-b79ff13848ed-config\") pod \"dnsmasq-dns-5ccc8479f9-v8m9t\" (UID: \"fcc3bbed-56b9-41a8-baaa-b79ff13848ed\") " pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.438245 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcc3bbed-56b9-41a8-baaa-b79ff13848ed-dns-svc\") pod 
\"dnsmasq-dns-5ccc8479f9-v8m9t\" (UID: \"fcc3bbed-56b9-41a8-baaa-b79ff13848ed\") " pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.468028 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lngcf\" (UniqueName: \"kubernetes.io/projected/fcc3bbed-56b9-41a8-baaa-b79ff13848ed-kube-api-access-lngcf\") pod \"dnsmasq-dns-5ccc8479f9-v8m9t\" (UID: \"fcc3bbed-56b9-41a8-baaa-b79ff13848ed\") " pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.474597 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-52dv9"] Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.503286 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bgzck"] Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.507076 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.516766 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bgzck"] Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.529405 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.641397 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91d7bffb-7c24-4a70-a412-258080407683-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-bgzck\" (UID: \"91d7bffb-7c24-4a70-a412-258080407683\") " pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.641442 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91d7bffb-7c24-4a70-a412-258080407683-config\") pod \"dnsmasq-dns-57d769cc4f-bgzck\" (UID: \"91d7bffb-7c24-4a70-a412-258080407683\") " pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.641494 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfxsn\" (UniqueName: \"kubernetes.io/projected/91d7bffb-7c24-4a70-a412-258080407683-kube-api-access-bfxsn\") pod \"dnsmasq-dns-57d769cc4f-bgzck\" (UID: \"91d7bffb-7c24-4a70-a412-258080407683\") " pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.743246 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91d7bffb-7c24-4a70-a412-258080407683-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-bgzck\" (UID: \"91d7bffb-7c24-4a70-a412-258080407683\") " pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.743572 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91d7bffb-7c24-4a70-a412-258080407683-config\") pod \"dnsmasq-dns-57d769cc4f-bgzck\" (UID: \"91d7bffb-7c24-4a70-a412-258080407683\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.744368 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91d7bffb-7c24-4a70-a412-258080407683-config\") pod \"dnsmasq-dns-57d769cc4f-bgzck\" (UID: \"91d7bffb-7c24-4a70-a412-258080407683\") " pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.744616 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfxsn\" (UniqueName: \"kubernetes.io/projected/91d7bffb-7c24-4a70-a412-258080407683-kube-api-access-bfxsn\") pod \"dnsmasq-dns-57d769cc4f-bgzck\" (UID: \"91d7bffb-7c24-4a70-a412-258080407683\") " pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.744700 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91d7bffb-7c24-4a70-a412-258080407683-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-bgzck\" (UID: \"91d7bffb-7c24-4a70-a412-258080407683\") " pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.783163 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfxsn\" (UniqueName: \"kubernetes.io/projected/91d7bffb-7c24-4a70-a412-258080407683-kube-api-access-bfxsn\") pod \"dnsmasq-dns-57d769cc4f-bgzck\" (UID: \"91d7bffb-7c24-4a70-a412-258080407683\") " pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" Mar 18 12:30:35 crc kubenswrapper[4843]: I0318 12:30:35.835419 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.026136 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-v8m9t"] Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.285221 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bgzck"] Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.367017 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.369968 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.371985 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.372959 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.373190 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.373345 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.373412 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.374924 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rtkzr" Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.374925 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 
12:30:36.375134 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.457659 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2gkk\" (UniqueName: \"kubernetes.io/projected/257240a5-cc42-4354-9079-66e6de070b34-kube-api-access-r2gkk\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.457708 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.457742 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/257240a5-cc42-4354-9079-66e6de070b34-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.457764 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/257240a5-cc42-4354-9079-66e6de070b34-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.457778 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/257240a5-cc42-4354-9079-66e6de070b34-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.457807 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.457841 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/257240a5-cc42-4354-9079-66e6de070b34-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.457855 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.457871 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/257240a5-cc42-4354-9079-66e6de070b34-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.457890 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.457913 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.559034 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/257240a5-cc42-4354-9079-66e6de070b34-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.559077 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.559104 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/257240a5-cc42-4354-9079-66e6de070b34-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.559132 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.559175 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.559228 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2gkk\" (UniqueName: \"kubernetes.io/projected/257240a5-cc42-4354-9079-66e6de070b34-kube-api-access-r2gkk\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.559261 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.559293 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/257240a5-cc42-4354-9079-66e6de070b34-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.559319 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/257240a5-cc42-4354-9079-66e6de070b34-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.559340 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/257240a5-cc42-4354-9079-66e6de070b34-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.559378 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.559784 4843 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.563155 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/257240a5-cc42-4354-9079-66e6de070b34-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.563369 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.564020 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.564055 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.564635 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/257240a5-cc42-4354-9079-66e6de070b34-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.564924 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/257240a5-cc42-4354-9079-66e6de070b34-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.565383 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/257240a5-cc42-4354-9079-66e6de070b34-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.566554 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/257240a5-cc42-4354-9079-66e6de070b34-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.568493 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.582296 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2gkk\" (UniqueName: \"kubernetes.io/projected/257240a5-cc42-4354-9079-66e6de070b34-kube-api-access-r2gkk\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.583243 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.620631 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.622246 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.625857 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ndvrz"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.625914 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.626188 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.626194 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.626284 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.626427 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.627419 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.630520 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.693689 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.763592 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c41f082-cf59-42b4-8314-64aace288dd1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.763666 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.763905 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppb4q\" (UniqueName: \"kubernetes.io/projected/1c41f082-cf59-42b4-8314-64aace288dd1-kube-api-access-ppb4q\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.764055 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.764139 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c41f082-cf59-42b4-8314-64aace288dd1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.764339 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c41f082-cf59-42b4-8314-64aace288dd1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.764485 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.764623 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.764815 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c41f082-cf59-42b4-8314-64aace288dd1-config-data\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.764929 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.765056 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c41f082-cf59-42b4-8314-64aace288dd1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.866705 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c41f082-cf59-42b4-8314-64aace288dd1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.866748 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c41f082-cf59-42b4-8314-64aace288dd1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.866775 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.866799 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.866834 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c41f082-cf59-42b4-8314-64aace288dd1-config-data\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.866852 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.866875 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c41f082-cf59-42b4-8314-64aace288dd1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.866903 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c41f082-cf59-42b4-8314-64aace288dd1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.866934 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.866956 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppb4q\" (UniqueName: \"kubernetes.io/projected/1c41f082-cf59-42b4-8314-64aace288dd1-kube-api-access-ppb4q\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.866993 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.867897 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.867950 4843 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.868329 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.868413 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c41f082-cf59-42b4-8314-64aace288dd1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.868432 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c41f082-cf59-42b4-8314-64aace288dd1-config-data\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.868712 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c41f082-cf59-42b4-8314-64aace288dd1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.872852 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.872609 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c41f082-cf59-42b4-8314-64aace288dd1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.881887 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c41f082-cf59-42b4-8314-64aace288dd1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.881965 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.885343 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppb4q\" (UniqueName: \"kubernetes.io/projected/1c41f082-cf59-42b4-8314-64aace288dd1-kube-api-access-ppb4q\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.921908 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:30:36 crc kubenswrapper[4843]: I0318 12:30:36.958060 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.749129 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.751460 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.758090 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.758358 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.758402 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-7x24z"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.758544 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.767932 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.774175 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.886013 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb4fj\" (UniqueName: \"kubernetes.io/projected/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-kube-api-access-nb4fj\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.886068 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.886138 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.886173 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-config-data-default\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.886198 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.886290 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.886315 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.886336 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-kolla-config\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.987291 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-config-data-default\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.987343 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.987423 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.987450 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.987473 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-kolla-config\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.987515 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb4fj\" (UniqueName: \"kubernetes.io/projected/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-kube-api-access-nb4fj\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.987546 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.987608 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.988747 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.988757 4843 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.989560 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-config-data-default\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.990046 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:37 crc kubenswrapper[4843]: I0318 12:30:37.990474 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-kolla-config\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:38 crc kubenswrapper[4843]: I0318 12:30:37.993871 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:38 crc kubenswrapper[4843]: I0318 12:30:37.994330 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:38 crc kubenswrapper[4843]: I0318 12:30:38.008042 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb4fj\" (UniqueName: \"kubernetes.io/projected/a384e3a7-e6ad-4832-8218-ba3f11df2c2f-kube-api-access-nb4fj\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:38 crc kubenswrapper[4843]: I0318 12:30:38.014006 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"a384e3a7-e6ad-4832-8218-ba3f11df2c2f\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:38 crc kubenswrapper[4843]: I0318 12:30:38.087682 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.127290 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.128705 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.131346 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.131376 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.131603 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.134860 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-9rfsv"
Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.152813 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.240740 4843 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.240815 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgb9m\" (UniqueName: \"kubernetes.io/projected/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-kube-api-access-lgb9m\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.240939 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.240984 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.241109 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc 
kubenswrapper[4843]: I0318 12:30:39.241167 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.241219 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.241283 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.342403 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.342472 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 
12:30:39.342533 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.342563 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgb9m\" (UniqueName: \"kubernetes.io/projected/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-kube-api-access-lgb9m\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.342589 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.342604 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.342639 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.342677 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.342964 4843 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.344173 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.345459 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.355021 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.355661 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.363316 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.366024 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.371206 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgb9m\" (UniqueName: \"kubernetes.io/projected/9f1dd599-3b73-4b6c-8f80-0fbb1ce13520-kube-api-access-lgb9m\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.374896 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.376768 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.396985 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-fcntx" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.397160 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.397242 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.397505 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.403171 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.448541 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.520405 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" event={"ID":"fcc3bbed-56b9-41a8-baaa-b79ff13848ed","Type":"ContainerStarted","Data":"9faa216b243dcd95eac07863d4d200472992f5a609e606e4575b15f6d77f3233"} Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.558642 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c0d669e-b241-4702-96c3-2de893c52987-config-data\") pod \"memcached-0\" (UID: \"4c0d669e-b241-4702-96c3-2de893c52987\") " pod="openstack/memcached-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.558765 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4c0d669e-b241-4702-96c3-2de893c52987-kolla-config\") pod \"memcached-0\" (UID: \"4c0d669e-b241-4702-96c3-2de893c52987\") " pod="openstack/memcached-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.558795 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c0d669e-b241-4702-96c3-2de893c52987-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4c0d669e-b241-4702-96c3-2de893c52987\") " pod="openstack/memcached-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.558832 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm5pn\" (UniqueName: \"kubernetes.io/projected/4c0d669e-b241-4702-96c3-2de893c52987-kube-api-access-wm5pn\") pod \"memcached-0\" (UID: \"4c0d669e-b241-4702-96c3-2de893c52987\") " pod="openstack/memcached-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.558914 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c0d669e-b241-4702-96c3-2de893c52987-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4c0d669e-b241-4702-96c3-2de893c52987\") " pod="openstack/memcached-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.660718 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4c0d669e-b241-4702-96c3-2de893c52987-kolla-config\") pod \"memcached-0\" (UID: \"4c0d669e-b241-4702-96c3-2de893c52987\") " pod="openstack/memcached-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.660778 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c0d669e-b241-4702-96c3-2de893c52987-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4c0d669e-b241-4702-96c3-2de893c52987\") " pod="openstack/memcached-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.660829 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm5pn\" (UniqueName: \"kubernetes.io/projected/4c0d669e-b241-4702-96c3-2de893c52987-kube-api-access-wm5pn\") pod \"memcached-0\" (UID: \"4c0d669e-b241-4702-96c3-2de893c52987\") " pod="openstack/memcached-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.660905 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c0d669e-b241-4702-96c3-2de893c52987-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4c0d669e-b241-4702-96c3-2de893c52987\") " pod="openstack/memcached-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.660941 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c0d669e-b241-4702-96c3-2de893c52987-config-data\") pod \"memcached-0\" 
(UID: \"4c0d669e-b241-4702-96c3-2de893c52987\") " pod="openstack/memcached-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.661666 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4c0d669e-b241-4702-96c3-2de893c52987-kolla-config\") pod \"memcached-0\" (UID: \"4c0d669e-b241-4702-96c3-2de893c52987\") " pod="openstack/memcached-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.661779 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c0d669e-b241-4702-96c3-2de893c52987-config-data\") pod \"memcached-0\" (UID: \"4c0d669e-b241-4702-96c3-2de893c52987\") " pod="openstack/memcached-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.665174 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c0d669e-b241-4702-96c3-2de893c52987-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4c0d669e-b241-4702-96c3-2de893c52987\") " pod="openstack/memcached-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.668497 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c0d669e-b241-4702-96c3-2de893c52987-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4c0d669e-b241-4702-96c3-2de893c52987\") " pod="openstack/memcached-0" Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.676416 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm5pn\" (UniqueName: \"kubernetes.io/projected/4c0d669e-b241-4702-96c3-2de893c52987-kube-api-access-wm5pn\") pod \"memcached-0\" (UID: \"4c0d669e-b241-4702-96c3-2de893c52987\") " pod="openstack/memcached-0" Mar 18 12:30:39 crc kubenswrapper[4843]: W0318 12:30:39.690997 4843 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91d7bffb_7c24_4a70_a412_258080407683.slice/crio-ca8bbf1a9465fea6818f7cce1e37652f79ee2cbed8e8ea9238c8fb9944e22b29 WatchSource:0}: Error finding container ca8bbf1a9465fea6818f7cce1e37652f79ee2cbed8e8ea9238c8fb9944e22b29: Status 404 returned error can't find the container with id ca8bbf1a9465fea6818f7cce1e37652f79ee2cbed8e8ea9238c8fb9944e22b29 Mar 18 12:30:39 crc kubenswrapper[4843]: I0318 12:30:39.737075 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 12:30:40 crc kubenswrapper[4843]: I0318 12:30:40.529664 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" event={"ID":"91d7bffb-7c24-4a70-a412-258080407683","Type":"ContainerStarted","Data":"ca8bbf1a9465fea6818f7cce1e37652f79ee2cbed8e8ea9238c8fb9944e22b29"} Mar 18 12:30:41 crc kubenswrapper[4843]: I0318 12:30:41.700637 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:30:41 crc kubenswrapper[4843]: I0318 12:30:41.701761 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 12:30:41 crc kubenswrapper[4843]: I0318 12:30:41.706127 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-n8x24" Mar 18 12:30:41 crc kubenswrapper[4843]: I0318 12:30:41.715751 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:30:41 crc kubenswrapper[4843]: I0318 12:30:41.805666 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-526cc\" (UniqueName: \"kubernetes.io/projected/77409830-08b4-4a50-8f8e-0e0c3ad009b4-kube-api-access-526cc\") pod \"kube-state-metrics-0\" (UID: \"77409830-08b4-4a50-8f8e-0e0c3ad009b4\") " pod="openstack/kube-state-metrics-0" Mar 18 12:30:41 crc kubenswrapper[4843]: I0318 12:30:41.906846 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-526cc\" (UniqueName: \"kubernetes.io/projected/77409830-08b4-4a50-8f8e-0e0c3ad009b4-kube-api-access-526cc\") pod \"kube-state-metrics-0\" (UID: \"77409830-08b4-4a50-8f8e-0e0c3ad009b4\") " pod="openstack/kube-state-metrics-0" Mar 18 12:30:41 crc kubenswrapper[4843]: I0318 12:30:41.937074 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-526cc\" (UniqueName: \"kubernetes.io/projected/77409830-08b4-4a50-8f8e-0e0c3ad009b4-kube-api-access-526cc\") pod \"kube-state-metrics-0\" (UID: \"77409830-08b4-4a50-8f8e-0e0c3ad009b4\") " pod="openstack/kube-state-metrics-0" Mar 18 12:30:42 crc kubenswrapper[4843]: I0318 12:30:42.030881 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 12:30:44 crc kubenswrapper[4843]: I0318 12:30:44.956097 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5x5rj"] Mar 18 12:30:44 crc kubenswrapper[4843]: I0318 12:30:44.957855 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5x5rj" Mar 18 12:30:44 crc kubenswrapper[4843]: I0318 12:30:44.959728 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-5xprk" Mar 18 12:30:44 crc kubenswrapper[4843]: I0318 12:30:44.959922 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 18 12:30:44 crc kubenswrapper[4843]: I0318 12:30:44.959966 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 18 12:30:44 crc kubenswrapper[4843]: I0318 12:30:44.962028 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-s65qf"] Mar 18 12:30:44 crc kubenswrapper[4843]: I0318 12:30:44.963719 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-s65qf" Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:44.976778 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5x5rj"] Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:44.993876 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-s65qf"] Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.077632 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8d4e76a-b337-4cdf-a4fe-929389ba6e8c-scripts\") pod \"ovn-controller-ovs-s65qf\" (UID: \"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c\") " pod="openstack/ovn-controller-ovs-s65qf" Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.077709 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a8d4e76a-b337-4cdf-a4fe-929389ba6e8c-var-log\") pod \"ovn-controller-ovs-s65qf\" (UID: \"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c\") " pod="openstack/ovn-controller-ovs-s65qf" Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.077751 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-ovn-controller-tls-certs\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj" Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.077782 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzptm\" (UniqueName: \"kubernetes.io/projected/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-kube-api-access-nzptm\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj" Mar 18 12:30:45 crc 
kubenswrapper[4843]: I0318 12:30:45.077807 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-scripts\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj" Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.077833 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-var-run-ovn\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj" Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.077856 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a8d4e76a-b337-4cdf-a4fe-929389ba6e8c-etc-ovs\") pod \"ovn-controller-ovs-s65qf\" (UID: \"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c\") " pod="openstack/ovn-controller-ovs-s65qf" Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.077891 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a8d4e76a-b337-4cdf-a4fe-929389ba6e8c-var-run\") pod \"ovn-controller-ovs-s65qf\" (UID: \"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c\") " pod="openstack/ovn-controller-ovs-s65qf" Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.077923 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-var-run\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj" Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.077957 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-combined-ca-bundle\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.077999 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a8d4e76a-b337-4cdf-a4fe-929389ba6e8c-var-lib\") pod \"ovn-controller-ovs-s65qf\" (UID: \"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c\") " pod="openstack/ovn-controller-ovs-s65qf"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.078024 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lklv\" (UniqueName: \"kubernetes.io/projected/a8d4e76a-b337-4cdf-a4fe-929389ba6e8c-kube-api-access-6lklv\") pod \"ovn-controller-ovs-s65qf\" (UID: \"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c\") " pod="openstack/ovn-controller-ovs-s65qf"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.078067 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-var-log-ovn\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.179798 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-var-log-ovn\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.179871 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8d4e76a-b337-4cdf-a4fe-929389ba6e8c-scripts\") pod \"ovn-controller-ovs-s65qf\" (UID: \"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c\") " pod="openstack/ovn-controller-ovs-s65qf"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.179897 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a8d4e76a-b337-4cdf-a4fe-929389ba6e8c-var-log\") pod \"ovn-controller-ovs-s65qf\" (UID: \"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c\") " pod="openstack/ovn-controller-ovs-s65qf"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.179934 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-ovn-controller-tls-certs\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.179954 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzptm\" (UniqueName: \"kubernetes.io/projected/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-kube-api-access-nzptm\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.179970 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-scripts\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.179985 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-var-run-ovn\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.180005 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a8d4e76a-b337-4cdf-a4fe-929389ba6e8c-etc-ovs\") pod \"ovn-controller-ovs-s65qf\" (UID: \"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c\") " pod="openstack/ovn-controller-ovs-s65qf"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.180043 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a8d4e76a-b337-4cdf-a4fe-929389ba6e8c-var-run\") pod \"ovn-controller-ovs-s65qf\" (UID: \"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c\") " pod="openstack/ovn-controller-ovs-s65qf"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.180066 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-var-run\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.180088 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-combined-ca-bundle\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.180117 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a8d4e76a-b337-4cdf-a4fe-929389ba6e8c-var-lib\") pod \"ovn-controller-ovs-s65qf\" (UID: \"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c\") " pod="openstack/ovn-controller-ovs-s65qf"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.180155 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lklv\" (UniqueName: \"kubernetes.io/projected/a8d4e76a-b337-4cdf-a4fe-929389ba6e8c-kube-api-access-6lklv\") pod \"ovn-controller-ovs-s65qf\" (UID: \"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c\") " pod="openstack/ovn-controller-ovs-s65qf"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.181042 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a8d4e76a-b337-4cdf-a4fe-929389ba6e8c-var-run\") pod \"ovn-controller-ovs-s65qf\" (UID: \"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c\") " pod="openstack/ovn-controller-ovs-s65qf"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.181055 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-var-run\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.181096 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-var-run-ovn\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.181248 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a8d4e76a-b337-4cdf-a4fe-929389ba6e8c-etc-ovs\") pod \"ovn-controller-ovs-s65qf\" (UID: \"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c\") " pod="openstack/ovn-controller-ovs-s65qf"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.183003 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8d4e76a-b337-4cdf-a4fe-929389ba6e8c-scripts\") pod \"ovn-controller-ovs-s65qf\" (UID: \"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c\") " pod="openstack/ovn-controller-ovs-s65qf"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.185711 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-scripts\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.186599 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-ovn-controller-tls-certs\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.187431 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-combined-ca-bundle\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.193974 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a8d4e76a-b337-4cdf-a4fe-929389ba6e8c-var-lib\") pod \"ovn-controller-ovs-s65qf\" (UID: \"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c\") " pod="openstack/ovn-controller-ovs-s65qf"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.199879 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-var-log-ovn\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.199931 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a8d4e76a-b337-4cdf-a4fe-929389ba6e8c-var-log\") pod \"ovn-controller-ovs-s65qf\" (UID: \"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c\") " pod="openstack/ovn-controller-ovs-s65qf"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.200749 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzptm\" (UniqueName: \"kubernetes.io/projected/a9cc1cd2-018b-40fc-9434-97d649bdd2a8-kube-api-access-nzptm\") pod \"ovn-controller-5x5rj\" (UID: \"a9cc1cd2-018b-40fc-9434-97d649bdd2a8\") " pod="openstack/ovn-controller-5x5rj"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.210761 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lklv\" (UniqueName: \"kubernetes.io/projected/a8d4e76a-b337-4cdf-a4fe-929389ba6e8c-kube-api-access-6lklv\") pod \"ovn-controller-ovs-s65qf\" (UID: \"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c\") " pod="openstack/ovn-controller-ovs-s65qf"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.322765 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5x5rj"
Mar 18 12:30:45 crc kubenswrapper[4843]: I0318 12:30:45.341977 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-s65qf"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.254111 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.260369 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.263433 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.263626 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.264119 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.264559 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.265523 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-fvd6l"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.298906 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.390969 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.391090 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2f8130-0164-41ca-aa4a-5e206b21bef2-config\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.391108 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r6rv\" (UniqueName: \"kubernetes.io/projected/7e2f8130-0164-41ca-aa4a-5e206b21bef2-kube-api-access-8r6rv\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.391307 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e2f8130-0164-41ca-aa4a-5e206b21bef2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.391351 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e2f8130-0164-41ca-aa4a-5e206b21bef2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.391391 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e2f8130-0164-41ca-aa4a-5e206b21bef2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.391420 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e2f8130-0164-41ca-aa4a-5e206b21bef2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.391494 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e2f8130-0164-41ca-aa4a-5e206b21bef2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.493262 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2f8130-0164-41ca-aa4a-5e206b21bef2-config\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.493337 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r6rv\" (UniqueName: \"kubernetes.io/projected/7e2f8130-0164-41ca-aa4a-5e206b21bef2-kube-api-access-8r6rv\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.493404 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e2f8130-0164-41ca-aa4a-5e206b21bef2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.493428 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e2f8130-0164-41ca-aa4a-5e206b21bef2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.493475 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e2f8130-0164-41ca-aa4a-5e206b21bef2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.493496 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e2f8130-0164-41ca-aa4a-5e206b21bef2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.493521 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e2f8130-0164-41ca-aa4a-5e206b21bef2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.493585 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.493999 4843 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.494135 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e2f8130-0164-41ca-aa4a-5e206b21bef2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.494669 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e2f8130-0164-41ca-aa4a-5e206b21bef2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.496323 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2f8130-0164-41ca-aa4a-5e206b21bef2-config\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.500765 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e2f8130-0164-41ca-aa4a-5e206b21bef2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.510521 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e2f8130-0164-41ca-aa4a-5e206b21bef2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.518840 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r6rv\" (UniqueName: \"kubernetes.io/projected/7e2f8130-0164-41ca-aa4a-5e206b21bef2-kube-api-access-8r6rv\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.520383 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e2f8130-0164-41ca-aa4a-5e206b21bef2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.532748 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7e2f8130-0164-41ca-aa4a-5e206b21bef2\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:48 crc kubenswrapper[4843]: E0318 12:30:48.554667 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Mar 18 12:30:48 crc kubenswrapper[4843]: E0318 12:30:48.556089 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j2rvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-ndhdd_openstack(22a1a107-f4a2-4d98-9152-33fd271206bc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 18 12:30:48 crc kubenswrapper[4843]: E0318 12:30:48.557702 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-ndhdd" podUID="22a1a107-f4a2-4d98-9152-33fd271206bc"
Mar 18 12:30:48 crc kubenswrapper[4843]: E0318 12:30:48.608418 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Mar 18 12:30:48 crc kubenswrapper[4843]: E0318 12:30:48.608955 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hv6xr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-52dv9_openstack(414f50e8-c87d-4ef6-8cd2-40dcf12f5ade): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 18 12:30:48 crc kubenswrapper[4843]: E0318 12:30:48.610626 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-52dv9" podUID="414f50e8-c87d-4ef6-8cd2-40dcf12f5ade"
Mar 18 12:30:48 crc kubenswrapper[4843]: I0318 12:30:48.617975 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.058524 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.061134 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.064210 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.064770 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.065013 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-946d6"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.065737 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.067215 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.169669 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqngs\" (UniqueName: \"kubernetes.io/projected/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-kube-api-access-kqngs\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.169763 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.169794 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.169821 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.169863 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.169892 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.169919 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-config\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.169950 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.271877 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.271914 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.271935 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.271973 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.271993 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.272009 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-config\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.272031 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.272096 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqngs\" (UniqueName: \"kubernetes.io/projected/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-kube-api-access-kqngs\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.273634 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.273910 4843 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.274426 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.274488 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-config\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.281462 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.281494 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.284594 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.293307 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqngs\" (UniqueName: \"kubernetes.io/projected/2e6bed4c-30f0-4088-9ac8-2a818cd781d6-kube-api-access-kqngs\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.299014 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 
12:30:49.315286 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2e6bed4c-30f0-4088-9ac8-2a818cd781d6\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.430487 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.516766 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.529454 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.530522 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ndhdd" Mar 18 12:30:49 crc kubenswrapper[4843]: W0318 12:30:49.539311 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77409830_08b4_4a50_8f8e_0e0c3ad009b4.slice/crio-918700776d2f73584e77237f38e2c06bde7de969d96c4609594a06efde2023cc WatchSource:0}: Error finding container 918700776d2f73584e77237f38e2c06bde7de969d96c4609594a06efde2023cc: Status 404 returned error can't find the container with id 918700776d2f73584e77237f38e2c06bde7de969d96c4609594a06efde2023cc Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.562589 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-52dv9" Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.745451 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2rvt\" (UniqueName: \"kubernetes.io/projected/22a1a107-f4a2-4d98-9152-33fd271206bc-kube-api-access-j2rvt\") pod \"22a1a107-f4a2-4d98-9152-33fd271206bc\" (UID: \"22a1a107-f4a2-4d98-9152-33fd271206bc\") " Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.745502 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a1a107-f4a2-4d98-9152-33fd271206bc-config\") pod \"22a1a107-f4a2-4d98-9152-33fd271206bc\" (UID: \"22a1a107-f4a2-4d98-9152-33fd271206bc\") " Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.745533 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/414f50e8-c87d-4ef6-8cd2-40dcf12f5ade-dns-svc\") pod \"414f50e8-c87d-4ef6-8cd2-40dcf12f5ade\" (UID: \"414f50e8-c87d-4ef6-8cd2-40dcf12f5ade\") " Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.745621 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv6xr\" (UniqueName: \"kubernetes.io/projected/414f50e8-c87d-4ef6-8cd2-40dcf12f5ade-kube-api-access-hv6xr\") pod \"414f50e8-c87d-4ef6-8cd2-40dcf12f5ade\" (UID: \"414f50e8-c87d-4ef6-8cd2-40dcf12f5ade\") " Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.745669 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/414f50e8-c87d-4ef6-8cd2-40dcf12f5ade-config\") pod \"414f50e8-c87d-4ef6-8cd2-40dcf12f5ade\" (UID: \"414f50e8-c87d-4ef6-8cd2-40dcf12f5ade\") " Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.746643 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/414f50e8-c87d-4ef6-8cd2-40dcf12f5ade-config" (OuterVolumeSpecName: "config") pod "414f50e8-c87d-4ef6-8cd2-40dcf12f5ade" (UID: "414f50e8-c87d-4ef6-8cd2-40dcf12f5ade"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.747796 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/414f50e8-c87d-4ef6-8cd2-40dcf12f5ade-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "414f50e8-c87d-4ef6-8cd2-40dcf12f5ade" (UID: "414f50e8-c87d-4ef6-8cd2-40dcf12f5ade"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.747829 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a1a107-f4a2-4d98-9152-33fd271206bc-config" (OuterVolumeSpecName: "config") pod "22a1a107-f4a2-4d98-9152-33fd271206bc" (UID: "22a1a107-f4a2-4d98-9152-33fd271206bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.751480 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/414f50e8-c87d-4ef6-8cd2-40dcf12f5ade-kube-api-access-hv6xr" (OuterVolumeSpecName: "kube-api-access-hv6xr") pod "414f50e8-c87d-4ef6-8cd2-40dcf12f5ade" (UID: "414f50e8-c87d-4ef6-8cd2-40dcf12f5ade"). InnerVolumeSpecName "kube-api-access-hv6xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.752971 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22a1a107-f4a2-4d98-9152-33fd271206bc-kube-api-access-j2rvt" (OuterVolumeSpecName: "kube-api-access-j2rvt") pod "22a1a107-f4a2-4d98-9152-33fd271206bc" (UID: "22a1a107-f4a2-4d98-9152-33fd271206bc"). InnerVolumeSpecName "kube-api-access-j2rvt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.754445 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-52dv9" event={"ID":"414f50e8-c87d-4ef6-8cd2-40dcf12f5ade","Type":"ContainerDied","Data":"d73b8b829b45ae69bd69676148279a9b98e575cc1a085003382b5ed7464bd65b"} Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.754532 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-52dv9" Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.767434 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-ndhdd" event={"ID":"22a1a107-f4a2-4d98-9152-33fd271206bc","Type":"ContainerDied","Data":"2313f7aaa9f79f1ea627a9b4b50bef5a1cc8707402a5af583bdcd3659e0216bd"} Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.767524 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ndhdd" Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.777855 4843 generic.go:334] "Generic (PLEG): container finished" podID="fcc3bbed-56b9-41a8-baaa-b79ff13848ed" containerID="46108ab64a7614990b2dcb75ec9594dae4f6920b37144f55812fd6d05452fa09" exitCode=0 Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.777999 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" event={"ID":"fcc3bbed-56b9-41a8-baaa-b79ff13848ed","Type":"ContainerDied","Data":"46108ab64a7614990b2dcb75ec9594dae4f6920b37144f55812fd6d05452fa09"} Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.781633 4843 generic.go:334] "Generic (PLEG): container finished" podID="91d7bffb-7c24-4a70-a412-258080407683" containerID="37d5b9b85eac1267ae67eea2778b78b9af5de3c132a92be41b1de304c90c36e9" exitCode=0 Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.781840 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" event={"ID":"91d7bffb-7c24-4a70-a412-258080407683","Type":"ContainerDied","Data":"37d5b9b85eac1267ae67eea2778b78b9af5de3c132a92be41b1de304c90c36e9"} Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.783411 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"257240a5-cc42-4354-9079-66e6de070b34","Type":"ContainerStarted","Data":"f11026eeedc534f2918535264f5f5070a00c005eb929a7b4be8e57789d2283f3"} Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.784787 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"77409830-08b4-4a50-8f8e-0e0c3ad009b4","Type":"ContainerStarted","Data":"918700776d2f73584e77237f38e2c06bde7de969d96c4609594a06efde2023cc"} Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.788822 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1c41f082-cf59-42b4-8314-64aace288dd1","Type":"ContainerStarted","Data":"96f25e3bc4bf12b1cfb6fdf64233518801e344d3e8eca473ac443d6e8d832d6f"} Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.829921 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-s65qf"] Mar 18 12:30:49 crc kubenswrapper[4843]: W0318 12:30:49.834908 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8d4e76a_b337_4cdf_a4fe_929389ba6e8c.slice/crio-3096e0acacc02adf4947da47455eb7ea76bfb859f9074ddcb4f1caa91c3a54ab WatchSource:0}: Error finding container 3096e0acacc02adf4947da47455eb7ea76bfb859f9074ddcb4f1caa91c3a54ab: Status 404 returned error can't find the container with id 3096e0acacc02adf4947da47455eb7ea76bfb859f9074ddcb4f1caa91c3a54ab Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.848056 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/414f50e8-c87d-4ef6-8cd2-40dcf12f5ade-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.848084 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2rvt\" (UniqueName: \"kubernetes.io/projected/22a1a107-f4a2-4d98-9152-33fd271206bc-kube-api-access-j2rvt\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.848095 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a1a107-f4a2-4d98-9152-33fd271206bc-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.848104 4843 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/414f50e8-c87d-4ef6-8cd2-40dcf12f5ade-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.848131 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv6xr\" (UniqueName: \"kubernetes.io/projected/414f50e8-c87d-4ef6-8cd2-40dcf12f5ade-kube-api-access-hv6xr\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.851118 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-52dv9"] Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.866750 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-52dv9"] Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.895120 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.913698 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5x5rj"] Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.924335 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ndhdd"] Mar 18 12:30:49 crc 
kubenswrapper[4843]: I0318 12:30:49.929098 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ndhdd"] Mar 18 12:30:49 crc kubenswrapper[4843]: I0318 12:30:49.980312 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 12:30:50 crc kubenswrapper[4843]: I0318 12:30:50.000299 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 12:30:50 crc kubenswrapper[4843]: W0318 12:30:50.007870 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c0d669e_b241_4702_96c3_2de893c52987.slice/crio-1a256458a35ef3e2f04a54a7f2b3ef2b0116cd7d10de687f20e3ed70b38f3e50 WatchSource:0}: Error finding container 1a256458a35ef3e2f04a54a7f2b3ef2b0116cd7d10de687f20e3ed70b38f3e50: Status 404 returned error can't find the container with id 1a256458a35ef3e2f04a54a7f2b3ef2b0116cd7d10de687f20e3ed70b38f3e50 Mar 18 12:30:50 crc kubenswrapper[4843]: W0318 12:30:50.009822 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e2f8130_0164_41ca_aa4a_5e206b21bef2.slice/crio-5243039a7e9397dfc58c83bbcbb1bc500792a2ca28c0532e8d03db10d30cd50e WatchSource:0}: Error finding container 5243039a7e9397dfc58c83bbcbb1bc500792a2ca28c0532e8d03db10d30cd50e: Status 404 returned error can't find the container with id 5243039a7e9397dfc58c83bbcbb1bc500792a2ca28c0532e8d03db10d30cd50e Mar 18 12:30:50 crc kubenswrapper[4843]: I0318 12:30:50.022389 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 12:30:50 crc kubenswrapper[4843]: I0318 12:30:50.034554 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Mar 18 12:30:50 crc kubenswrapper[4843]: I0318 12:30:50.034608 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:30:50 crc kubenswrapper[4843]: I0318 12:30:50.034648 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:30:50 crc kubenswrapper[4843]: I0318 12:30:50.035473 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7fcc44fd473fc2d97d7be9aa8e61f5a92c58d2a0df082678596236e3adb17e3e"} pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:30:50 crc kubenswrapper[4843]: I0318 12:30:50.035537 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" containerID="cri-o://7fcc44fd473fc2d97d7be9aa8e61f5a92c58d2a0df082678596236e3adb17e3e" gracePeriod=600 Mar 18 12:30:50 crc kubenswrapper[4843]: I0318 12:30:50.107481 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 12:30:50 crc kubenswrapper[4843]: W0318 12:30:50.112886 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e6bed4c_30f0_4088_9ac8_2a818cd781d6.slice/crio-db5df509c20f1a307f069e472e860cde04d8ac92b801a26a2c9cc3f8bc1788d5 WatchSource:0}: Error finding container 
db5df509c20f1a307f069e472e860cde04d8ac92b801a26a2c9cc3f8bc1788d5: Status 404 returned error can't find the container with id db5df509c20f1a307f069e472e860cde04d8ac92b801a26a2c9cc3f8bc1788d5 Mar 18 12:30:50 crc kubenswrapper[4843]: I0318 12:30:50.797916 4843 generic.go:334] "Generic (PLEG): container finished" podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="7fcc44fd473fc2d97d7be9aa8e61f5a92c58d2a0df082678596236e3adb17e3e" exitCode=0 Mar 18 12:30:50 crc kubenswrapper[4843]: I0318 12:30:50.799196 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"7fcc44fd473fc2d97d7be9aa8e61f5a92c58d2a0df082678596236e3adb17e3e"} Mar 18 12:30:50 crc kubenswrapper[4843]: I0318 12:30:50.799286 4843 scope.go:117] "RemoveContainer" containerID="156751a099ebefa58e45dd19fa380fffeac977e92f6bd61d7c8b0b1be68aae80" Mar 18 12:30:50 crc kubenswrapper[4843]: I0318 12:30:50.802644 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5x5rj" event={"ID":"a9cc1cd2-018b-40fc-9434-97d649bdd2a8","Type":"ContainerStarted","Data":"21b5b8445431d0eb89f11422337acb75e3655974434a75d6e2f7076f48004f26"} Mar 18 12:30:50 crc kubenswrapper[4843]: I0318 12:30:50.804776 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520","Type":"ContainerStarted","Data":"ca2984f00c39c2ee837f984497954c52f65713f4a19e16f6d6bf6c8a8062020c"} Mar 18 12:30:50 crc kubenswrapper[4843]: I0318 12:30:50.805900 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a384e3a7-e6ad-4832-8218-ba3f11df2c2f","Type":"ContainerStarted","Data":"b94708a9ba2901b60cd765c754e7b874a3130f1212bd20f547319e29fe5e682d"} Mar 18 12:30:50 crc kubenswrapper[4843]: I0318 12:30:50.806726 4843 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7e2f8130-0164-41ca-aa4a-5e206b21bef2","Type":"ContainerStarted","Data":"5243039a7e9397dfc58c83bbcbb1bc500792a2ca28c0532e8d03db10d30cd50e"} Mar 18 12:30:50 crc kubenswrapper[4843]: I0318 12:30:50.807592 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2e6bed4c-30f0-4088-9ac8-2a818cd781d6","Type":"ContainerStarted","Data":"db5df509c20f1a307f069e472e860cde04d8ac92b801a26a2c9cc3f8bc1788d5"} Mar 18 12:30:50 crc kubenswrapper[4843]: I0318 12:30:50.809292 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4c0d669e-b241-4702-96c3-2de893c52987","Type":"ContainerStarted","Data":"1a256458a35ef3e2f04a54a7f2b3ef2b0116cd7d10de687f20e3ed70b38f3e50"} Mar 18 12:30:50 crc kubenswrapper[4843]: I0318 12:30:50.810466 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-s65qf" event={"ID":"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c","Type":"ContainerStarted","Data":"3096e0acacc02adf4947da47455eb7ea76bfb859f9074ddcb4f1caa91c3a54ab"} Mar 18 12:30:51 crc kubenswrapper[4843]: I0318 12:30:51.002019 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22a1a107-f4a2-4d98-9152-33fd271206bc" path="/var/lib/kubelet/pods/22a1a107-f4a2-4d98-9152-33fd271206bc/volumes" Mar 18 12:30:51 crc kubenswrapper[4843]: I0318 12:30:51.002902 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="414f50e8-c87d-4ef6-8cd2-40dcf12f5ade" path="/var/lib/kubelet/pods/414f50e8-c87d-4ef6-8cd2-40dcf12f5ade/volumes" Mar 18 12:30:51 crc kubenswrapper[4843]: I0318 12:30:51.835669 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" event={"ID":"fcc3bbed-56b9-41a8-baaa-b79ff13848ed","Type":"ContainerStarted","Data":"146fcf8076b78b99cba3b54bfd1a771006c1e08bf7879581193732646ac387df"} Mar 18 12:30:51 crc kubenswrapper[4843]: I0318 12:30:51.836030 4843 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" Mar 18 12:30:51 crc kubenswrapper[4843]: I0318 12:30:51.840435 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"6c1df8135718fa79548f8df435226cda37a6730cb41d0cf14ca133c83dba65e7"} Mar 18 12:30:51 crc kubenswrapper[4843]: I0318 12:30:51.843101 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" event={"ID":"91d7bffb-7c24-4a70-a412-258080407683","Type":"ContainerStarted","Data":"87509fb4c14fd3cb642c566b345ac28b606679367470450070b16ae58ef7e137"} Mar 18 12:30:51 crc kubenswrapper[4843]: I0318 12:30:51.843893 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" Mar 18 12:30:51 crc kubenswrapper[4843]: I0318 12:30:51.858409 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" podStartSLOduration=7.048724194 podStartE2EDuration="16.858390739s" podCreationTimestamp="2026-03-18 12:30:35 +0000 UTC" firstStartedPulling="2026-03-18 12:30:38.829164417 +0000 UTC m=+1272.544989941" lastFinishedPulling="2026-03-18 12:30:48.638830962 +0000 UTC m=+1282.354656486" observedRunningTime="2026-03-18 12:30:51.85493613 +0000 UTC m=+1285.570761654" watchObservedRunningTime="2026-03-18 12:30:51.858390739 +0000 UTC m=+1285.574216263" Mar 18 12:30:51 crc kubenswrapper[4843]: I0318 12:30:51.902381 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" podStartSLOduration=7.717446421 podStartE2EDuration="16.902366889s" podCreationTimestamp="2026-03-18 12:30:35 +0000 UTC" firstStartedPulling="2026-03-18 12:30:39.693873698 +0000 UTC m=+1273.409699222" lastFinishedPulling="2026-03-18 12:30:48.878794166 
+0000 UTC m=+1282.594619690" observedRunningTime="2026-03-18 12:30:51.894625669 +0000 UTC m=+1285.610451203" watchObservedRunningTime="2026-03-18 12:30:51.902366889 +0000 UTC m=+1285.618192413" Mar 18 12:30:57 crc kubenswrapper[4843]: I0318 12:30:57.028255 4843 scope.go:117] "RemoveContainer" containerID="89710d877cb10056d8c3b344b39be2354aff6222e9a15e23590a0650948a2b32" Mar 18 12:31:00 crc kubenswrapper[4843]: I0318 12:31:00.532055 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" Mar 18 12:31:00 crc kubenswrapper[4843]: I0318 12:31:00.837799 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" Mar 18 12:31:00 crc kubenswrapper[4843]: I0318 12:31:00.891771 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-v8m9t"] Mar 18 12:31:00 crc kubenswrapper[4843]: I0318 12:31:00.970429 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7e2f8130-0164-41ca-aa4a-5e206b21bef2","Type":"ContainerStarted","Data":"f86a7fe92bb2b7ea24d1c7c5d8dff2cd85de565724a83bd723c0bc5f4971b41d"} Mar 18 12:31:00 crc kubenswrapper[4843]: I0318 12:31:00.972125 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2e6bed4c-30f0-4088-9ac8-2a818cd781d6","Type":"ContainerStarted","Data":"074fc52de3f4eff0440273234c2b68bcf668282fe12c75753ccf97d901ccbae8"} Mar 18 12:31:00 crc kubenswrapper[4843]: I0318 12:31:00.974363 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4c0d669e-b241-4702-96c3-2de893c52987","Type":"ContainerStarted","Data":"54953787baafa747361abc4ebf4a4f6678974ae16c7641083b1a59134e58ea70"} Mar 18 12:31:00 crc kubenswrapper[4843]: I0318 12:31:00.974542 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 18 12:31:00 crc 
kubenswrapper[4843]: I0318 12:31:00.977370 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-s65qf" event={"ID":"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c","Type":"ContainerStarted","Data":"b52b7291fafe3a7d7e13f6d211b8f99c915bb03806318cded13cadd247826f9b"} Mar 18 12:31:00 crc kubenswrapper[4843]: I0318 12:31:00.980166 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5x5rj" event={"ID":"a9cc1cd2-018b-40fc-9434-97d649bdd2a8","Type":"ContainerStarted","Data":"6cfa5b68a0f0158caebfff52a8be6a93ff98c863e77f265bc661214a68b7a270"} Mar 18 12:31:00 crc kubenswrapper[4843]: I0318 12:31:00.980337 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-5x5rj" Mar 18 12:31:00 crc kubenswrapper[4843]: I0318 12:31:00.981867 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520","Type":"ContainerStarted","Data":"bac5e873e5df5137b8d148b4438f2f902786b285f5c574c36f76e581c98e81c1"} Mar 18 12:31:00 crc kubenswrapper[4843]: I0318 12:31:00.996739 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.838481532 podStartE2EDuration="21.996719091s" podCreationTimestamp="2026-03-18 12:30:39 +0000 UTC" firstStartedPulling="2026-03-18 12:30:50.012556377 +0000 UTC m=+1283.728381901" lastFinishedPulling="2026-03-18 12:30:59.170793936 +0000 UTC m=+1292.886619460" observedRunningTime="2026-03-18 12:31:00.990341999 +0000 UTC m=+1294.706167533" watchObservedRunningTime="2026-03-18 12:31:00.996719091 +0000 UTC m=+1294.712544615" Mar 18 12:31:00 crc kubenswrapper[4843]: I0318 12:31:00.998291 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a384e3a7-e6ad-4832-8218-ba3f11df2c2f","Type":"ContainerStarted","Data":"c706906e20b732a7e189490de165902874c1f5f9540a894c382988792ec29781"} Mar 
18 12:31:01 crc kubenswrapper[4843]: I0318 12:31:01.005509 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"77409830-08b4-4a50-8f8e-0e0c3ad009b4","Type":"ContainerStarted","Data":"d4ee15718d98e85f33e33c745b4353fb34c13c57f9f8badd3ea05863c06d680a"} Mar 18 12:31:01 crc kubenswrapper[4843]: I0318 12:31:01.005624 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" podUID="fcc3bbed-56b9-41a8-baaa-b79ff13848ed" containerName="dnsmasq-dns" containerID="cri-o://146fcf8076b78b99cba3b54bfd1a771006c1e08bf7879581193732646ac387df" gracePeriod=10 Mar 18 12:31:01 crc kubenswrapper[4843]: I0318 12:31:01.015441 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-5x5rj" podStartSLOduration=7.774193464 podStartE2EDuration="17.015426413s" podCreationTimestamp="2026-03-18 12:30:44 +0000 UTC" firstStartedPulling="2026-03-18 12:30:49.91315277 +0000 UTC m=+1283.628978294" lastFinishedPulling="2026-03-18 12:30:59.154385709 +0000 UTC m=+1292.870211243" observedRunningTime="2026-03-18 12:31:01.015353961 +0000 UTC m=+1294.731179485" watchObservedRunningTime="2026-03-18 12:31:01.015426413 +0000 UTC m=+1294.731251937" Mar 18 12:31:01 crc kubenswrapper[4843]: I0318 12:31:01.100285 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=9.658413263 podStartE2EDuration="20.100265735s" podCreationTimestamp="2026-03-18 12:30:41 +0000 UTC" firstStartedPulling="2026-03-18 12:30:49.541184532 +0000 UTC m=+1283.257010046" lastFinishedPulling="2026-03-18 12:30:59.983036994 +0000 UTC m=+1293.698862518" observedRunningTime="2026-03-18 12:31:01.097123566 +0000 UTC m=+1294.812949090" watchObservedRunningTime="2026-03-18 12:31:01.100265735 +0000 UTC m=+1294.816091259" Mar 18 12:31:01 crc kubenswrapper[4843]: I0318 12:31:01.640102 4843 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" Mar 18 12:31:01 crc kubenswrapper[4843]: I0318 12:31:01.785448 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcc3bbed-56b9-41a8-baaa-b79ff13848ed-config\") pod \"fcc3bbed-56b9-41a8-baaa-b79ff13848ed\" (UID: \"fcc3bbed-56b9-41a8-baaa-b79ff13848ed\") " Mar 18 12:31:01 crc kubenswrapper[4843]: I0318 12:31:01.785558 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lngcf\" (UniqueName: \"kubernetes.io/projected/fcc3bbed-56b9-41a8-baaa-b79ff13848ed-kube-api-access-lngcf\") pod \"fcc3bbed-56b9-41a8-baaa-b79ff13848ed\" (UID: \"fcc3bbed-56b9-41a8-baaa-b79ff13848ed\") " Mar 18 12:31:01 crc kubenswrapper[4843]: I0318 12:31:01.785640 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcc3bbed-56b9-41a8-baaa-b79ff13848ed-dns-svc\") pod \"fcc3bbed-56b9-41a8-baaa-b79ff13848ed\" (UID: \"fcc3bbed-56b9-41a8-baaa-b79ff13848ed\") " Mar 18 12:31:01 crc kubenswrapper[4843]: I0318 12:31:01.794149 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcc3bbed-56b9-41a8-baaa-b79ff13848ed-kube-api-access-lngcf" (OuterVolumeSpecName: "kube-api-access-lngcf") pod "fcc3bbed-56b9-41a8-baaa-b79ff13848ed" (UID: "fcc3bbed-56b9-41a8-baaa-b79ff13848ed"). InnerVolumeSpecName "kube-api-access-lngcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:01 crc kubenswrapper[4843]: I0318 12:31:01.823263 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcc3bbed-56b9-41a8-baaa-b79ff13848ed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fcc3bbed-56b9-41a8-baaa-b79ff13848ed" (UID: "fcc3bbed-56b9-41a8-baaa-b79ff13848ed"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:01 crc kubenswrapper[4843]: I0318 12:31:01.831687 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcc3bbed-56b9-41a8-baaa-b79ff13848ed-config" (OuterVolumeSpecName: "config") pod "fcc3bbed-56b9-41a8-baaa-b79ff13848ed" (UID: "fcc3bbed-56b9-41a8-baaa-b79ff13848ed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:01 crc kubenswrapper[4843]: I0318 12:31:01.887719 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lngcf\" (UniqueName: \"kubernetes.io/projected/fcc3bbed-56b9-41a8-baaa-b79ff13848ed-kube-api-access-lngcf\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:01 crc kubenswrapper[4843]: I0318 12:31:01.887753 4843 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcc3bbed-56b9-41a8-baaa-b79ff13848ed-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:01 crc kubenswrapper[4843]: I0318 12:31:01.887770 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcc3bbed-56b9-41a8-baaa-b79ff13848ed-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:02 crc kubenswrapper[4843]: I0318 12:31:02.013864 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1c41f082-cf59-42b4-8314-64aace288dd1","Type":"ContainerStarted","Data":"b2949077702fceb64cea3279ee20a1822ebb720ae15697187e10f706bad4d9b4"} Mar 18 12:31:02 crc kubenswrapper[4843]: I0318 12:31:02.015903 4843 generic.go:334] "Generic (PLEG): container finished" podID="fcc3bbed-56b9-41a8-baaa-b79ff13848ed" containerID="146fcf8076b78b99cba3b54bfd1a771006c1e08bf7879581193732646ac387df" exitCode=0 Mar 18 12:31:02 crc kubenswrapper[4843]: I0318 12:31:02.015959 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" Mar 18 12:31:02 crc kubenswrapper[4843]: I0318 12:31:02.015983 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" event={"ID":"fcc3bbed-56b9-41a8-baaa-b79ff13848ed","Type":"ContainerDied","Data":"146fcf8076b78b99cba3b54bfd1a771006c1e08bf7879581193732646ac387df"} Mar 18 12:31:02 crc kubenswrapper[4843]: I0318 12:31:02.016024 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-v8m9t" event={"ID":"fcc3bbed-56b9-41a8-baaa-b79ff13848ed","Type":"ContainerDied","Data":"9faa216b243dcd95eac07863d4d200472992f5a609e606e4575b15f6d77f3233"} Mar 18 12:31:02 crc kubenswrapper[4843]: I0318 12:31:02.016042 4843 scope.go:117] "RemoveContainer" containerID="146fcf8076b78b99cba3b54bfd1a771006c1e08bf7879581193732646ac387df" Mar 18 12:31:02 crc kubenswrapper[4843]: I0318 12:31:02.019429 4843 generic.go:334] "Generic (PLEG): container finished" podID="a8d4e76a-b337-4cdf-a4fe-929389ba6e8c" containerID="b52b7291fafe3a7d7e13f6d211b8f99c915bb03806318cded13cadd247826f9b" exitCode=0 Mar 18 12:31:02 crc kubenswrapper[4843]: I0318 12:31:02.019497 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-s65qf" event={"ID":"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c","Type":"ContainerDied","Data":"b52b7291fafe3a7d7e13f6d211b8f99c915bb03806318cded13cadd247826f9b"} Mar 18 12:31:02 crc kubenswrapper[4843]: I0318 12:31:02.023356 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"257240a5-cc42-4354-9079-66e6de070b34","Type":"ContainerStarted","Data":"9e69e813190bc3b6041d2a54beae48ef9ecf7969785d0b66c5880f22bcd65291"} Mar 18 12:31:02 crc kubenswrapper[4843]: I0318 12:31:02.024027 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 12:31:02 crc kubenswrapper[4843]: I0318 12:31:02.049438 4843 scope.go:117] 
"RemoveContainer" containerID="46108ab64a7614990b2dcb75ec9594dae4f6920b37144f55812fd6d05452fa09" Mar 18 12:31:02 crc kubenswrapper[4843]: I0318 12:31:02.095141 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-v8m9t"] Mar 18 12:31:02 crc kubenswrapper[4843]: I0318 12:31:02.102092 4843 scope.go:117] "RemoveContainer" containerID="146fcf8076b78b99cba3b54bfd1a771006c1e08bf7879581193732646ac387df" Mar 18 12:31:02 crc kubenswrapper[4843]: I0318 12:31:02.102394 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-v8m9t"] Mar 18 12:31:02 crc kubenswrapper[4843]: E0318 12:31:02.103116 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"146fcf8076b78b99cba3b54bfd1a771006c1e08bf7879581193732646ac387df\": container with ID starting with 146fcf8076b78b99cba3b54bfd1a771006c1e08bf7879581193732646ac387df not found: ID does not exist" containerID="146fcf8076b78b99cba3b54bfd1a771006c1e08bf7879581193732646ac387df" Mar 18 12:31:02 crc kubenswrapper[4843]: I0318 12:31:02.103158 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"146fcf8076b78b99cba3b54bfd1a771006c1e08bf7879581193732646ac387df"} err="failed to get container status \"146fcf8076b78b99cba3b54bfd1a771006c1e08bf7879581193732646ac387df\": rpc error: code = NotFound desc = could not find container \"146fcf8076b78b99cba3b54bfd1a771006c1e08bf7879581193732646ac387df\": container with ID starting with 146fcf8076b78b99cba3b54bfd1a771006c1e08bf7879581193732646ac387df not found: ID does not exist" Mar 18 12:31:02 crc kubenswrapper[4843]: I0318 12:31:02.103185 4843 scope.go:117] "RemoveContainer" containerID="46108ab64a7614990b2dcb75ec9594dae4f6920b37144f55812fd6d05452fa09" Mar 18 12:31:02 crc kubenswrapper[4843]: E0318 12:31:02.104875 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"46108ab64a7614990b2dcb75ec9594dae4f6920b37144f55812fd6d05452fa09\": container with ID starting with 46108ab64a7614990b2dcb75ec9594dae4f6920b37144f55812fd6d05452fa09 not found: ID does not exist" containerID="46108ab64a7614990b2dcb75ec9594dae4f6920b37144f55812fd6d05452fa09" Mar 18 12:31:02 crc kubenswrapper[4843]: I0318 12:31:02.104911 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46108ab64a7614990b2dcb75ec9594dae4f6920b37144f55812fd6d05452fa09"} err="failed to get container status \"46108ab64a7614990b2dcb75ec9594dae4f6920b37144f55812fd6d05452fa09\": rpc error: code = NotFound desc = could not find container \"46108ab64a7614990b2dcb75ec9594dae4f6920b37144f55812fd6d05452fa09\": container with ID starting with 46108ab64a7614990b2dcb75ec9594dae4f6920b37144f55812fd6d05452fa09 not found: ID does not exist" Mar 18 12:31:02 crc kubenswrapper[4843]: I0318 12:31:02.994509 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcc3bbed-56b9-41a8-baaa-b79ff13848ed" path="/var/lib/kubelet/pods/fcc3bbed-56b9-41a8-baaa-b79ff13848ed/volumes" Mar 18 12:31:03 crc kubenswrapper[4843]: I0318 12:31:03.035865 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-s65qf" event={"ID":"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c","Type":"ContainerStarted","Data":"ee9853a0475adfafa4392ce0df1941417a9b0a42ac6f3568619f4e0594c53103"} Mar 18 12:31:03 crc kubenswrapper[4843]: I0318 12:31:03.035905 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-s65qf" event={"ID":"a8d4e76a-b337-4cdf-a4fe-929389ba6e8c","Type":"ContainerStarted","Data":"2e84b84f9347a6f461f2377c276b922361ec8b5031ac084b6ddef1dfa5d9194d"} Mar 18 12:31:03 crc kubenswrapper[4843]: I0318 12:31:03.036535 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-s65qf" Mar 18 12:31:03 crc kubenswrapper[4843]: I0318 
12:31:03.036600 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-s65qf" Mar 18 12:31:03 crc kubenswrapper[4843]: I0318 12:31:03.059978 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-s65qf" podStartSLOduration=9.746516603 podStartE2EDuration="19.059956455s" podCreationTimestamp="2026-03-18 12:30:44 +0000 UTC" firstStartedPulling="2026-03-18 12:30:49.839468015 +0000 UTC m=+1283.555293539" lastFinishedPulling="2026-03-18 12:30:59.152907847 +0000 UTC m=+1292.868733391" observedRunningTime="2026-03-18 12:31:03.058414621 +0000 UTC m=+1296.774240145" watchObservedRunningTime="2026-03-18 12:31:03.059956455 +0000 UTC m=+1296.775781999" Mar 18 12:31:04 crc kubenswrapper[4843]: I0318 12:31:04.045763 4843 generic.go:334] "Generic (PLEG): container finished" podID="9f1dd599-3b73-4b6c-8f80-0fbb1ce13520" containerID="bac5e873e5df5137b8d148b4438f2f902786b285f5c574c36f76e581c98e81c1" exitCode=0 Mar 18 12:31:04 crc kubenswrapper[4843]: I0318 12:31:04.045879 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520","Type":"ContainerDied","Data":"bac5e873e5df5137b8d148b4438f2f902786b285f5c574c36f76e581c98e81c1"} Mar 18 12:31:04 crc kubenswrapper[4843]: I0318 12:31:04.047363 4843 generic.go:334] "Generic (PLEG): container finished" podID="a384e3a7-e6ad-4832-8218-ba3f11df2c2f" containerID="c706906e20b732a7e189490de165902874c1f5f9540a894c382988792ec29781" exitCode=0 Mar 18 12:31:04 crc kubenswrapper[4843]: I0318 12:31:04.047459 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a384e3a7-e6ad-4832-8218-ba3f11df2c2f","Type":"ContainerDied","Data":"c706906e20b732a7e189490de165902874c1f5f9540a894c382988792ec29781"} Mar 18 12:31:05 crc kubenswrapper[4843]: I0318 12:31:05.065100 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"9f1dd599-3b73-4b6c-8f80-0fbb1ce13520","Type":"ContainerStarted","Data":"7eb9649f8bdc69991f57161fe5e44894caad4d49d965ec5f7e4fa406e658559a"} Mar 18 12:31:05 crc kubenswrapper[4843]: I0318 12:31:05.068617 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a384e3a7-e6ad-4832-8218-ba3f11df2c2f","Type":"ContainerStarted","Data":"f5feab568dc3fe2e99a58306952b3d2391519bc7fefbf2dd27a97abc02b808ee"} Mar 18 12:31:05 crc kubenswrapper[4843]: I0318 12:31:05.072156 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7e2f8130-0164-41ca-aa4a-5e206b21bef2","Type":"ContainerStarted","Data":"548149b164ad3126d33ea25da7f0655e881c812afac84b1098d2389fde5ba7e7"} Mar 18 12:31:05 crc kubenswrapper[4843]: I0318 12:31:05.075543 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2e6bed4c-30f0-4088-9ac8-2a818cd781d6","Type":"ContainerStarted","Data":"aadfe9405520fe17aeda71b42bcb90b367941484be4860ee453f55dc5af201b7"} Mar 18 12:31:05 crc kubenswrapper[4843]: I0318 12:31:05.105982 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=17.8345283 podStartE2EDuration="27.105959758s" podCreationTimestamp="2026-03-18 12:30:38 +0000 UTC" firstStartedPulling="2026-03-18 12:30:50.023367574 +0000 UTC m=+1283.739193098" lastFinishedPulling="2026-03-18 12:30:59.294799032 +0000 UTC m=+1293.010624556" observedRunningTime="2026-03-18 12:31:05.098873856 +0000 UTC m=+1298.814699480" watchObservedRunningTime="2026-03-18 12:31:05.105959758 +0000 UTC m=+1298.821785322" Mar 18 12:31:05 crc kubenswrapper[4843]: I0318 12:31:05.140125 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=19.905284993 podStartE2EDuration="29.140106829s" podCreationTimestamp="2026-03-18 
12:30:36 +0000 UTC" firstStartedPulling="2026-03-18 12:30:49.917995488 +0000 UTC m=+1283.633821012" lastFinishedPulling="2026-03-18 12:30:59.152817324 +0000 UTC m=+1292.868642848" observedRunningTime="2026-03-18 12:31:05.129897389 +0000 UTC m=+1298.845722923" watchObservedRunningTime="2026-03-18 12:31:05.140106829 +0000 UTC m=+1298.855932363" Mar 18 12:31:05 crc kubenswrapper[4843]: I0318 12:31:05.190531 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.151247469 podStartE2EDuration="17.190507952s" podCreationTimestamp="2026-03-18 12:30:48 +0000 UTC" firstStartedPulling="2026-03-18 12:30:50.115198336 +0000 UTC m=+1283.831023860" lastFinishedPulling="2026-03-18 12:31:04.154458819 +0000 UTC m=+1297.870284343" observedRunningTime="2026-03-18 12:31:05.186670683 +0000 UTC m=+1298.902496217" watchObservedRunningTime="2026-03-18 12:31:05.190507952 +0000 UTC m=+1298.906333496" Mar 18 12:31:05 crc kubenswrapper[4843]: I0318 12:31:05.192455 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.076440021 podStartE2EDuration="18.192443407s" podCreationTimestamp="2026-03-18 12:30:47 +0000 UTC" firstStartedPulling="2026-03-18 12:30:50.01757999 +0000 UTC m=+1283.733405514" lastFinishedPulling="2026-03-18 12:31:04.133583376 +0000 UTC m=+1297.849408900" observedRunningTime="2026-03-18 12:31:05.164441141 +0000 UTC m=+1298.880266675" watchObservedRunningTime="2026-03-18 12:31:05.192443407 +0000 UTC m=+1298.908268941" Mar 18 12:31:06 crc kubenswrapper[4843]: I0318 12:31:06.681267 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 18 12:31:06 crc kubenswrapper[4843]: I0318 12:31:06.721617 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.103586 4843 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 18 12:31:07 crc kubenswrapper[4843]: E0318 12:31:07.122319 4843 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.205:59730->38.129.56.205:46701: write tcp 38.129.56.205:59730->38.129.56.205:46701: write: connection reset by peer Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.187745 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.431290 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.482562 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.503152 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-2mwsx"] Mar 18 12:31:07 crc kubenswrapper[4843]: E0318 12:31:07.503555 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc3bbed-56b9-41a8-baaa-b79ff13848ed" containerName="dnsmasq-dns" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.503573 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc3bbed-56b9-41a8-baaa-b79ff13848ed" containerName="dnsmasq-dns" Mar 18 12:31:07 crc kubenswrapper[4843]: E0318 12:31:07.503665 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc3bbed-56b9-41a8-baaa-b79ff13848ed" containerName="init" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.503676 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc3bbed-56b9-41a8-baaa-b79ff13848ed" containerName="init" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.503898 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcc3bbed-56b9-41a8-baaa-b79ff13848ed" 
containerName="dnsmasq-dns" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.508379 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-2mwsx" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.510391 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.532053 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-2mwsx"] Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.572627 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-bpdmm"] Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.573972 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.577167 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.593689 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-bpdmm"] Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.621571 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-config\") pod \"dnsmasq-dns-6bc7876d45-2mwsx\" (UID: \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\") " pod="openstack/dnsmasq-dns-6bc7876d45-2mwsx" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.621636 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-2mwsx\" (UID: \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\") " 
pod="openstack/dnsmasq-dns-6bc7876d45-2mwsx" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.621691 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw4bv\" (UniqueName: \"kubernetes.io/projected/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-kube-api-access-jw4bv\") pod \"dnsmasq-dns-6bc7876d45-2mwsx\" (UID: \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\") " pod="openstack/dnsmasq-dns-6bc7876d45-2mwsx" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.621731 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-2mwsx\" (UID: \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\") " pod="openstack/dnsmasq-dns-6bc7876d45-2mwsx" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.722897 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/da7f4dbf-af62-45f2-a204-578e20011760-ovs-rundir\") pod \"ovn-controller-metrics-bpdmm\" (UID: \"da7f4dbf-af62-45f2-a204-578e20011760\") " pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.722961 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/da7f4dbf-af62-45f2-a204-578e20011760-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bpdmm\" (UID: \"da7f4dbf-af62-45f2-a204-578e20011760\") " pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.722981 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da7f4dbf-af62-45f2-a204-578e20011760-combined-ca-bundle\") pod 
\"ovn-controller-metrics-bpdmm\" (UID: \"da7f4dbf-af62-45f2-a204-578e20011760\") " pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.723011 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r4mg\" (UniqueName: \"kubernetes.io/projected/da7f4dbf-af62-45f2-a204-578e20011760-kube-api-access-4r4mg\") pod \"ovn-controller-metrics-bpdmm\" (UID: \"da7f4dbf-af62-45f2-a204-578e20011760\") " pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.723165 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-2mwsx\" (UID: \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\") " pod="openstack/dnsmasq-dns-6bc7876d45-2mwsx" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.723282 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw4bv\" (UniqueName: \"kubernetes.io/projected/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-kube-api-access-jw4bv\") pod \"dnsmasq-dns-6bc7876d45-2mwsx\" (UID: \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\") " pod="openstack/dnsmasq-dns-6bc7876d45-2mwsx" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.723357 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/da7f4dbf-af62-45f2-a204-578e20011760-ovn-rundir\") pod \"ovn-controller-metrics-bpdmm\" (UID: \"da7f4dbf-af62-45f2-a204-578e20011760\") " pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.723463 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6bc7876d45-2mwsx\" (UID: \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\") " pod="openstack/dnsmasq-dns-6bc7876d45-2mwsx" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.723710 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da7f4dbf-af62-45f2-a204-578e20011760-config\") pod \"ovn-controller-metrics-bpdmm\" (UID: \"da7f4dbf-af62-45f2-a204-578e20011760\") " pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.723907 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-config\") pod \"dnsmasq-dns-6bc7876d45-2mwsx\" (UID: \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\") " pod="openstack/dnsmasq-dns-6bc7876d45-2mwsx" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.724217 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-2mwsx\" (UID: \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\") " pod="openstack/dnsmasq-dns-6bc7876d45-2mwsx" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.724266 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-2mwsx\" (UID: \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\") " pod="openstack/dnsmasq-dns-6bc7876d45-2mwsx" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.724706 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-config\") pod \"dnsmasq-dns-6bc7876d45-2mwsx\" (UID: \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\") " 
pod="openstack/dnsmasq-dns-6bc7876d45-2mwsx" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.745189 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw4bv\" (UniqueName: \"kubernetes.io/projected/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-kube-api-access-jw4bv\") pod \"dnsmasq-dns-6bc7876d45-2mwsx\" (UID: \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\") " pod="openstack/dnsmasq-dns-6bc7876d45-2mwsx" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.825518 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/da7f4dbf-af62-45f2-a204-578e20011760-ovs-rundir\") pod \"ovn-controller-metrics-bpdmm\" (UID: \"da7f4dbf-af62-45f2-a204-578e20011760\") " pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.825584 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/da7f4dbf-af62-45f2-a204-578e20011760-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bpdmm\" (UID: \"da7f4dbf-af62-45f2-a204-578e20011760\") " pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.825606 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da7f4dbf-af62-45f2-a204-578e20011760-combined-ca-bundle\") pod \"ovn-controller-metrics-bpdmm\" (UID: \"da7f4dbf-af62-45f2-a204-578e20011760\") " pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.825625 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r4mg\" (UniqueName: \"kubernetes.io/projected/da7f4dbf-af62-45f2-a204-578e20011760-kube-api-access-4r4mg\") pod \"ovn-controller-metrics-bpdmm\" (UID: \"da7f4dbf-af62-45f2-a204-578e20011760\") " 
pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.825695 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/da7f4dbf-af62-45f2-a204-578e20011760-ovn-rundir\") pod \"ovn-controller-metrics-bpdmm\" (UID: \"da7f4dbf-af62-45f2-a204-578e20011760\") " pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.825793 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da7f4dbf-af62-45f2-a204-578e20011760-config\") pod \"ovn-controller-metrics-bpdmm\" (UID: \"da7f4dbf-af62-45f2-a204-578e20011760\") " pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.825906 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/da7f4dbf-af62-45f2-a204-578e20011760-ovs-rundir\") pod \"ovn-controller-metrics-bpdmm\" (UID: \"da7f4dbf-af62-45f2-a204-578e20011760\") " pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.825969 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/da7f4dbf-af62-45f2-a204-578e20011760-ovn-rundir\") pod \"ovn-controller-metrics-bpdmm\" (UID: \"da7f4dbf-af62-45f2-a204-578e20011760\") " pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.826562 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da7f4dbf-af62-45f2-a204-578e20011760-config\") pod \"ovn-controller-metrics-bpdmm\" (UID: \"da7f4dbf-af62-45f2-a204-578e20011760\") " pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.829347 4843 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da7f4dbf-af62-45f2-a204-578e20011760-combined-ca-bundle\") pod \"ovn-controller-metrics-bpdmm\" (UID: \"da7f4dbf-af62-45f2-a204-578e20011760\") " pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.829576 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/da7f4dbf-af62-45f2-a204-578e20011760-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bpdmm\" (UID: \"da7f4dbf-af62-45f2-a204-578e20011760\") " pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.838136 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-2mwsx" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.844538 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r4mg\" (UniqueName: \"kubernetes.io/projected/da7f4dbf-af62-45f2-a204-578e20011760-kube-api-access-4r4mg\") pod \"ovn-controller-metrics-bpdmm\" (UID: \"da7f4dbf-af62-45f2-a204-578e20011760\") " pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.888554 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-bpdmm" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.951741 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-2mwsx"] Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.978922 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-zmjkx"] Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.980449 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.990444 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 18 12:31:07 crc kubenswrapper[4843]: I0318 12:31:07.991616 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-zmjkx"] Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.088199 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.088503 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.110308 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.135037 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-zmjkx\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.135070 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-zmjkx\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.135178 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-527fg\" (UniqueName: 
\"kubernetes.io/projected/3d9f91de-2026-490d-b469-59458f66daba-kube-api-access-527fg\") pod \"dnsmasq-dns-8554648995-zmjkx\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.135201 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-dns-svc\") pod \"dnsmasq-dns-8554648995-zmjkx\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.135232 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-config\") pod \"dnsmasq-dns-8554648995-zmjkx\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.151721 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.236834 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-zmjkx\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.236892 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-zmjkx\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:08 crc kubenswrapper[4843]: 
I0318 12:31:08.237010 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-527fg\" (UniqueName: \"kubernetes.io/projected/3d9f91de-2026-490d-b469-59458f66daba-kube-api-access-527fg\") pod \"dnsmasq-dns-8554648995-zmjkx\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.237046 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-dns-svc\") pod \"dnsmasq-dns-8554648995-zmjkx\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.237079 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-config\") pod \"dnsmasq-dns-8554648995-zmjkx\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.237742 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-zmjkx\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.237759 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-zmjkx\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.237895 4843 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-config\") pod \"dnsmasq-dns-8554648995-zmjkx\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.240748 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-dns-svc\") pod \"dnsmasq-dns-8554648995-zmjkx\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.256871 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-527fg\" (UniqueName: \"kubernetes.io/projected/3d9f91de-2026-490d-b469-59458f66daba-kube-api-access-527fg\") pod \"dnsmasq-dns-8554648995-zmjkx\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.301937 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.354580 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-2mwsx"] Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.460764 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-bpdmm"] Mar 18 12:31:08 crc kubenswrapper[4843]: W0318 12:31:08.465431 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda7f4dbf_af62_45f2_a204_578e20011760.slice/crio-75a6cd1982daec8ec228851eea1150477355a5d52230d402402dc8d4ca70863a WatchSource:0}: Error finding container 75a6cd1982daec8ec228851eea1150477355a5d52230d402402dc8d4ca70863a: Status 404 returned error can't find the container with id 75a6cd1982daec8ec228851eea1150477355a5d52230d402402dc8d4ca70863a Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.605500 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.611282 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.614206 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.614338 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.616177 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-ft6h7" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.616875 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.617057 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 12:31:08 crc kubenswrapper[4843]: W0318 12:31:08.728650 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d9f91de_2026_490d_b469_59458f66daba.slice/crio-0041173382cffe77d04f25447a683b3bd320309e1b714333cc0a253e6434706f WatchSource:0}: Error finding container 0041173382cffe77d04f25447a683b3bd320309e1b714333cc0a253e6434706f: Status 404 returned error can't find the container with id 0041173382cffe77d04f25447a683b3bd320309e1b714333cc0a253e6434706f Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.728734 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-zmjkx"] Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.750934 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/301b1485-ef36-42c7-a6a8-4c3619416072-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 
12:31:08.751004 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/301b1485-ef36-42c7-a6a8-4c3619416072-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.751045 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/301b1485-ef36-42c7-a6a8-4c3619416072-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.751105 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/301b1485-ef36-42c7-a6a8-4c3619416072-config\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.751133 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/301b1485-ef36-42c7-a6a8-4c3619416072-scripts\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.751165 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5v6z\" (UniqueName: \"kubernetes.io/projected/301b1485-ef36-42c7-a6a8-4c3619416072-kube-api-access-r5v6z\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.751187 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/301b1485-ef36-42c7-a6a8-4c3619416072-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.854845 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/301b1485-ef36-42c7-a6a8-4c3619416072-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.854920 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/301b1485-ef36-42c7-a6a8-4c3619416072-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.854978 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/301b1485-ef36-42c7-a6a8-4c3619416072-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.855582 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/301b1485-ef36-42c7-a6a8-4c3619416072-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.856099 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/301b1485-ef36-42c7-a6a8-4c3619416072-config\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 
18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.856490 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/301b1485-ef36-42c7-a6a8-4c3619416072-scripts\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.856526 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5v6z\" (UniqueName: \"kubernetes.io/projected/301b1485-ef36-42c7-a6a8-4c3619416072-kube-api-access-r5v6z\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.856549 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/301b1485-ef36-42c7-a6a8-4c3619416072-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.857156 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/301b1485-ef36-42c7-a6a8-4c3619416072-scripts\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.857408 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/301b1485-ef36-42c7-a6a8-4c3619416072-config\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.860317 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/301b1485-ef36-42c7-a6a8-4c3619416072-ovn-northd-tls-certs\") 
pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.861457 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/301b1485-ef36-42c7-a6a8-4c3619416072-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.861722 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/301b1485-ef36-42c7-a6a8-4c3619416072-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.888608 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5v6z\" (UniqueName: \"kubernetes.io/projected/301b1485-ef36-42c7-a6a8-4c3619416072-kube-api-access-r5v6z\") pod \"ovn-northd-0\" (UID: \"301b1485-ef36-42c7-a6a8-4c3619416072\") " pod="openstack/ovn-northd-0" Mar 18 12:31:08 crc kubenswrapper[4843]: I0318 12:31:08.926763 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.127114 4843 generic.go:334] "Generic (PLEG): container finished" podID="3d9f91de-2026-490d-b469-59458f66daba" containerID="5b654677ca317d760ea2a9fd60421dc31d05ad15d6f138aa855c3ff02ebb1d55" exitCode=0 Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.127416 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-zmjkx" event={"ID":"3d9f91de-2026-490d-b469-59458f66daba","Type":"ContainerDied","Data":"5b654677ca317d760ea2a9fd60421dc31d05ad15d6f138aa855c3ff02ebb1d55"} Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.127447 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-zmjkx" event={"ID":"3d9f91de-2026-490d-b469-59458f66daba","Type":"ContainerStarted","Data":"0041173382cffe77d04f25447a683b3bd320309e1b714333cc0a253e6434706f"} Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.152330 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bpdmm" event={"ID":"da7f4dbf-af62-45f2-a204-578e20011760","Type":"ContainerStarted","Data":"666b5095014fef44cc9c6b3e71a923f76331cd40ff38e0f0bbcd23c94cdc4365"} Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.152592 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bpdmm" event={"ID":"da7f4dbf-af62-45f2-a204-578e20011760","Type":"ContainerStarted","Data":"75a6cd1982daec8ec228851eea1150477355a5d52230d402402dc8d4ca70863a"} Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.173952 4843 generic.go:334] "Generic (PLEG): container finished" podID="23463565-55d3-4b10-9ec2-bb9dccfd2b6f" containerID="62cf7e879314db2928df88f77d7bd73111e745603a9402d69114d42aa2da183e" exitCode=0 Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.174529 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-2mwsx" 
event={"ID":"23463565-55d3-4b10-9ec2-bb9dccfd2b6f","Type":"ContainerDied","Data":"62cf7e879314db2928df88f77d7bd73111e745603a9402d69114d42aa2da183e"} Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.174686 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-2mwsx" event={"ID":"23463565-55d3-4b10-9ec2-bb9dccfd2b6f","Type":"ContainerStarted","Data":"8f1674479970eb027eefa3198d39723da68698a4a2d9b11dc23be776dea26c0b"} Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.202111 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-bpdmm" podStartSLOduration=2.202085232 podStartE2EDuration="2.202085232s" podCreationTimestamp="2026-03-18 12:31:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:09.190022369 +0000 UTC m=+1302.905847893" watchObservedRunningTime="2026-03-18 12:31:09.202085232 +0000 UTC m=+1302.917910756" Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.450035 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.450493 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.470761 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-2mwsx" Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.509762 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 12:31:09 crc kubenswrapper[4843]: W0318 12:31:09.511831 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod301b1485_ef36_42c7_a6a8_4c3619416072.slice/crio-33e1bc72e0922c014127e2c091e44650a1aeb6554b3496958f16af444858ff27 WatchSource:0}: Error finding container 33e1bc72e0922c014127e2c091e44650a1aeb6554b3496958f16af444858ff27: Status 404 returned error can't find the container with id 33e1bc72e0922c014127e2c091e44650a1aeb6554b3496958f16af444858ff27 Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.536167 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.576542 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw4bv\" (UniqueName: \"kubernetes.io/projected/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-kube-api-access-jw4bv\") pod \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\" (UID: \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\") " Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.577823 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-dns-svc\") pod \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\" (UID: \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\") " Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.578037 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-ovsdbserver-sb\") pod \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\" (UID: \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\") " 
Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.578145 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-config\") pod \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\" (UID: \"23463565-55d3-4b10-9ec2-bb9dccfd2b6f\") " Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.581635 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-kube-api-access-jw4bv" (OuterVolumeSpecName: "kube-api-access-jw4bv") pod "23463565-55d3-4b10-9ec2-bb9dccfd2b6f" (UID: "23463565-55d3-4b10-9ec2-bb9dccfd2b6f"). InnerVolumeSpecName "kube-api-access-jw4bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.600732 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-config" (OuterVolumeSpecName: "config") pod "23463565-55d3-4b10-9ec2-bb9dccfd2b6f" (UID: "23463565-55d3-4b10-9ec2-bb9dccfd2b6f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.601469 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "23463565-55d3-4b10-9ec2-bb9dccfd2b6f" (UID: "23463565-55d3-4b10-9ec2-bb9dccfd2b6f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.603004 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "23463565-55d3-4b10-9ec2-bb9dccfd2b6f" (UID: "23463565-55d3-4b10-9ec2-bb9dccfd2b6f"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.679858 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.679913 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw4bv\" (UniqueName: \"kubernetes.io/projected/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-kube-api-access-jw4bv\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.679940 4843 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.679958 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23463565-55d3-4b10-9ec2-bb9dccfd2b6f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:09 crc kubenswrapper[4843]: I0318 12:31:09.740200 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 18 12:31:10 crc kubenswrapper[4843]: I0318 12:31:10.182178 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-zmjkx" event={"ID":"3d9f91de-2026-490d-b469-59458f66daba","Type":"ContainerStarted","Data":"e28ca782cfc4d37481a4c1e42c1b8f303abca5d45d85f57adafef76609198cef"} Mar 18 12:31:10 crc kubenswrapper[4843]: I0318 12:31:10.183498 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:10 crc kubenswrapper[4843]: I0318 12:31:10.184399 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"301b1485-ef36-42c7-a6a8-4c3619416072","Type":"ContainerStarted","Data":"33e1bc72e0922c014127e2c091e44650a1aeb6554b3496958f16af444858ff27"} Mar 18 12:31:10 crc kubenswrapper[4843]: I0318 12:31:10.190777 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-2mwsx" event={"ID":"23463565-55d3-4b10-9ec2-bb9dccfd2b6f","Type":"ContainerDied","Data":"8f1674479970eb027eefa3198d39723da68698a4a2d9b11dc23be776dea26c0b"} Mar 18 12:31:10 crc kubenswrapper[4843]: I0318 12:31:10.190919 4843 scope.go:117] "RemoveContainer" containerID="62cf7e879314db2928df88f77d7bd73111e745603a9402d69114d42aa2da183e" Mar 18 12:31:10 crc kubenswrapper[4843]: I0318 12:31:10.191395 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-2mwsx" Mar 18 12:31:10 crc kubenswrapper[4843]: I0318 12:31:10.210301 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-zmjkx" podStartSLOduration=3.210282543 podStartE2EDuration="3.210282543s" podCreationTimestamp="2026-03-18 12:31:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:10.204775527 +0000 UTC m=+1303.920601051" watchObservedRunningTime="2026-03-18 12:31:10.210282543 +0000 UTC m=+1303.926108067" Mar 18 12:31:10 crc kubenswrapper[4843]: I0318 12:31:10.287305 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-2mwsx"] Mar 18 12:31:10 crc kubenswrapper[4843]: I0318 12:31:10.292549 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-2mwsx"] Mar 18 12:31:10 crc kubenswrapper[4843]: I0318 12:31:10.303732 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 12:31:10 crc kubenswrapper[4843]: I0318 12:31:10.994282 4843 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23463565-55d3-4b10-9ec2-bb9dccfd2b6f" path="/var/lib/kubelet/pods/23463565-55d3-4b10-9ec2-bb9dccfd2b6f/volumes" Mar 18 12:31:11 crc kubenswrapper[4843]: I0318 12:31:11.199173 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"301b1485-ef36-42c7-a6a8-4c3619416072","Type":"ContainerStarted","Data":"88abf2f844cbb561e2eaf61cc3399c16b8938adc9b4015a4b4b3730f01a2adfb"} Mar 18 12:31:11 crc kubenswrapper[4843]: I0318 12:31:11.199225 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"301b1485-ef36-42c7-a6a8-4c3619416072","Type":"ContainerStarted","Data":"81fe6f9e4cba2b71eb9acdd4cd3b901183e548efc8eb76b8cf2b9703e6a6b70a"} Mar 18 12:31:11 crc kubenswrapper[4843]: I0318 12:31:11.199279 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 18 12:31:11 crc kubenswrapper[4843]: I0318 12:31:11.221783 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.102905039 podStartE2EDuration="3.221763407s" podCreationTimestamp="2026-03-18 12:31:08 +0000 UTC" firstStartedPulling="2026-03-18 12:31:09.513473538 +0000 UTC m=+1303.229299062" lastFinishedPulling="2026-03-18 12:31:10.632331906 +0000 UTC m=+1304.348157430" observedRunningTime="2026-03-18 12:31:11.217168526 +0000 UTC m=+1304.932994070" watchObservedRunningTime="2026-03-18 12:31:11.221763407 +0000 UTC m=+1304.937588951" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.036846 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.188056 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-zmjkx"] Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.198569 4843 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-rhmgx"] Mar 18 12:31:12 crc kubenswrapper[4843]: E0318 12:31:12.198898 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23463565-55d3-4b10-9ec2-bb9dccfd2b6f" containerName="init" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.198914 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="23463565-55d3-4b10-9ec2-bb9dccfd2b6f" containerName="init" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.199066 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="23463565-55d3-4b10-9ec2-bb9dccfd2b6f" containerName="init" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.201372 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.231356 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-rhmgx"] Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.267907 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.341927 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-rhmgx\" (UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.341979 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l99p5\" (UniqueName: \"kubernetes.io/projected/0cfdef9a-e628-4679-9bac-c3efe39b5a41-kube-api-access-l99p5\") pod \"dnsmasq-dns-b8fbc5445-rhmgx\" (UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:12 crc kubenswrapper[4843]: 
I0318 12:31:12.342005 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-rhmgx\" (UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.342086 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-config\") pod \"dnsmasq-dns-b8fbc5445-rhmgx\" (UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.342228 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-rhmgx\" (UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.364767 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.444242 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-config\") pod \"dnsmasq-dns-b8fbc5445-rhmgx\" (UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.444416 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-rhmgx\" 
(UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.444530 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-rhmgx\" (UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.444571 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l99p5\" (UniqueName: \"kubernetes.io/projected/0cfdef9a-e628-4679-9bac-c3efe39b5a41-kube-api-access-l99p5\") pod \"dnsmasq-dns-b8fbc5445-rhmgx\" (UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.444600 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-rhmgx\" (UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.445730 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-config\") pod \"dnsmasq-dns-b8fbc5445-rhmgx\" (UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.445900 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-rhmgx\" (UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" 
Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.445917 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-rhmgx\" (UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.446186 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-rhmgx\" (UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.469637 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l99p5\" (UniqueName: \"kubernetes.io/projected/0cfdef9a-e628-4679-9bac-c3efe39b5a41-kube-api-access-l99p5\") pod \"dnsmasq-dns-b8fbc5445-rhmgx\" (UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:12 crc kubenswrapper[4843]: I0318 12:31:12.522906 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.140155 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-rhmgx"] Mar 18 12:31:13 crc kubenswrapper[4843]: W0318 12:31:13.153812 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cfdef9a_e628_4679_9bac_c3efe39b5a41.slice/crio-61271786b96714d7ec6a68b627c5591ab469e63d77dc1d06792adc71c9db152f WatchSource:0}: Error finding container 61271786b96714d7ec6a68b627c5591ab469e63d77dc1d06792adc71c9db152f: Status 404 returned error can't find the container with id 61271786b96714d7ec6a68b627c5591ab469e63d77dc1d06792adc71c9db152f Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.222477 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-zmjkx" podUID="3d9f91de-2026-490d-b469-59458f66daba" containerName="dnsmasq-dns" containerID="cri-o://e28ca782cfc4d37481a4c1e42c1b8f303abca5d45d85f57adafef76609198cef" gracePeriod=10 Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.223197 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" event={"ID":"0cfdef9a-e628-4679-9bac-c3efe39b5a41","Type":"ContainerStarted","Data":"61271786b96714d7ec6a68b627c5591ab469e63d77dc1d06792adc71c9db152f"} Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.272779 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.287987 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.290960 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.291085 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.291262 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-zvmqb" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.291364 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.311334 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.412100 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.412207 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp2zs\" (UniqueName: \"kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-kube-api-access-wp2zs\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.412259 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4e07598e-c70f-4beb-a828-b58cb64c38c0-cache\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 
12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.412397 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-etc-swift\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.412446 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4e07598e-c70f-4beb-a828-b58cb64c38c0-lock\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.412476 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e07598e-c70f-4beb-a828-b58cb64c38c0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.514456 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp2zs\" (UniqueName: \"kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-kube-api-access-wp2zs\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.514531 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4e07598e-c70f-4beb-a828-b58cb64c38c0-cache\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.514581 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-etc-swift\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.514612 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4e07598e-c70f-4beb-a828-b58cb64c38c0-lock\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:13 crc kubenswrapper[4843]: E0318 12:31:13.514795 4843 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 12:31:13 crc kubenswrapper[4843]: E0318 12:31:13.514812 4843 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 12:31:13 crc kubenswrapper[4843]: E0318 12:31:13.514867 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-etc-swift podName:4e07598e-c70f-4beb-a828-b58cb64c38c0 nodeName:}" failed. No retries permitted until 2026-03-18 12:31:14.014846707 +0000 UTC m=+1307.730672241 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-etc-swift") pod "swift-storage-0" (UID: "4e07598e-c70f-4beb-a828-b58cb64c38c0") : configmap "swift-ring-files" not found Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.515065 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4e07598e-c70f-4beb-a828-b58cb64c38c0-cache\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.515106 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4e07598e-c70f-4beb-a828-b58cb64c38c0-lock\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.516510 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e07598e-c70f-4beb-a828-b58cb64c38c0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.516706 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.517034 4843 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") device mount path \"/mnt/openstack/pv05\"" 
pod="openstack/swift-storage-0" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.522431 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e07598e-c70f-4beb-a828-b58cb64c38c0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.532768 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp2zs\" (UniqueName: \"kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-kube-api-access-wp2zs\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.559375 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.703222 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.803376 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-gx669"] Mar 18 12:31:13 crc kubenswrapper[4843]: E0318 12:31:13.803922 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d9f91de-2026-490d-b469-59458f66daba" containerName="init" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.803934 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d9f91de-2026-490d-b469-59458f66daba" containerName="init" Mar 18 12:31:13 crc kubenswrapper[4843]: E0318 12:31:13.803950 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d9f91de-2026-490d-b469-59458f66daba" containerName="dnsmasq-dns" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.803956 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d9f91de-2026-490d-b469-59458f66daba" containerName="dnsmasq-dns" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.804120 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d9f91de-2026-490d-b469-59458f66daba" containerName="dnsmasq-dns" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.804625 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.808279 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.808618 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.809244 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.821405 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-ovsdbserver-sb\") pod \"3d9f91de-2026-490d-b469-59458f66daba\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.821515 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-dns-svc\") pod \"3d9f91de-2026-490d-b469-59458f66daba\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.821565 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-config\") pod \"3d9f91de-2026-490d-b469-59458f66daba\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.821696 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-527fg\" (UniqueName: \"kubernetes.io/projected/3d9f91de-2026-490d-b469-59458f66daba-kube-api-access-527fg\") pod \"3d9f91de-2026-490d-b469-59458f66daba\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " Mar 18 12:31:13 crc 
kubenswrapper[4843]: I0318 12:31:13.821821 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-ovsdbserver-nb\") pod \"3d9f91de-2026-490d-b469-59458f66daba\" (UID: \"3d9f91de-2026-490d-b469-59458f66daba\") " Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.826169 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-gx669"] Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.826611 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d9f91de-2026-490d-b469-59458f66daba-kube-api-access-527fg" (OuterVolumeSpecName: "kube-api-access-527fg") pod "3d9f91de-2026-490d-b469-59458f66daba" (UID: "3d9f91de-2026-490d-b469-59458f66daba"). InnerVolumeSpecName "kube-api-access-527fg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.843712 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-9bkwq"] Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.844843 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.876714 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3d9f91de-2026-490d-b469-59458f66daba" (UID: "3d9f91de-2026-490d-b469-59458f66daba"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.892134 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-config" (OuterVolumeSpecName: "config") pod "3d9f91de-2026-490d-b469-59458f66daba" (UID: "3d9f91de-2026-490d-b469-59458f66daba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.896812 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9bkwq"] Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.903548 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-gx669"] Mar 18 12:31:13 crc kubenswrapper[4843]: E0318 12:31:13.904221 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-cprr4 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-gx669" podUID="503ee33f-01ae-4db7-b1a9-d571d69391aa" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.907575 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3d9f91de-2026-490d-b469-59458f66daba" (UID: "3d9f91de-2026-490d-b469-59458f66daba"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.914902 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3d9f91de-2026-490d-b469-59458f66daba" (UID: "3d9f91de-2026-490d-b469-59458f66daba"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.923427 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/503ee33f-01ae-4db7-b1a9-d571d69391aa-dispersionconf\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.923496 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqktb\" (UniqueName: \"kubernetes.io/projected/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-kube-api-access-tqktb\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.923534 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-combined-ca-bundle\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.923734 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/503ee33f-01ae-4db7-b1a9-d571d69391aa-ring-data-devices\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.923794 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-dispersionconf\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.923878 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/503ee33f-01ae-4db7-b1a9-d571d69391aa-combined-ca-bundle\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.923958 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cprr4\" (UniqueName: \"kubernetes.io/projected/503ee33f-01ae-4db7-b1a9-d571d69391aa-kube-api-access-cprr4\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.923994 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-swiftconf\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.924034 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/503ee33f-01ae-4db7-b1a9-d571d69391aa-swiftconf\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.924125 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-ring-data-devices\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.924223 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/503ee33f-01ae-4db7-b1a9-d571d69391aa-scripts\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.924265 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/503ee33f-01ae-4db7-b1a9-d571d69391aa-etc-swift\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.924297 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-scripts\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.924373 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-etc-swift\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.924531 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-527fg\" (UniqueName: 
\"kubernetes.io/projected/3d9f91de-2026-490d-b469-59458f66daba-kube-api-access-527fg\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.924557 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.924570 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.924583 4843 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:13 crc kubenswrapper[4843]: I0318 12:31:13.924597 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9f91de-2026-490d-b469-59458f66daba-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.026178 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-combined-ca-bundle\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.026250 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-etc-swift\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.026277 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/503ee33f-01ae-4db7-b1a9-d571d69391aa-ring-data-devices\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.026538 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-dispersionconf\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.026600 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/503ee33f-01ae-4db7-b1a9-d571d69391aa-combined-ca-bundle\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.026629 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cprr4\" (UniqueName: \"kubernetes.io/projected/503ee33f-01ae-4db7-b1a9-d571d69391aa-kube-api-access-cprr4\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:14 crc kubenswrapper[4843]: E0318 12:31:14.026420 4843 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.026730 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-swiftconf\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " 
pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:14 crc kubenswrapper[4843]: E0318 12:31:14.026748 4843 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 12:31:14 crc kubenswrapper[4843]: E0318 12:31:14.026830 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-etc-swift podName:4e07598e-c70f-4beb-a828-b58cb64c38c0 nodeName:}" failed. No retries permitted until 2026-03-18 12:31:15.026808467 +0000 UTC m=+1308.742633991 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-etc-swift") pod "swift-storage-0" (UID: "4e07598e-c70f-4beb-a828-b58cb64c38c0") : configmap "swift-ring-files" not found Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.026756 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/503ee33f-01ae-4db7-b1a9-d571d69391aa-swiftconf\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.027066 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-ring-data-devices\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.027137 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/503ee33f-01ae-4db7-b1a9-d571d69391aa-scripts\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " 
pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.027164 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/503ee33f-01ae-4db7-b1a9-d571d69391aa-etc-swift\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.027184 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-scripts\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.027224 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-etc-swift\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.027235 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/503ee33f-01ae-4db7-b1a9-d571d69391aa-ring-data-devices\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.027348 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/503ee33f-01ae-4db7-b1a9-d571d69391aa-dispersionconf\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.027435 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqktb\" (UniqueName: \"kubernetes.io/projected/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-kube-api-access-tqktb\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.027590 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/503ee33f-01ae-4db7-b1a9-d571d69391aa-etc-swift\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.027602 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-etc-swift\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.027780 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/503ee33f-01ae-4db7-b1a9-d571d69391aa-scripts\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.027844 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-ring-data-devices\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.028077 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-scripts\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.029555 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-dispersionconf\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.029983 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/503ee33f-01ae-4db7-b1a9-d571d69391aa-combined-ca-bundle\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.029990 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/503ee33f-01ae-4db7-b1a9-d571d69391aa-swiftconf\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.030296 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-swiftconf\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.030880 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/503ee33f-01ae-4db7-b1a9-d571d69391aa-dispersionconf\") pod \"swift-ring-rebalance-gx669\" (UID: 
\"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.031381 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-combined-ca-bundle\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.041098 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqktb\" (UniqueName: \"kubernetes.io/projected/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-kube-api-access-tqktb\") pod \"swift-ring-rebalance-9bkwq\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") " pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.041606 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cprr4\" (UniqueName: \"kubernetes.io/projected/503ee33f-01ae-4db7-b1a9-d571d69391aa-kube-api-access-cprr4\") pod \"swift-ring-rebalance-gx669\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.185695 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.237266 4843 generic.go:334] "Generic (PLEG): container finished" podID="3d9f91de-2026-490d-b469-59458f66daba" containerID="e28ca782cfc4d37481a4c1e42c1b8f303abca5d45d85f57adafef76609198cef" exitCode=0 Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.237367 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-zmjkx" event={"ID":"3d9f91de-2026-490d-b469-59458f66daba","Type":"ContainerDied","Data":"e28ca782cfc4d37481a4c1e42c1b8f303abca5d45d85f57adafef76609198cef"} Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.237400 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-zmjkx" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.237406 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-zmjkx" event={"ID":"3d9f91de-2026-490d-b469-59458f66daba","Type":"ContainerDied","Data":"0041173382cffe77d04f25447a683b3bd320309e1b714333cc0a253e6434706f"} Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.237423 4843 scope.go:117] "RemoveContainer" containerID="e28ca782cfc4d37481a4c1e42c1b8f303abca5d45d85f57adafef76609198cef" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.242186 4843 generic.go:334] "Generic (PLEG): container finished" podID="0cfdef9a-e628-4679-9bac-c3efe39b5a41" containerID="3a92f80ff331232ad5660fb596cf0ee8429ac8e0fcc6b12246afce94cad09de1" exitCode=0 Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.242265 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.243928 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" event={"ID":"0cfdef9a-e628-4679-9bac-c3efe39b5a41","Type":"ContainerDied","Data":"3a92f80ff331232ad5660fb596cf0ee8429ac8e0fcc6b12246afce94cad09de1"} Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.262040 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.273637 4843 scope.go:117] "RemoveContainer" containerID="5b654677ca317d760ea2a9fd60421dc31d05ad15d6f138aa855c3ff02ebb1d55" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.398956 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/503ee33f-01ae-4db7-b1a9-d571d69391aa-etc-swift\") pod \"503ee33f-01ae-4db7-b1a9-d571d69391aa\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.399030 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/503ee33f-01ae-4db7-b1a9-d571d69391aa-dispersionconf\") pod \"503ee33f-01ae-4db7-b1a9-d571d69391aa\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.399099 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/503ee33f-01ae-4db7-b1a9-d571d69391aa-scripts\") pod \"503ee33f-01ae-4db7-b1a9-d571d69391aa\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.399129 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/503ee33f-01ae-4db7-b1a9-d571d69391aa-combined-ca-bundle\") pod \"503ee33f-01ae-4db7-b1a9-d571d69391aa\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.399250 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cprr4\" (UniqueName: \"kubernetes.io/projected/503ee33f-01ae-4db7-b1a9-d571d69391aa-kube-api-access-cprr4\") pod \"503ee33f-01ae-4db7-b1a9-d571d69391aa\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.399290 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/503ee33f-01ae-4db7-b1a9-d571d69391aa-ring-data-devices\") pod \"503ee33f-01ae-4db7-b1a9-d571d69391aa\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.399345 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/503ee33f-01ae-4db7-b1a9-d571d69391aa-swiftconf\") pod \"503ee33f-01ae-4db7-b1a9-d571d69391aa\" (UID: \"503ee33f-01ae-4db7-b1a9-d571d69391aa\") " Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.400235 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/503ee33f-01ae-4db7-b1a9-d571d69391aa-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "503ee33f-01ae-4db7-b1a9-d571d69391aa" (UID: "503ee33f-01ae-4db7-b1a9-d571d69391aa"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.400424 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/503ee33f-01ae-4db7-b1a9-d571d69391aa-scripts" (OuterVolumeSpecName: "scripts") pod "503ee33f-01ae-4db7-b1a9-d571d69391aa" (UID: "503ee33f-01ae-4db7-b1a9-d571d69391aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.401683 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/503ee33f-01ae-4db7-b1a9-d571d69391aa-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "503ee33f-01ae-4db7-b1a9-d571d69391aa" (UID: "503ee33f-01ae-4db7-b1a9-d571d69391aa"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.407577 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/503ee33f-01ae-4db7-b1a9-d571d69391aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "503ee33f-01ae-4db7-b1a9-d571d69391aa" (UID: "503ee33f-01ae-4db7-b1a9-d571d69391aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.408591 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/503ee33f-01ae-4db7-b1a9-d571d69391aa-kube-api-access-cprr4" (OuterVolumeSpecName: "kube-api-access-cprr4") pod "503ee33f-01ae-4db7-b1a9-d571d69391aa" (UID: "503ee33f-01ae-4db7-b1a9-d571d69391aa"). InnerVolumeSpecName "kube-api-access-cprr4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.408598 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/503ee33f-01ae-4db7-b1a9-d571d69391aa-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "503ee33f-01ae-4db7-b1a9-d571d69391aa" (UID: "503ee33f-01ae-4db7-b1a9-d571d69391aa"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.408956 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/503ee33f-01ae-4db7-b1a9-d571d69391aa-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "503ee33f-01ae-4db7-b1a9-d571d69391aa" (UID: "503ee33f-01ae-4db7-b1a9-d571d69391aa"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.414450 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-zmjkx"] Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.418894 4843 scope.go:117] "RemoveContainer" containerID="e28ca782cfc4d37481a4c1e42c1b8f303abca5d45d85f57adafef76609198cef" Mar 18 12:31:14 crc kubenswrapper[4843]: E0318 12:31:14.419333 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e28ca782cfc4d37481a4c1e42c1b8f303abca5d45d85f57adafef76609198cef\": container with ID starting with e28ca782cfc4d37481a4c1e42c1b8f303abca5d45d85f57adafef76609198cef not found: ID does not exist" containerID="e28ca782cfc4d37481a4c1e42c1b8f303abca5d45d85f57adafef76609198cef" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.419364 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e28ca782cfc4d37481a4c1e42c1b8f303abca5d45d85f57adafef76609198cef"} err="failed to get container status 
\"e28ca782cfc4d37481a4c1e42c1b8f303abca5d45d85f57adafef76609198cef\": rpc error: code = NotFound desc = could not find container \"e28ca782cfc4d37481a4c1e42c1b8f303abca5d45d85f57adafef76609198cef\": container with ID starting with e28ca782cfc4d37481a4c1e42c1b8f303abca5d45d85f57adafef76609198cef not found: ID does not exist" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.419390 4843 scope.go:117] "RemoveContainer" containerID="5b654677ca317d760ea2a9fd60421dc31d05ad15d6f138aa855c3ff02ebb1d55" Mar 18 12:31:14 crc kubenswrapper[4843]: E0318 12:31:14.419726 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b654677ca317d760ea2a9fd60421dc31d05ad15d6f138aa855c3ff02ebb1d55\": container with ID starting with 5b654677ca317d760ea2a9fd60421dc31d05ad15d6f138aa855c3ff02ebb1d55 not found: ID does not exist" containerID="5b654677ca317d760ea2a9fd60421dc31d05ad15d6f138aa855c3ff02ebb1d55" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.419769 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b654677ca317d760ea2a9fd60421dc31d05ad15d6f138aa855c3ff02ebb1d55"} err="failed to get container status \"5b654677ca317d760ea2a9fd60421dc31d05ad15d6f138aa855c3ff02ebb1d55\": rpc error: code = NotFound desc = could not find container \"5b654677ca317d760ea2a9fd60421dc31d05ad15d6f138aa855c3ff02ebb1d55\": container with ID starting with 5b654677ca317d760ea2a9fd60421dc31d05ad15d6f138aa855c3ff02ebb1d55 not found: ID does not exist" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.427776 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-zmjkx"] Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.501919 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cprr4\" (UniqueName: \"kubernetes.io/projected/503ee33f-01ae-4db7-b1a9-d571d69391aa-kube-api-access-cprr4\") on node \"crc\" 
DevicePath \"\"" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.502634 4843 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/503ee33f-01ae-4db7-b1a9-d571d69391aa-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.502668 4843 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/503ee33f-01ae-4db7-b1a9-d571d69391aa-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.502678 4843 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/503ee33f-01ae-4db7-b1a9-d571d69391aa-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.502687 4843 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/503ee33f-01ae-4db7-b1a9-d571d69391aa-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.502697 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/503ee33f-01ae-4db7-b1a9-d571d69391aa-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.502705 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/503ee33f-01ae-4db7-b1a9-d571d69391aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.773280 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9bkwq"] Mar 18 12:31:14 crc kubenswrapper[4843]: I0318 12:31:14.998092 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d9f91de-2026-490d-b469-59458f66daba" 
path="/var/lib/kubelet/pods/3d9f91de-2026-490d-b469-59458f66daba/volumes" Mar 18 12:31:15 crc kubenswrapper[4843]: I0318 12:31:15.112863 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-etc-swift\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:15 crc kubenswrapper[4843]: E0318 12:31:15.113040 4843 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 12:31:15 crc kubenswrapper[4843]: E0318 12:31:15.113070 4843 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 12:31:15 crc kubenswrapper[4843]: E0318 12:31:15.113134 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-etc-swift podName:4e07598e-c70f-4beb-a828-b58cb64c38c0 nodeName:}" failed. No retries permitted until 2026-03-18 12:31:17.113114108 +0000 UTC m=+1310.828939632 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-etc-swift") pod "swift-storage-0" (UID: "4e07598e-c70f-4beb-a828-b58cb64c38c0") : configmap "swift-ring-files" not found Mar 18 12:31:15 crc kubenswrapper[4843]: I0318 12:31:15.258204 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" event={"ID":"0cfdef9a-e628-4679-9bac-c3efe39b5a41","Type":"ContainerStarted","Data":"d885470a42a8c4aef1e7dd2897a5cc93b1a08a58322a56f930810ccae974ef1e"} Mar 18 12:31:15 crc kubenswrapper[4843]: I0318 12:31:15.258321 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:15 crc kubenswrapper[4843]: I0318 12:31:15.259381 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9bkwq" event={"ID":"570c96ec-626f-41e2-bf9b-5da8f8d65fa2","Type":"ContainerStarted","Data":"ad0cca07d752acfeb428e7f833fd071f40f72bfa7a32f4e682b703f6b0ab5e32"} Mar 18 12:31:15 crc kubenswrapper[4843]: I0318 12:31:15.260322 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gx669" Mar 18 12:31:15 crc kubenswrapper[4843]: I0318 12:31:15.296980 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" podStartSLOduration=3.2969441059999998 podStartE2EDuration="3.296944106s" podCreationTimestamp="2026-03-18 12:31:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:15.290015049 +0000 UTC m=+1309.005840593" watchObservedRunningTime="2026-03-18 12:31:15.296944106 +0000 UTC m=+1309.012769630" Mar 18 12:31:15 crc kubenswrapper[4843]: I0318 12:31:15.363799 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-gx669"] Mar 18 12:31:15 crc kubenswrapper[4843]: I0318 12:31:15.378907 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-gx669"] Mar 18 12:31:16 crc kubenswrapper[4843]: I0318 12:31:16.820935 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-cznn8"] Mar 18 12:31:16 crc kubenswrapper[4843]: I0318 12:31:16.821988 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cznn8" Mar 18 12:31:16 crc kubenswrapper[4843]: I0318 12:31:16.830417 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 12:31:16 crc kubenswrapper[4843]: I0318 12:31:16.837247 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cznn8"] Mar 18 12:31:16 crc kubenswrapper[4843]: I0318 12:31:16.972773 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z2c2\" (UniqueName: \"kubernetes.io/projected/d8e7959b-d36c-4a07-86f2-77bf271876ce-kube-api-access-2z2c2\") pod \"root-account-create-update-cznn8\" (UID: \"d8e7959b-d36c-4a07-86f2-77bf271876ce\") " pod="openstack/root-account-create-update-cznn8" Mar 18 12:31:16 crc kubenswrapper[4843]: I0318 12:31:16.972916 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8e7959b-d36c-4a07-86f2-77bf271876ce-operator-scripts\") pod \"root-account-create-update-cznn8\" (UID: \"d8e7959b-d36c-4a07-86f2-77bf271876ce\") " pod="openstack/root-account-create-update-cznn8" Mar 18 12:31:17 crc kubenswrapper[4843]: I0318 12:31:16.998003 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="503ee33f-01ae-4db7-b1a9-d571d69391aa" path="/var/lib/kubelet/pods/503ee33f-01ae-4db7-b1a9-d571d69391aa/volumes" Mar 18 12:31:17 crc kubenswrapper[4843]: I0318 12:31:17.074605 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8e7959b-d36c-4a07-86f2-77bf271876ce-operator-scripts\") pod \"root-account-create-update-cznn8\" (UID: \"d8e7959b-d36c-4a07-86f2-77bf271876ce\") " pod="openstack/root-account-create-update-cznn8" Mar 18 12:31:17 crc kubenswrapper[4843]: I0318 12:31:17.074776 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z2c2\" (UniqueName: \"kubernetes.io/projected/d8e7959b-d36c-4a07-86f2-77bf271876ce-kube-api-access-2z2c2\") pod \"root-account-create-update-cznn8\" (UID: \"d8e7959b-d36c-4a07-86f2-77bf271876ce\") " pod="openstack/root-account-create-update-cznn8" Mar 18 12:31:17 crc kubenswrapper[4843]: I0318 12:31:17.076587 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8e7959b-d36c-4a07-86f2-77bf271876ce-operator-scripts\") pod \"root-account-create-update-cznn8\" (UID: \"d8e7959b-d36c-4a07-86f2-77bf271876ce\") " pod="openstack/root-account-create-update-cznn8" Mar 18 12:31:17 crc kubenswrapper[4843]: I0318 12:31:17.221954 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-etc-swift\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:17 crc kubenswrapper[4843]: E0318 12:31:17.475845 4843 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 12:31:17 crc kubenswrapper[4843]: E0318 12:31:17.475887 4843 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 12:31:17 crc kubenswrapper[4843]: E0318 12:31:17.475955 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-etc-swift podName:4e07598e-c70f-4beb-a828-b58cb64c38c0 nodeName:}" failed. No retries permitted until 2026-03-18 12:31:21.475934508 +0000 UTC m=+1315.191760032 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-etc-swift") pod "swift-storage-0" (UID: "4e07598e-c70f-4beb-a828-b58cb64c38c0") : configmap "swift-ring-files" not found Mar 18 12:31:17 crc kubenswrapper[4843]: I0318 12:31:17.487829 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z2c2\" (UniqueName: \"kubernetes.io/projected/d8e7959b-d36c-4a07-86f2-77bf271876ce-kube-api-access-2z2c2\") pod \"root-account-create-update-cznn8\" (UID: \"d8e7959b-d36c-4a07-86f2-77bf271876ce\") " pod="openstack/root-account-create-update-cznn8" Mar 18 12:31:17 crc kubenswrapper[4843]: I0318 12:31:17.748357 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cznn8" Mar 18 12:31:19 crc kubenswrapper[4843]: I0318 12:31:19.398779 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cznn8"] Mar 18 12:31:19 crc kubenswrapper[4843]: I0318 12:31:19.540808 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cznn8" event={"ID":"d8e7959b-d36c-4a07-86f2-77bf271876ce","Type":"ContainerStarted","Data":"0b4700770ebe9f1c0045968dcb02efa4c6e22f5b7d1f5445091d2eed247786cc"} Mar 18 12:31:19 crc kubenswrapper[4843]: I0318 12:31:19.542958 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9bkwq" event={"ID":"570c96ec-626f-41e2-bf9b-5da8f8d65fa2","Type":"ContainerStarted","Data":"d2fc77594d550cd3709ba26637ecf204c09dd65f5851b07c7f8027d2082d1040"} Mar 18 12:31:19 crc kubenswrapper[4843]: I0318 12:31:19.573938 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-9bkwq" podStartSLOduration=2.443100362 podStartE2EDuration="6.573923413s" podCreationTimestamp="2026-03-18 12:31:13 +0000 UTC" firstStartedPulling="2026-03-18 12:31:14.778022779 +0000 
UTC m=+1308.493848303" lastFinishedPulling="2026-03-18 12:31:18.9088458 +0000 UTC m=+1312.624671354" observedRunningTime="2026-03-18 12:31:19.570165316 +0000 UTC m=+1313.285990840" watchObservedRunningTime="2026-03-18 12:31:19.573923413 +0000 UTC m=+1313.289748937" Mar 18 12:31:19 crc kubenswrapper[4843]: I0318 12:31:19.927461 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-g6j4q"] Mar 18 12:31:19 crc kubenswrapper[4843]: I0318 12:31:19.928507 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-g6j4q"] Mar 18 12:31:19 crc kubenswrapper[4843]: I0318 12:31:19.928585 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-g6j4q" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.025486 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-d766-account-create-update-p7lfg"] Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.026888 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d766-account-create-update-p7lfg" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.028758 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.035735 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d766-account-create-update-p7lfg"] Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.061963 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2wfj\" (UniqueName: \"kubernetes.io/projected/7e792089-e775-4ff8-85db-e7cfacd8bba6-kube-api-access-k2wfj\") pod \"glance-db-create-g6j4q\" (UID: \"7e792089-e775-4ff8-85db-e7cfacd8bba6\") " pod="openstack/glance-db-create-g6j4q" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.062047 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e792089-e775-4ff8-85db-e7cfacd8bba6-operator-scripts\") pod \"glance-db-create-g6j4q\" (UID: \"7e792089-e775-4ff8-85db-e7cfacd8bba6\") " pod="openstack/glance-db-create-g6j4q" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.163344 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d49e5ae1-6976-4a45-a007-279a231ec974-operator-scripts\") pod \"glance-d766-account-create-update-p7lfg\" (UID: \"d49e5ae1-6976-4a45-a007-279a231ec974\") " pod="openstack/glance-d766-account-create-update-p7lfg" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.163442 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e792089-e775-4ff8-85db-e7cfacd8bba6-operator-scripts\") pod \"glance-db-create-g6j4q\" (UID: 
\"7e792089-e775-4ff8-85db-e7cfacd8bba6\") " pod="openstack/glance-db-create-g6j4q" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.163593 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2tq8\" (UniqueName: \"kubernetes.io/projected/d49e5ae1-6976-4a45-a007-279a231ec974-kube-api-access-n2tq8\") pod \"glance-d766-account-create-update-p7lfg\" (UID: \"d49e5ae1-6976-4a45-a007-279a231ec974\") " pod="openstack/glance-d766-account-create-update-p7lfg" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.164045 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2wfj\" (UniqueName: \"kubernetes.io/projected/7e792089-e775-4ff8-85db-e7cfacd8bba6-kube-api-access-k2wfj\") pod \"glance-db-create-g6j4q\" (UID: \"7e792089-e775-4ff8-85db-e7cfacd8bba6\") " pod="openstack/glance-db-create-g6j4q" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.164345 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e792089-e775-4ff8-85db-e7cfacd8bba6-operator-scripts\") pod \"glance-db-create-g6j4q\" (UID: \"7e792089-e775-4ff8-85db-e7cfacd8bba6\") " pod="openstack/glance-db-create-g6j4q" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.188706 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2wfj\" (UniqueName: \"kubernetes.io/projected/7e792089-e775-4ff8-85db-e7cfacd8bba6-kube-api-access-k2wfj\") pod \"glance-db-create-g6j4q\" (UID: \"7e792089-e775-4ff8-85db-e7cfacd8bba6\") " pod="openstack/glance-db-create-g6j4q" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.265146 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d49e5ae1-6976-4a45-a007-279a231ec974-operator-scripts\") pod \"glance-d766-account-create-update-p7lfg\" (UID: 
\"d49e5ae1-6976-4a45-a007-279a231ec974\") " pod="openstack/glance-d766-account-create-update-p7lfg" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.265223 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2tq8\" (UniqueName: \"kubernetes.io/projected/d49e5ae1-6976-4a45-a007-279a231ec974-kube-api-access-n2tq8\") pod \"glance-d766-account-create-update-p7lfg\" (UID: \"d49e5ae1-6976-4a45-a007-279a231ec974\") " pod="openstack/glance-d766-account-create-update-p7lfg" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.267987 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d49e5ae1-6976-4a45-a007-279a231ec974-operator-scripts\") pod \"glance-d766-account-create-update-p7lfg\" (UID: \"d49e5ae1-6976-4a45-a007-279a231ec974\") " pod="openstack/glance-d766-account-create-update-p7lfg" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.284969 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-g6j4q" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.293073 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2tq8\" (UniqueName: \"kubernetes.io/projected/d49e5ae1-6976-4a45-a007-279a231ec974-kube-api-access-n2tq8\") pod \"glance-d766-account-create-update-p7lfg\" (UID: \"d49e5ae1-6976-4a45-a007-279a231ec974\") " pod="openstack/glance-d766-account-create-update-p7lfg" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.356031 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d766-account-create-update-p7lfg" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.552905 4843 generic.go:334] "Generic (PLEG): container finished" podID="d8e7959b-d36c-4a07-86f2-77bf271876ce" containerID="7801c60089bb685762c5df2ec2757e4ff8016d98b29cd7766b6cb3bd34b37f53" exitCode=0 Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.554240 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cznn8" event={"ID":"d8e7959b-d36c-4a07-86f2-77bf271876ce","Type":"ContainerDied","Data":"7801c60089bb685762c5df2ec2757e4ff8016d98b29cd7766b6cb3bd34b37f53"} Mar 18 12:31:20 crc kubenswrapper[4843]: W0318 12:31:20.629847 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd49e5ae1_6976_4a45_a007_279a231ec974.slice/crio-009c88b05255785c201753a5f42735fd196afbacbe113f94189bcc1a863e5aba WatchSource:0}: Error finding container 009c88b05255785c201753a5f42735fd196afbacbe113f94189bcc1a863e5aba: Status 404 returned error can't find the container with id 009c88b05255785c201753a5f42735fd196afbacbe113f94189bcc1a863e5aba Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.632709 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d766-account-create-update-p7lfg"] Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.702935 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-686w4"] Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.704338 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-686w4" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.711222 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-686w4"] Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.735097 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-g6j4q"] Mar 18 12:31:20 crc kubenswrapper[4843]: W0318 12:31:20.741368 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e792089_e775_4ff8_85db_e7cfacd8bba6.slice/crio-25783ee59fab546e809491cff720cafc367a8b068cc0c0aedcec32f1003d6a21 WatchSource:0}: Error finding container 25783ee59fab546e809491cff720cafc367a8b068cc0c0aedcec32f1003d6a21: Status 404 returned error can't find the container with id 25783ee59fab546e809491cff720cafc367a8b068cc0c0aedcec32f1003d6a21 Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.804127 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6007-account-create-update-jc8kx"] Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.818701 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6007-account-create-update-jc8kx"] Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.818810 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6007-account-create-update-jc8kx" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.821294 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.881678 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfx97\" (UniqueName: \"kubernetes.io/projected/bf32744d-f781-4703-b0e2-0ca8ca852092-kube-api-access-vfx97\") pod \"keystone-db-create-686w4\" (UID: \"bf32744d-f781-4703-b0e2-0ca8ca852092\") " pod="openstack/keystone-db-create-686w4" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.881851 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf32744d-f781-4703-b0e2-0ca8ca852092-operator-scripts\") pod \"keystone-db-create-686w4\" (UID: \"bf32744d-f781-4703-b0e2-0ca8ca852092\") " pod="openstack/keystone-db-create-686w4" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.923330 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-sqt7x"] Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.924569 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-sqt7x" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.941315 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-sqt7x"] Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.985985 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b8e94f1-e634-4d66-8f8e-939ede76f529-operator-scripts\") pod \"keystone-6007-account-create-update-jc8kx\" (UID: \"9b8e94f1-e634-4d66-8f8e-939ede76f529\") " pod="openstack/keystone-6007-account-create-update-jc8kx" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.986090 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf32744d-f781-4703-b0e2-0ca8ca852092-operator-scripts\") pod \"keystone-db-create-686w4\" (UID: \"bf32744d-f781-4703-b0e2-0ca8ca852092\") " pod="openstack/keystone-db-create-686w4" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.986188 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tdbd\" (UniqueName: \"kubernetes.io/projected/9b8e94f1-e634-4d66-8f8e-939ede76f529-kube-api-access-8tdbd\") pod \"keystone-6007-account-create-update-jc8kx\" (UID: \"9b8e94f1-e634-4d66-8f8e-939ede76f529\") " pod="openstack/keystone-6007-account-create-update-jc8kx" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.986216 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfx97\" (UniqueName: \"kubernetes.io/projected/bf32744d-f781-4703-b0e2-0ca8ca852092-kube-api-access-vfx97\") pod \"keystone-db-create-686w4\" (UID: \"bf32744d-f781-4703-b0e2-0ca8ca852092\") " pod="openstack/keystone-db-create-686w4" Mar 18 12:31:20 crc kubenswrapper[4843]: I0318 12:31:20.987437 4843 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf32744d-f781-4703-b0e2-0ca8ca852092-operator-scripts\") pod \"keystone-db-create-686w4\" (UID: \"bf32744d-f781-4703-b0e2-0ca8ca852092\") " pod="openstack/keystone-db-create-686w4" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.000376 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5ee3-account-create-update-5hns2"] Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.001585 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5ee3-account-create-update-5hns2" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.004622 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.009282 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfx97\" (UniqueName: \"kubernetes.io/projected/bf32744d-f781-4703-b0e2-0ca8ca852092-kube-api-access-vfx97\") pod \"keystone-db-create-686w4\" (UID: \"bf32744d-f781-4703-b0e2-0ca8ca852092\") " pod="openstack/keystone-db-create-686w4" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.012318 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5ee3-account-create-update-5hns2"] Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.031572 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-686w4" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.088178 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b8e94f1-e634-4d66-8f8e-939ede76f529-operator-scripts\") pod \"keystone-6007-account-create-update-jc8kx\" (UID: \"9b8e94f1-e634-4d66-8f8e-939ede76f529\") " pod="openstack/keystone-6007-account-create-update-jc8kx" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.088859 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5hxl\" (UniqueName: \"kubernetes.io/projected/0d5becb7-c02d-46fd-8ced-be1b3bfcf16f-kube-api-access-b5hxl\") pod \"placement-db-create-sqt7x\" (UID: \"0d5becb7-c02d-46fd-8ced-be1b3bfcf16f\") " pod="openstack/placement-db-create-sqt7x" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.088979 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b8e94f1-e634-4d66-8f8e-939ede76f529-operator-scripts\") pod \"keystone-6007-account-create-update-jc8kx\" (UID: \"9b8e94f1-e634-4d66-8f8e-939ede76f529\") " pod="openstack/keystone-6007-account-create-update-jc8kx" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.089023 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d5becb7-c02d-46fd-8ced-be1b3bfcf16f-operator-scripts\") pod \"placement-db-create-sqt7x\" (UID: \"0d5becb7-c02d-46fd-8ced-be1b3bfcf16f\") " pod="openstack/placement-db-create-sqt7x" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.089120 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tdbd\" (UniqueName: \"kubernetes.io/projected/9b8e94f1-e634-4d66-8f8e-939ede76f529-kube-api-access-8tdbd\") pod 
\"keystone-6007-account-create-update-jc8kx\" (UID: \"9b8e94f1-e634-4d66-8f8e-939ede76f529\") " pod="openstack/keystone-6007-account-create-update-jc8kx" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.105546 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tdbd\" (UniqueName: \"kubernetes.io/projected/9b8e94f1-e634-4d66-8f8e-939ede76f529-kube-api-access-8tdbd\") pod \"keystone-6007-account-create-update-jc8kx\" (UID: \"9b8e94f1-e634-4d66-8f8e-939ede76f529\") " pod="openstack/keystone-6007-account-create-update-jc8kx" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.148829 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6007-account-create-update-jc8kx" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.191218 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66e19b12-173c-40cb-8e07-494707530bc1-operator-scripts\") pod \"placement-5ee3-account-create-update-5hns2\" (UID: \"66e19b12-173c-40cb-8e07-494707530bc1\") " pod="openstack/placement-5ee3-account-create-update-5hns2" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.191377 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5hxl\" (UniqueName: \"kubernetes.io/projected/0d5becb7-c02d-46fd-8ced-be1b3bfcf16f-kube-api-access-b5hxl\") pod \"placement-db-create-sqt7x\" (UID: \"0d5becb7-c02d-46fd-8ced-be1b3bfcf16f\") " pod="openstack/placement-db-create-sqt7x" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.191471 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hghk9\" (UniqueName: \"kubernetes.io/projected/66e19b12-173c-40cb-8e07-494707530bc1-kube-api-access-hghk9\") pod \"placement-5ee3-account-create-update-5hns2\" (UID: \"66e19b12-173c-40cb-8e07-494707530bc1\") " 
pod="openstack/placement-5ee3-account-create-update-5hns2" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.191515 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d5becb7-c02d-46fd-8ced-be1b3bfcf16f-operator-scripts\") pod \"placement-db-create-sqt7x\" (UID: \"0d5becb7-c02d-46fd-8ced-be1b3bfcf16f\") " pod="openstack/placement-db-create-sqt7x" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.197079 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d5becb7-c02d-46fd-8ced-be1b3bfcf16f-operator-scripts\") pod \"placement-db-create-sqt7x\" (UID: \"0d5becb7-c02d-46fd-8ced-be1b3bfcf16f\") " pod="openstack/placement-db-create-sqt7x" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.209513 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5hxl\" (UniqueName: \"kubernetes.io/projected/0d5becb7-c02d-46fd-8ced-be1b3bfcf16f-kube-api-access-b5hxl\") pod \"placement-db-create-sqt7x\" (UID: \"0d5becb7-c02d-46fd-8ced-be1b3bfcf16f\") " pod="openstack/placement-db-create-sqt7x" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.242670 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-sqt7x" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.294191 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66e19b12-173c-40cb-8e07-494707530bc1-operator-scripts\") pod \"placement-5ee3-account-create-update-5hns2\" (UID: \"66e19b12-173c-40cb-8e07-494707530bc1\") " pod="openstack/placement-5ee3-account-create-update-5hns2" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.294866 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hghk9\" (UniqueName: \"kubernetes.io/projected/66e19b12-173c-40cb-8e07-494707530bc1-kube-api-access-hghk9\") pod \"placement-5ee3-account-create-update-5hns2\" (UID: \"66e19b12-173c-40cb-8e07-494707530bc1\") " pod="openstack/placement-5ee3-account-create-update-5hns2" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.295164 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66e19b12-173c-40cb-8e07-494707530bc1-operator-scripts\") pod \"placement-5ee3-account-create-update-5hns2\" (UID: \"66e19b12-173c-40cb-8e07-494707530bc1\") " pod="openstack/placement-5ee3-account-create-update-5hns2" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.311187 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hghk9\" (UniqueName: \"kubernetes.io/projected/66e19b12-173c-40cb-8e07-494707530bc1-kube-api-access-hghk9\") pod \"placement-5ee3-account-create-update-5hns2\" (UID: \"66e19b12-173c-40cb-8e07-494707530bc1\") " pod="openstack/placement-5ee3-account-create-update-5hns2" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.412084 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5ee3-account-create-update-5hns2" Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.467855 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-686w4"] Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.477210 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6007-account-create-update-jc8kx"] Mar 18 12:31:21 crc kubenswrapper[4843]: W0318 12:31:21.485217 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf32744d_f781_4703_b0e2_0ca8ca852092.slice/crio-3645d5a8f7b9824510f16829454abd88f5b2965851001696843da588486236ae WatchSource:0}: Error finding container 3645d5a8f7b9824510f16829454abd88f5b2965851001696843da588486236ae: Status 404 returned error can't find the container with id 3645d5a8f7b9824510f16829454abd88f5b2965851001696843da588486236ae Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.514781 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-etc-swift\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0" Mar 18 12:31:21 crc kubenswrapper[4843]: E0318 12:31:21.514956 4843 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 12:31:21 crc kubenswrapper[4843]: E0318 12:31:21.514981 4843 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 12:31:21 crc kubenswrapper[4843]: E0318 12:31:21.515037 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-etc-swift podName:4e07598e-c70f-4beb-a828-b58cb64c38c0 nodeName:}" failed. 
No retries permitted until 2026-03-18 12:31:29.515020514 +0000 UTC m=+1323.230846038 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-etc-swift") pod "swift-storage-0" (UID: "4e07598e-c70f-4beb-a828-b58cb64c38c0") : configmap "swift-ring-files" not found Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.560857 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-686w4" event={"ID":"bf32744d-f781-4703-b0e2-0ca8ca852092","Type":"ContainerStarted","Data":"3645d5a8f7b9824510f16829454abd88f5b2965851001696843da588486236ae"} Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.561817 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d766-account-create-update-p7lfg" event={"ID":"d49e5ae1-6976-4a45-a007-279a231ec974","Type":"ContainerStarted","Data":"009c88b05255785c201753a5f42735fd196afbacbe113f94189bcc1a863e5aba"} Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.564570 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6007-account-create-update-jc8kx" event={"ID":"9b8e94f1-e634-4d66-8f8e-939ede76f529","Type":"ContainerStarted","Data":"48d90f8809c4f7f48b2b557a77b033193a92222bfc6470e35c763073451fc47c"} Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.565458 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-g6j4q" event={"ID":"7e792089-e775-4ff8-85db-e7cfacd8bba6","Type":"ContainerStarted","Data":"25783ee59fab546e809491cff720cafc367a8b068cc0c0aedcec32f1003d6a21"} Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.832601 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-sqt7x"] Mar 18 12:31:21 crc kubenswrapper[4843]: W0318 12:31:21.834823 4843 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d5becb7_c02d_46fd_8ced_be1b3bfcf16f.slice/crio-0d35e7f82111c281a48b36fcff31769dfb50482018a8b333af11b323880c5472 WatchSource:0}: Error finding container 0d35e7f82111c281a48b36fcff31769dfb50482018a8b333af11b323880c5472: Status 404 returned error can't find the container with id 0d35e7f82111c281a48b36fcff31769dfb50482018a8b333af11b323880c5472 Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.887431 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5ee3-account-create-update-5hns2"] Mar 18 12:31:21 crc kubenswrapper[4843]: I0318 12:31:21.889835 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cznn8" Mar 18 12:31:21 crc kubenswrapper[4843]: W0318 12:31:21.894166 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66e19b12_173c_40cb_8e07_494707530bc1.slice/crio-d11f17e1f6cb95c9a553670b0fa3e0424c7cffa9f29b8d567f00c06c782f8147 WatchSource:0}: Error finding container d11f17e1f6cb95c9a553670b0fa3e0424c7cffa9f29b8d567f00c06c782f8147: Status 404 returned error can't find the container with id d11f17e1f6cb95c9a553670b0fa3e0424c7cffa9f29b8d567f00c06c782f8147 Mar 18 12:31:22 crc kubenswrapper[4843]: I0318 12:31:22.022983 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z2c2\" (UniqueName: \"kubernetes.io/projected/d8e7959b-d36c-4a07-86f2-77bf271876ce-kube-api-access-2z2c2\") pod \"d8e7959b-d36c-4a07-86f2-77bf271876ce\" (UID: \"d8e7959b-d36c-4a07-86f2-77bf271876ce\") " Mar 18 12:31:22 crc kubenswrapper[4843]: I0318 12:31:22.023328 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8e7959b-d36c-4a07-86f2-77bf271876ce-operator-scripts\") pod 
\"d8e7959b-d36c-4a07-86f2-77bf271876ce\" (UID: \"d8e7959b-d36c-4a07-86f2-77bf271876ce\") " Mar 18 12:31:22 crc kubenswrapper[4843]: I0318 12:31:22.024395 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e7959b-d36c-4a07-86f2-77bf271876ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8e7959b-d36c-4a07-86f2-77bf271876ce" (UID: "d8e7959b-d36c-4a07-86f2-77bf271876ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:22 crc kubenswrapper[4843]: I0318 12:31:22.028722 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8e7959b-d36c-4a07-86f2-77bf271876ce-kube-api-access-2z2c2" (OuterVolumeSpecName: "kube-api-access-2z2c2") pod "d8e7959b-d36c-4a07-86f2-77bf271876ce" (UID: "d8e7959b-d36c-4a07-86f2-77bf271876ce"). InnerVolumeSpecName "kube-api-access-2z2c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:22 crc kubenswrapper[4843]: I0318 12:31:22.125419 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8e7959b-d36c-4a07-86f2-77bf271876ce-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:22 crc kubenswrapper[4843]: I0318 12:31:22.125446 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z2c2\" (UniqueName: \"kubernetes.io/projected/d8e7959b-d36c-4a07-86f2-77bf271876ce-kube-api-access-2z2c2\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:22 crc kubenswrapper[4843]: I0318 12:31:22.538942 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:22 crc kubenswrapper[4843]: I0318 12:31:22.587810 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cznn8" 
event={"ID":"d8e7959b-d36c-4a07-86f2-77bf271876ce","Type":"ContainerDied","Data":"0b4700770ebe9f1c0045968dcb02efa4c6e22f5b7d1f5445091d2eed247786cc"} Mar 18 12:31:22 crc kubenswrapper[4843]: I0318 12:31:22.587881 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b4700770ebe9f1c0045968dcb02efa4c6e22f5b7d1f5445091d2eed247786cc" Mar 18 12:31:22 crc kubenswrapper[4843]: I0318 12:31:22.587827 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cznn8" Mar 18 12:31:22 crc kubenswrapper[4843]: I0318 12:31:22.590009 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ee3-account-create-update-5hns2" event={"ID":"66e19b12-173c-40cb-8e07-494707530bc1","Type":"ContainerStarted","Data":"d11f17e1f6cb95c9a553670b0fa3e0424c7cffa9f29b8d567f00c06c782f8147"} Mar 18 12:31:22 crc kubenswrapper[4843]: I0318 12:31:22.591846 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-sqt7x" event={"ID":"0d5becb7-c02d-46fd-8ced-be1b3bfcf16f","Type":"ContainerStarted","Data":"0d35e7f82111c281a48b36fcff31769dfb50482018a8b333af11b323880c5472"} Mar 18 12:31:22 crc kubenswrapper[4843]: I0318 12:31:22.608785 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bgzck"] Mar 18 12:31:22 crc kubenswrapper[4843]: I0318 12:31:22.609006 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" podUID="91d7bffb-7c24-4a70-a412-258080407683" containerName="dnsmasq-dns" containerID="cri-o://87509fb4c14fd3cb642c566b345ac28b606679367470450070b16ae58ef7e137" gracePeriod=10 Mar 18 12:31:23 crc kubenswrapper[4843]: I0318 12:31:23.164601 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-cznn8"] Mar 18 12:31:23 crc kubenswrapper[4843]: I0318 12:31:23.175782 4843 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/root-account-create-update-cznn8"] Mar 18 12:31:23 crc kubenswrapper[4843]: I0318 12:31:23.601919 4843 generic.go:334] "Generic (PLEG): container finished" podID="91d7bffb-7c24-4a70-a412-258080407683" containerID="87509fb4c14fd3cb642c566b345ac28b606679367470450070b16ae58ef7e137" exitCode=0 Mar 18 12:31:23 crc kubenswrapper[4843]: I0318 12:31:23.601979 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" event={"ID":"91d7bffb-7c24-4a70-a412-258080407683","Type":"ContainerDied","Data":"87509fb4c14fd3cb642c566b345ac28b606679367470450070b16ae58ef7e137"} Mar 18 12:31:24 crc kubenswrapper[4843]: I0318 12:31:24.619194 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6007-account-create-update-jc8kx" event={"ID":"9b8e94f1-e634-4d66-8f8e-939ede76f529","Type":"ContainerStarted","Data":"d870cba98f1e27e5b426eb39e65593aa769e1143aef02c8d8739ea6a6ad4505c"} Mar 18 12:31:24 crc kubenswrapper[4843]: I0318 12:31:24.627147 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ee3-account-create-update-5hns2" event={"ID":"66e19b12-173c-40cb-8e07-494707530bc1","Type":"ContainerStarted","Data":"1336150f6db4c3268a81a5115bbdc5f4e84985ef4ad52f512d9aaebf978ff5fc"} Mar 18 12:31:24 crc kubenswrapper[4843]: I0318 12:31:24.635102 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-g6j4q" event={"ID":"7e792089-e775-4ff8-85db-e7cfacd8bba6","Type":"ContainerStarted","Data":"fee1484bbff24cc68525b9dbb843b766538eeff0e6e48955fa69c7bda4cdec85"} Mar 18 12:31:24 crc kubenswrapper[4843]: I0318 12:31:24.637962 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-686w4" event={"ID":"bf32744d-f781-4703-b0e2-0ca8ca852092","Type":"ContainerStarted","Data":"1e6591e4d061c1a956713a72c32f83b6f4fa18d77f327e88dc1d5d74205818d8"} Mar 18 12:31:24 crc kubenswrapper[4843]: I0318 12:31:24.647718 4843 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6007-account-create-update-jc8kx" podStartSLOduration=4.64769681 podStartE2EDuration="4.64769681s" podCreationTimestamp="2026-03-18 12:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:24.645640432 +0000 UTC m=+1318.361465966" watchObservedRunningTime="2026-03-18 12:31:24.64769681 +0000 UTC m=+1318.363522344" Mar 18 12:31:24 crc kubenswrapper[4843]: I0318 12:31:24.653616 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d766-account-create-update-p7lfg" event={"ID":"d49e5ae1-6976-4a45-a007-279a231ec974","Type":"ContainerStarted","Data":"9dd859577e64e8d95fb5e62e170d352f8731175bb06d53be92e529ed3818d6f1"} Mar 18 12:31:24 crc kubenswrapper[4843]: I0318 12:31:24.666111 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-sqt7x" event={"ID":"0d5becb7-c02d-46fd-8ced-be1b3bfcf16f","Type":"ContainerStarted","Data":"84f1e80a55595ce219003cd95ea8901fed26fe1122aad71ab147263581baf074"} Mar 18 12:31:24 crc kubenswrapper[4843]: I0318 12:31:24.673854 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5ee3-account-create-update-5hns2" podStartSLOduration=4.673837094 podStartE2EDuration="4.673837094s" podCreationTimestamp="2026-03-18 12:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:24.666492925 +0000 UTC m=+1318.382318459" watchObservedRunningTime="2026-03-18 12:31:24.673837094 +0000 UTC m=+1318.389662618" Mar 18 12:31:24 crc kubenswrapper[4843]: I0318 12:31:24.707364 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-686w4" podStartSLOduration=4.707347887 podStartE2EDuration="4.707347887s" 
podCreationTimestamp="2026-03-18 12:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:24.703024254 +0000 UTC m=+1318.418849778" watchObservedRunningTime="2026-03-18 12:31:24.707347887 +0000 UTC m=+1318.423173411"
Mar 18 12:31:24 crc kubenswrapper[4843]: I0318 12:31:24.707803 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-g6j4q" podStartSLOduration=5.707794239 podStartE2EDuration="5.707794239s" podCreationTimestamp="2026-03-18 12:31:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:24.689841099 +0000 UTC m=+1318.405666623" watchObservedRunningTime="2026-03-18 12:31:24.707794239 +0000 UTC m=+1318.423619763"
Mar 18 12:31:24 crc kubenswrapper[4843]: I0318 12:31:24.735279 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-d766-account-create-update-p7lfg" podStartSLOduration=4.73525748 podStartE2EDuration="4.73525748s" podCreationTimestamp="2026-03-18 12:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:24.723335311 +0000 UTC m=+1318.439160855" watchObservedRunningTime="2026-03-18 12:31:24.73525748 +0000 UTC m=+1318.451083004"
Mar 18 12:31:24 crc kubenswrapper[4843]: I0318 12:31:24.750634 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-sqt7x" podStartSLOduration=4.750617017 podStartE2EDuration="4.750617017s" podCreationTimestamp="2026-03-18 12:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:24.744363649 +0000 UTC m=+1318.460189173" watchObservedRunningTime="2026-03-18 12:31:24.750617017 +0000 UTC m=+1318.466442541"
Mar 18 12:31:24 crc kubenswrapper[4843]: I0318 12:31:24.999556 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8e7959b-d36c-4a07-86f2-77bf271876ce" path="/var/lib/kubelet/pods/d8e7959b-d36c-4a07-86f2-77bf271876ce/volumes"
Mar 18 12:31:25 crc kubenswrapper[4843]: E0318 12:31:25.020402 4843 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66e19b12_173c_40cb_8e07_494707530bc1.slice/crio-conmon-1336150f6db4c3268a81a5115bbdc5f4e84985ef4ad52f512d9aaebf978ff5fc.scope\": RecentStats: unable to find data in memory cache]"
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.193757 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bgzck"
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.295352 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfxsn\" (UniqueName: \"kubernetes.io/projected/91d7bffb-7c24-4a70-a412-258080407683-kube-api-access-bfxsn\") pod \"91d7bffb-7c24-4a70-a412-258080407683\" (UID: \"91d7bffb-7c24-4a70-a412-258080407683\") "
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.295618 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91d7bffb-7c24-4a70-a412-258080407683-dns-svc\") pod \"91d7bffb-7c24-4a70-a412-258080407683\" (UID: \"91d7bffb-7c24-4a70-a412-258080407683\") "
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.295682 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91d7bffb-7c24-4a70-a412-258080407683-config\") pod \"91d7bffb-7c24-4a70-a412-258080407683\" (UID: \"91d7bffb-7c24-4a70-a412-258080407683\") "
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.300713 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91d7bffb-7c24-4a70-a412-258080407683-kube-api-access-bfxsn" (OuterVolumeSpecName: "kube-api-access-bfxsn") pod "91d7bffb-7c24-4a70-a412-258080407683" (UID: "91d7bffb-7c24-4a70-a412-258080407683"). InnerVolumeSpecName "kube-api-access-bfxsn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.368484 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91d7bffb-7c24-4a70-a412-258080407683-config" (OuterVolumeSpecName: "config") pod "91d7bffb-7c24-4a70-a412-258080407683" (UID: "91d7bffb-7c24-4a70-a412-258080407683"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.372891 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91d7bffb-7c24-4a70-a412-258080407683-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "91d7bffb-7c24-4a70-a412-258080407683" (UID: "91d7bffb-7c24-4a70-a412-258080407683"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.398526 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfxsn\" (UniqueName: \"kubernetes.io/projected/91d7bffb-7c24-4a70-a412-258080407683-kube-api-access-bfxsn\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.398561 4843 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91d7bffb-7c24-4a70-a412-258080407683-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.398570 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91d7bffb-7c24-4a70-a412-258080407683-config\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.680689 4843 generic.go:334] "Generic (PLEG): container finished" podID="0d5becb7-c02d-46fd-8ced-be1b3bfcf16f" containerID="84f1e80a55595ce219003cd95ea8901fed26fe1122aad71ab147263581baf074" exitCode=0
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.680782 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-sqt7x" event={"ID":"0d5becb7-c02d-46fd-8ced-be1b3bfcf16f","Type":"ContainerDied","Data":"84f1e80a55595ce219003cd95ea8901fed26fe1122aad71ab147263581baf074"}
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.683537 4843 generic.go:334] "Generic (PLEG): container finished" podID="9b8e94f1-e634-4d66-8f8e-939ede76f529" containerID="d870cba98f1e27e5b426eb39e65593aa769e1143aef02c8d8739ea6a6ad4505c" exitCode=0
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.683608 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6007-account-create-update-jc8kx" event={"ID":"9b8e94f1-e634-4d66-8f8e-939ede76f529","Type":"ContainerDied","Data":"d870cba98f1e27e5b426eb39e65593aa769e1143aef02c8d8739ea6a6ad4505c"}
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.685625 4843 generic.go:334] "Generic (PLEG): container finished" podID="66e19b12-173c-40cb-8e07-494707530bc1" containerID="1336150f6db4c3268a81a5115bbdc5f4e84985ef4ad52f512d9aaebf978ff5fc" exitCode=0
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.685724 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ee3-account-create-update-5hns2" event={"ID":"66e19b12-173c-40cb-8e07-494707530bc1","Type":"ContainerDied","Data":"1336150f6db4c3268a81a5115bbdc5f4e84985ef4ad52f512d9aaebf978ff5fc"}
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.687242 4843 generic.go:334] "Generic (PLEG): container finished" podID="7e792089-e775-4ff8-85db-e7cfacd8bba6" containerID="fee1484bbff24cc68525b9dbb843b766538eeff0e6e48955fa69c7bda4cdec85" exitCode=0
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.687337 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-g6j4q" event={"ID":"7e792089-e775-4ff8-85db-e7cfacd8bba6","Type":"ContainerDied","Data":"fee1484bbff24cc68525b9dbb843b766538eeff0e6e48955fa69c7bda4cdec85"}
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.689447 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bgzck" event={"ID":"91d7bffb-7c24-4a70-a412-258080407683","Type":"ContainerDied","Data":"ca8bbf1a9465fea6818f7cce1e37652f79ee2cbed8e8ea9238c8fb9944e22b29"}
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.689494 4843 scope.go:117] "RemoveContainer" containerID="87509fb4c14fd3cb642c566b345ac28b606679367470450070b16ae58ef7e137"
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.689703 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bgzck"
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.697915 4843 generic.go:334] "Generic (PLEG): container finished" podID="bf32744d-f781-4703-b0e2-0ca8ca852092" containerID="1e6591e4d061c1a956713a72c32f83b6f4fa18d77f327e88dc1d5d74205818d8" exitCode=0
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.697997 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-686w4" event={"ID":"bf32744d-f781-4703-b0e2-0ca8ca852092","Type":"ContainerDied","Data":"1e6591e4d061c1a956713a72c32f83b6f4fa18d77f327e88dc1d5d74205818d8"}
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.699913 4843 generic.go:334] "Generic (PLEG): container finished" podID="d49e5ae1-6976-4a45-a007-279a231ec974" containerID="9dd859577e64e8d95fb5e62e170d352f8731175bb06d53be92e529ed3818d6f1" exitCode=0
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.699940 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d766-account-create-update-p7lfg" event={"ID":"d49e5ae1-6976-4a45-a007-279a231ec974","Type":"ContainerDied","Data":"9dd859577e64e8d95fb5e62e170d352f8731175bb06d53be92e529ed3818d6f1"}
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.766010 4843 scope.go:117] "RemoveContainer" containerID="37d5b9b85eac1267ae67eea2778b78b9af5de3c132a92be41b1de304c90c36e9"
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.839984 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bgzck"]
Mar 18 12:31:25 crc kubenswrapper[4843]: I0318 12:31:25.845627 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bgzck"]
Mar 18 12:31:26 crc kubenswrapper[4843]: I0318 12:31:26.881391 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-s68w5"]
Mar 18 12:31:26 crc kubenswrapper[4843]: E0318 12:31:26.882131 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d7bffb-7c24-4a70-a412-258080407683" containerName="init"
Mar 18 12:31:26 crc kubenswrapper[4843]: I0318 12:31:26.882147 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d7bffb-7c24-4a70-a412-258080407683" containerName="init"
Mar 18 12:31:26 crc kubenswrapper[4843]: E0318 12:31:26.882174 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d7bffb-7c24-4a70-a412-258080407683" containerName="dnsmasq-dns"
Mar 18 12:31:26 crc kubenswrapper[4843]: I0318 12:31:26.882182 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d7bffb-7c24-4a70-a412-258080407683" containerName="dnsmasq-dns"
Mar 18 12:31:26 crc kubenswrapper[4843]: E0318 12:31:26.882202 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e7959b-d36c-4a07-86f2-77bf271876ce" containerName="mariadb-account-create-update"
Mar 18 12:31:26 crc kubenswrapper[4843]: I0318 12:31:26.882210 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e7959b-d36c-4a07-86f2-77bf271876ce" containerName="mariadb-account-create-update"
Mar 18 12:31:26 crc kubenswrapper[4843]: I0318 12:31:26.882454 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8e7959b-d36c-4a07-86f2-77bf271876ce" containerName="mariadb-account-create-update"
Mar 18 12:31:26 crc kubenswrapper[4843]: I0318 12:31:26.882476 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="91d7bffb-7c24-4a70-a412-258080407683" containerName="dnsmasq-dns"
Mar 18 12:31:26 crc kubenswrapper[4843]: I0318 12:31:26.883175 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-s68w5"
Mar 18 12:31:26 crc kubenswrapper[4843]: I0318 12:31:26.885868 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 18 12:31:26 crc kubenswrapper[4843]: I0318 12:31:26.902812 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-s68w5"]
Mar 18 12:31:26 crc kubenswrapper[4843]: I0318 12:31:26.926035 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bb0988d-75e1-4300-a2ac-16becc785231-operator-scripts\") pod \"root-account-create-update-s68w5\" (UID: \"7bb0988d-75e1-4300-a2ac-16becc785231\") " pod="openstack/root-account-create-update-s68w5"
Mar 18 12:31:26 crc kubenswrapper[4843]: I0318 12:31:26.926167 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lvgf\" (UniqueName: \"kubernetes.io/projected/7bb0988d-75e1-4300-a2ac-16becc785231-kube-api-access-5lvgf\") pod \"root-account-create-update-s68w5\" (UID: \"7bb0988d-75e1-4300-a2ac-16becc785231\") " pod="openstack/root-account-create-update-s68w5"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.006413 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91d7bffb-7c24-4a70-a412-258080407683" path="/var/lib/kubelet/pods/91d7bffb-7c24-4a70-a412-258080407683/volumes"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.027131 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bb0988d-75e1-4300-a2ac-16becc785231-operator-scripts\") pod \"root-account-create-update-s68w5\" (UID: \"7bb0988d-75e1-4300-a2ac-16becc785231\") " pod="openstack/root-account-create-update-s68w5"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.027683 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lvgf\" (UniqueName: \"kubernetes.io/projected/7bb0988d-75e1-4300-a2ac-16becc785231-kube-api-access-5lvgf\") pod \"root-account-create-update-s68w5\" (UID: \"7bb0988d-75e1-4300-a2ac-16becc785231\") " pod="openstack/root-account-create-update-s68w5"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.028640 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bb0988d-75e1-4300-a2ac-16becc785231-operator-scripts\") pod \"root-account-create-update-s68w5\" (UID: \"7bb0988d-75e1-4300-a2ac-16becc785231\") " pod="openstack/root-account-create-update-s68w5"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.067333 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lvgf\" (UniqueName: \"kubernetes.io/projected/7bb0988d-75e1-4300-a2ac-16becc785231-kube-api-access-5lvgf\") pod \"root-account-create-update-s68w5\" (UID: \"7bb0988d-75e1-4300-a2ac-16becc785231\") " pod="openstack/root-account-create-update-s68w5"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.133397 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-686w4"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.212185 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-s68w5"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.319388 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6007-account-create-update-jc8kx"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.344917 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdbd\" (UniqueName: \"kubernetes.io/projected/9b8e94f1-e634-4d66-8f8e-939ede76f529-kube-api-access-8tdbd\") pod \"9b8e94f1-e634-4d66-8f8e-939ede76f529\" (UID: \"9b8e94f1-e634-4d66-8f8e-939ede76f529\") "
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.345147 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfx97\" (UniqueName: \"kubernetes.io/projected/bf32744d-f781-4703-b0e2-0ca8ca852092-kube-api-access-vfx97\") pod \"bf32744d-f781-4703-b0e2-0ca8ca852092\" (UID: \"bf32744d-f781-4703-b0e2-0ca8ca852092\") "
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.345191 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf32744d-f781-4703-b0e2-0ca8ca852092-operator-scripts\") pod \"bf32744d-f781-4703-b0e2-0ca8ca852092\" (UID: \"bf32744d-f781-4703-b0e2-0ca8ca852092\") "
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.346475 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf32744d-f781-4703-b0e2-0ca8ca852092-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf32744d-f781-4703-b0e2-0ca8ca852092" (UID: "bf32744d-f781-4703-b0e2-0ca8ca852092"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.347631 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-sqt7x"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.356187 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5ee3-account-create-update-5hns2"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.362931 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b8e94f1-e634-4d66-8f8e-939ede76f529-kube-api-access-8tdbd" (OuterVolumeSpecName: "kube-api-access-8tdbd") pod "9b8e94f1-e634-4d66-8f8e-939ede76f529" (UID: "9b8e94f1-e634-4d66-8f8e-939ede76f529"). InnerVolumeSpecName "kube-api-access-8tdbd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.363490 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf32744d-f781-4703-b0e2-0ca8ca852092-kube-api-access-vfx97" (OuterVolumeSpecName: "kube-api-access-vfx97") pod "bf32744d-f781-4703-b0e2-0ca8ca852092" (UID: "bf32744d-f781-4703-b0e2-0ca8ca852092"). InnerVolumeSpecName "kube-api-access-vfx97". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.366163 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-g6j4q"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.388356 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d766-account-create-update-p7lfg"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.447932 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2tq8\" (UniqueName: \"kubernetes.io/projected/d49e5ae1-6976-4a45-a007-279a231ec974-kube-api-access-n2tq8\") pod \"d49e5ae1-6976-4a45-a007-279a231ec974\" (UID: \"d49e5ae1-6976-4a45-a007-279a231ec974\") "
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.448011 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2wfj\" (UniqueName: \"kubernetes.io/projected/7e792089-e775-4ff8-85db-e7cfacd8bba6-kube-api-access-k2wfj\") pod \"7e792089-e775-4ff8-85db-e7cfacd8bba6\" (UID: \"7e792089-e775-4ff8-85db-e7cfacd8bba6\") "
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.448118 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e792089-e775-4ff8-85db-e7cfacd8bba6-operator-scripts\") pod \"7e792089-e775-4ff8-85db-e7cfacd8bba6\" (UID: \"7e792089-e775-4ff8-85db-e7cfacd8bba6\") "
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.448206 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5hxl\" (UniqueName: \"kubernetes.io/projected/0d5becb7-c02d-46fd-8ced-be1b3bfcf16f-kube-api-access-b5hxl\") pod \"0d5becb7-c02d-46fd-8ced-be1b3bfcf16f\" (UID: \"0d5becb7-c02d-46fd-8ced-be1b3bfcf16f\") "
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.448260 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b8e94f1-e634-4d66-8f8e-939ede76f529-operator-scripts\") pod \"9b8e94f1-e634-4d66-8f8e-939ede76f529\" (UID: \"9b8e94f1-e634-4d66-8f8e-939ede76f529\") "
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.448316 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d5becb7-c02d-46fd-8ced-be1b3bfcf16f-operator-scripts\") pod \"0d5becb7-c02d-46fd-8ced-be1b3bfcf16f\" (UID: \"0d5becb7-c02d-46fd-8ced-be1b3bfcf16f\") "
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.448362 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hghk9\" (UniqueName: \"kubernetes.io/projected/66e19b12-173c-40cb-8e07-494707530bc1-kube-api-access-hghk9\") pod \"66e19b12-173c-40cb-8e07-494707530bc1\" (UID: \"66e19b12-173c-40cb-8e07-494707530bc1\") "
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.448396 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d49e5ae1-6976-4a45-a007-279a231ec974-operator-scripts\") pod \"d49e5ae1-6976-4a45-a007-279a231ec974\" (UID: \"d49e5ae1-6976-4a45-a007-279a231ec974\") "
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.448480 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66e19b12-173c-40cb-8e07-494707530bc1-operator-scripts\") pod \"66e19b12-173c-40cb-8e07-494707530bc1\" (UID: \"66e19b12-173c-40cb-8e07-494707530bc1\") "
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.449225 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfx97\" (UniqueName: \"kubernetes.io/projected/bf32744d-f781-4703-b0e2-0ca8ca852092-kube-api-access-vfx97\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.449252 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf32744d-f781-4703-b0e2-0ca8ca852092-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.449267 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdbd\" (UniqueName: \"kubernetes.io/projected/9b8e94f1-e634-4d66-8f8e-939ede76f529-kube-api-access-8tdbd\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.449557 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66e19b12-173c-40cb-8e07-494707530bc1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66e19b12-173c-40cb-8e07-494707530bc1" (UID: "66e19b12-173c-40cb-8e07-494707530bc1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.449712 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d5becb7-c02d-46fd-8ced-be1b3bfcf16f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d5becb7-c02d-46fd-8ced-be1b3bfcf16f" (UID: "0d5becb7-c02d-46fd-8ced-be1b3bfcf16f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.449923 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d49e5ae1-6976-4a45-a007-279a231ec974-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d49e5ae1-6976-4a45-a007-279a231ec974" (UID: "d49e5ae1-6976-4a45-a007-279a231ec974"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.449998 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b8e94f1-e634-4d66-8f8e-939ede76f529-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b8e94f1-e634-4d66-8f8e-939ede76f529" (UID: "9b8e94f1-e634-4d66-8f8e-939ede76f529"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.450568 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e792089-e775-4ff8-85db-e7cfacd8bba6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e792089-e775-4ff8-85db-e7cfacd8bba6" (UID: "7e792089-e775-4ff8-85db-e7cfacd8bba6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.452695 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66e19b12-173c-40cb-8e07-494707530bc1-kube-api-access-hghk9" (OuterVolumeSpecName: "kube-api-access-hghk9") pod "66e19b12-173c-40cb-8e07-494707530bc1" (UID: "66e19b12-173c-40cb-8e07-494707530bc1"). InnerVolumeSpecName "kube-api-access-hghk9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.453413 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e792089-e775-4ff8-85db-e7cfacd8bba6-kube-api-access-k2wfj" (OuterVolumeSpecName: "kube-api-access-k2wfj") pod "7e792089-e775-4ff8-85db-e7cfacd8bba6" (UID: "7e792089-e775-4ff8-85db-e7cfacd8bba6"). InnerVolumeSpecName "kube-api-access-k2wfj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.454047 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d49e5ae1-6976-4a45-a007-279a231ec974-kube-api-access-n2tq8" (OuterVolumeSpecName: "kube-api-access-n2tq8") pod "d49e5ae1-6976-4a45-a007-279a231ec974" (UID: "d49e5ae1-6976-4a45-a007-279a231ec974"). InnerVolumeSpecName "kube-api-access-n2tq8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.454749 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d5becb7-c02d-46fd-8ced-be1b3bfcf16f-kube-api-access-b5hxl" (OuterVolumeSpecName: "kube-api-access-b5hxl") pod "0d5becb7-c02d-46fd-8ced-be1b3bfcf16f" (UID: "0d5becb7-c02d-46fd-8ced-be1b3bfcf16f"). InnerVolumeSpecName "kube-api-access-b5hxl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.551459 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2tq8\" (UniqueName: \"kubernetes.io/projected/d49e5ae1-6976-4a45-a007-279a231ec974-kube-api-access-n2tq8\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.551499 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2wfj\" (UniqueName: \"kubernetes.io/projected/7e792089-e775-4ff8-85db-e7cfacd8bba6-kube-api-access-k2wfj\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.551513 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e792089-e775-4ff8-85db-e7cfacd8bba6-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.551526 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5hxl\" (UniqueName: \"kubernetes.io/projected/0d5becb7-c02d-46fd-8ced-be1b3bfcf16f-kube-api-access-b5hxl\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.551537 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b8e94f1-e634-4d66-8f8e-939ede76f529-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.551548 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d5becb7-c02d-46fd-8ced-be1b3bfcf16f-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.551559 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hghk9\" (UniqueName: \"kubernetes.io/projected/66e19b12-173c-40cb-8e07-494707530bc1-kube-api-access-hghk9\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.551573 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d49e5ae1-6976-4a45-a007-279a231ec974-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.551587 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66e19b12-173c-40cb-8e07-494707530bc1-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.680111 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-s68w5"]
Mar 18 12:31:27 crc kubenswrapper[4843]: W0318 12:31:27.686238 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bb0988d_75e1_4300_a2ac_16becc785231.slice/crio-67bfcf3bdc5780253d41b06da2323b395809c5d598b051df48cfd9b9d5d69605 WatchSource:0}: Error finding container 67bfcf3bdc5780253d41b06da2323b395809c5d598b051df48cfd9b9d5d69605: Status 404 returned error can't find the container with id 67bfcf3bdc5780253d41b06da2323b395809c5d598b051df48cfd9b9d5d69605
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.697598 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.726753 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-686w4" event={"ID":"bf32744d-f781-4703-b0e2-0ca8ca852092","Type":"ContainerDied","Data":"3645d5a8f7b9824510f16829454abd88f5b2965851001696843da588486236ae"}
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.726795 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3645d5a8f7b9824510f16829454abd88f5b2965851001696843da588486236ae"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.726832 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-686w4"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.728674 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d766-account-create-update-p7lfg" event={"ID":"d49e5ae1-6976-4a45-a007-279a231ec974","Type":"ContainerDied","Data":"009c88b05255785c201753a5f42735fd196afbacbe113f94189bcc1a863e5aba"}
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.728703 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="009c88b05255785c201753a5f42735fd196afbacbe113f94189bcc1a863e5aba"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.728754 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d766-account-create-update-p7lfg"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.731041 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-sqt7x" event={"ID":"0d5becb7-c02d-46fd-8ced-be1b3bfcf16f","Type":"ContainerDied","Data":"0d35e7f82111c281a48b36fcff31769dfb50482018a8b333af11b323880c5472"}
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.731071 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d35e7f82111c281a48b36fcff31769dfb50482018a8b333af11b323880c5472"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.731136 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-sqt7x"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.740548 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6007-account-create-update-jc8kx" event={"ID":"9b8e94f1-e634-4d66-8f8e-939ede76f529","Type":"ContainerDied","Data":"48d90f8809c4f7f48b2b557a77b033193a92222bfc6470e35c763073451fc47c"}
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.740614 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48d90f8809c4f7f48b2b557a77b033193a92222bfc6470e35c763073451fc47c"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.740986 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6007-account-create-update-jc8kx"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.743572 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ee3-account-create-update-5hns2" event={"ID":"66e19b12-173c-40cb-8e07-494707530bc1","Type":"ContainerDied","Data":"d11f17e1f6cb95c9a553670b0fa3e0424c7cffa9f29b8d567f00c06c782f8147"}
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.743604 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d11f17e1f6cb95c9a553670b0fa3e0424c7cffa9f29b8d567f00c06c782f8147"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.743688 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5ee3-account-create-update-5hns2"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.761514 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-s68w5" event={"ID":"7bb0988d-75e1-4300-a2ac-16becc785231","Type":"ContainerStarted","Data":"67bfcf3bdc5780253d41b06da2323b395809c5d598b051df48cfd9b9d5d69605"}
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.774276 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-g6j4q" event={"ID":"7e792089-e775-4ff8-85db-e7cfacd8bba6","Type":"ContainerDied","Data":"25783ee59fab546e809491cff720cafc367a8b068cc0c0aedcec32f1003d6a21"}
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.774314 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25783ee59fab546e809491cff720cafc367a8b068cc0c0aedcec32f1003d6a21"
Mar 18 12:31:27 crc kubenswrapper[4843]: I0318 12:31:27.774674 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-g6j4q"
Mar 18 12:31:28 crc kubenswrapper[4843]: I0318 12:31:28.789512 4843 generic.go:334] "Generic (PLEG): container finished" podID="570c96ec-626f-41e2-bf9b-5da8f8d65fa2" containerID="d2fc77594d550cd3709ba26637ecf204c09dd65f5851b07c7f8027d2082d1040" exitCode=0
Mar 18 12:31:28 crc kubenswrapper[4843]: I0318 12:31:28.789605 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9bkwq" event={"ID":"570c96ec-626f-41e2-bf9b-5da8f8d65fa2","Type":"ContainerDied","Data":"d2fc77594d550cd3709ba26637ecf204c09dd65f5851b07c7f8027d2082d1040"}
Mar 18 12:31:28 crc kubenswrapper[4843]: I0318 12:31:28.793437 4843 generic.go:334] "Generic (PLEG): container finished" podID="7bb0988d-75e1-4300-a2ac-16becc785231" containerID="d18ff4054a4fa49ce68ba30bb5d9dd0ac68a79dd0409bf2a6e344e4166715504" exitCode=0
Mar 18 12:31:28 crc kubenswrapper[4843]: I0318 12:31:28.793505 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-s68w5" event={"ID":"7bb0988d-75e1-4300-a2ac-16becc785231","Type":"ContainerDied","Data":"d18ff4054a4fa49ce68ba30bb5d9dd0ac68a79dd0409bf2a6e344e4166715504"}
Mar 18 12:31:29 crc kubenswrapper[4843]: I0318 12:31:29.031742 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 18 12:31:29 crc kubenswrapper[4843]: I0318 12:31:29.595012 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-etc-swift\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0"
Mar 18 12:31:29 crc kubenswrapper[4843]: I0318 12:31:29.601602 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4e07598e-c70f-4beb-a828-b58cb64c38c0-etc-swift\") pod \"swift-storage-0\" (UID: \"4e07598e-c70f-4beb-a828-b58cb64c38c0\") " pod="openstack/swift-storage-0"
Mar 18 12:31:29 crc kubenswrapper[4843]: I0318 12:31:29.861052 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.186963 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9bkwq"
Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.203061 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-s68w5"
Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.209277 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-combined-ca-bundle\") pod \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") "
Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.209363 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqktb\" (UniqueName: \"kubernetes.io/projected/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-kube-api-access-tqktb\") pod \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") "
Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.209411 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-scripts\") pod \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") "
Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.209453 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-swiftconf\") pod \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") "
Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.209483 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-ring-data-devices\") pod \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") "
Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.209578 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-dispersionconf\") pod \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") "
Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.209603 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lvgf\" (UniqueName: \"kubernetes.io/projected/7bb0988d-75e1-4300-a2ac-16becc785231-kube-api-access-5lvgf\") pod \"7bb0988d-75e1-4300-a2ac-16becc785231\" (UID: \"7bb0988d-75e1-4300-a2ac-16becc785231\") "
Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.209624 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-etc-swift\") pod \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\" (UID: \"570c96ec-626f-41e2-bf9b-5da8f8d65fa2\") "
Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.209663 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bb0988d-75e1-4300-a2ac-16becc785231-operator-scripts\") pod \"7bb0988d-75e1-4300-a2ac-16becc785231\" (UID: \"7bb0988d-75e1-4300-a2ac-16becc785231\") "
Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.210979 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/configmap/7bb0988d-75e1-4300-a2ac-16becc785231-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7bb0988d-75e1-4300-a2ac-16becc785231" (UID: "7bb0988d-75e1-4300-a2ac-16becc785231"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.214613 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb0988d-75e1-4300-a2ac-16becc785231-kube-api-access-5lvgf" (OuterVolumeSpecName: "kube-api-access-5lvgf") pod "7bb0988d-75e1-4300-a2ac-16becc785231" (UID: "7bb0988d-75e1-4300-a2ac-16becc785231"). InnerVolumeSpecName "kube-api-access-5lvgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.215089 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-jqd9q"] Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.215495 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "570c96ec-626f-41e2-bf9b-5da8f8d65fa2" (UID: "570c96ec-626f-41e2-bf9b-5da8f8d65fa2"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:30 crc kubenswrapper[4843]: E0318 12:31:30.215905 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d5becb7-c02d-46fd-8ced-be1b3bfcf16f" containerName="mariadb-database-create" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.215933 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d5becb7-c02d-46fd-8ced-be1b3bfcf16f" containerName="mariadb-database-create" Mar 18 12:31:30 crc kubenswrapper[4843]: E0318 12:31:30.215959 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b8e94f1-e634-4d66-8f8e-939ede76f529" containerName="mariadb-account-create-update" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.215968 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b8e94f1-e634-4d66-8f8e-939ede76f529" containerName="mariadb-account-create-update" Mar 18 12:31:30 crc kubenswrapper[4843]: E0318 12:31:30.215988 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d49e5ae1-6976-4a45-a007-279a231ec974" containerName="mariadb-account-create-update" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.215996 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d49e5ae1-6976-4a45-a007-279a231ec974" containerName="mariadb-account-create-update" Mar 18 12:31:30 crc kubenswrapper[4843]: E0318 12:31:30.216027 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb0988d-75e1-4300-a2ac-16becc785231" containerName="mariadb-account-create-update" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.216038 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb0988d-75e1-4300-a2ac-16becc785231" containerName="mariadb-account-create-update" Mar 18 12:31:30 crc kubenswrapper[4843]: E0318 12:31:30.216072 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf32744d-f781-4703-b0e2-0ca8ca852092" containerName="mariadb-database-create" Mar 18 12:31:30 crc kubenswrapper[4843]: 
I0318 12:31:30.216081 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf32744d-f781-4703-b0e2-0ca8ca852092" containerName="mariadb-database-create" Mar 18 12:31:30 crc kubenswrapper[4843]: E0318 12:31:30.216100 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66e19b12-173c-40cb-8e07-494707530bc1" containerName="mariadb-account-create-update" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.216108 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e19b12-173c-40cb-8e07-494707530bc1" containerName="mariadb-account-create-update" Mar 18 12:31:30 crc kubenswrapper[4843]: E0318 12:31:30.216127 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="570c96ec-626f-41e2-bf9b-5da8f8d65fa2" containerName="swift-ring-rebalance" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.216140 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="570c96ec-626f-41e2-bf9b-5da8f8d65fa2" containerName="swift-ring-rebalance" Mar 18 12:31:30 crc kubenswrapper[4843]: E0318 12:31:30.216162 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e792089-e775-4ff8-85db-e7cfacd8bba6" containerName="mariadb-database-create" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.216170 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e792089-e775-4ff8-85db-e7cfacd8bba6" containerName="mariadb-database-create" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.216270 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "570c96ec-626f-41e2-bf9b-5da8f8d65fa2" (UID: "570c96ec-626f-41e2-bf9b-5da8f8d65fa2"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.216840 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="570c96ec-626f-41e2-bf9b-5da8f8d65fa2" containerName="swift-ring-rebalance" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.216955 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b8e94f1-e634-4d66-8f8e-939ede76f529" containerName="mariadb-account-create-update" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.216976 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bb0988d-75e1-4300-a2ac-16becc785231" containerName="mariadb-account-create-update" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.217025 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d49e5ae1-6976-4a45-a007-279a231ec974" containerName="mariadb-account-create-update" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.217531 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e792089-e775-4ff8-85db-e7cfacd8bba6" containerName="mariadb-database-create" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.217543 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d5becb7-c02d-46fd-8ced-be1b3bfcf16f" containerName="mariadb-database-create" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.217584 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf32744d-f781-4703-b0e2-0ca8ca852092" containerName="mariadb-database-create" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.217596 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="66e19b12-173c-40cb-8e07-494707530bc1" containerName="mariadb-account-create-update" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.218068 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-kube-api-access-tqktb" 
(OuterVolumeSpecName: "kube-api-access-tqktb") pod "570c96ec-626f-41e2-bf9b-5da8f8d65fa2" (UID: "570c96ec-626f-41e2-bf9b-5da8f8d65fa2"). InnerVolumeSpecName "kube-api-access-tqktb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.218278 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "570c96ec-626f-41e2-bf9b-5da8f8d65fa2" (UID: "570c96ec-626f-41e2-bf9b-5da8f8d65fa2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.219536 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jqd9q" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.227055 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fshgw" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.227167 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.228556 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jqd9q"] Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.230568 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-scripts" (OuterVolumeSpecName: "scripts") pod "570c96ec-626f-41e2-bf9b-5da8f8d65fa2" (UID: "570c96ec-626f-41e2-bf9b-5da8f8d65fa2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.243762 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "570c96ec-626f-41e2-bf9b-5da8f8d65fa2" (UID: "570c96ec-626f-41e2-bf9b-5da8f8d65fa2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.251874 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "570c96ec-626f-41e2-bf9b-5da8f8d65fa2" (UID: "570c96ec-626f-41e2-bf9b-5da8f8d65fa2"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.310908 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2a59eee1-b0e8-402b-b89e-d7e461672a21-db-sync-config-data\") pod \"glance-db-sync-jqd9q\" (UID: \"2a59eee1-b0e8-402b-b89e-d7e461672a21\") " pod="openstack/glance-db-sync-jqd9q" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.310960 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a59eee1-b0e8-402b-b89e-d7e461672a21-config-data\") pod \"glance-db-sync-jqd9q\" (UID: \"2a59eee1-b0e8-402b-b89e-d7e461672a21\") " pod="openstack/glance-db-sync-jqd9q" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.311040 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a59eee1-b0e8-402b-b89e-d7e461672a21-combined-ca-bundle\") pod 
\"glance-db-sync-jqd9q\" (UID: \"2a59eee1-b0e8-402b-b89e-d7e461672a21\") " pod="openstack/glance-db-sync-jqd9q" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.311069 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj96p\" (UniqueName: \"kubernetes.io/projected/2a59eee1-b0e8-402b-b89e-d7e461672a21-kube-api-access-bj96p\") pod \"glance-db-sync-jqd9q\" (UID: \"2a59eee1-b0e8-402b-b89e-d7e461672a21\") " pod="openstack/glance-db-sync-jqd9q" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.311169 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.311218 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqktb\" (UniqueName: \"kubernetes.io/projected/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-kube-api-access-tqktb\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.311238 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.311249 4843 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.311258 4843 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.311268 4843 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.311280 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lvgf\" (UniqueName: \"kubernetes.io/projected/7bb0988d-75e1-4300-a2ac-16becc785231-kube-api-access-5lvgf\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.311288 4843 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/570c96ec-626f-41e2-bf9b-5da8f8d65fa2-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.311299 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bb0988d-75e1-4300-a2ac-16becc785231-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.371595 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5x5rj" podUID="a9cc1cd2-018b-40fc-9434-97d649bdd2a8" containerName="ovn-controller" probeResult="failure" output=< Mar 18 12:31:30 crc kubenswrapper[4843]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 12:31:30 crc kubenswrapper[4843]: > Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.412955 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2a59eee1-b0e8-402b-b89e-d7e461672a21-db-sync-config-data\") pod \"glance-db-sync-jqd9q\" (UID: \"2a59eee1-b0e8-402b-b89e-d7e461672a21\") " pod="openstack/glance-db-sync-jqd9q" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.413006 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2a59eee1-b0e8-402b-b89e-d7e461672a21-config-data\") pod \"glance-db-sync-jqd9q\" (UID: \"2a59eee1-b0e8-402b-b89e-d7e461672a21\") " pod="openstack/glance-db-sync-jqd9q" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.413277 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a59eee1-b0e8-402b-b89e-d7e461672a21-combined-ca-bundle\") pod \"glance-db-sync-jqd9q\" (UID: \"2a59eee1-b0e8-402b-b89e-d7e461672a21\") " pod="openstack/glance-db-sync-jqd9q" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.413317 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj96p\" (UniqueName: \"kubernetes.io/projected/2a59eee1-b0e8-402b-b89e-d7e461672a21-kube-api-access-bj96p\") pod \"glance-db-sync-jqd9q\" (UID: \"2a59eee1-b0e8-402b-b89e-d7e461672a21\") " pod="openstack/glance-db-sync-jqd9q" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.417092 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2a59eee1-b0e8-402b-b89e-d7e461672a21-db-sync-config-data\") pod \"glance-db-sync-jqd9q\" (UID: \"2a59eee1-b0e8-402b-b89e-d7e461672a21\") " pod="openstack/glance-db-sync-jqd9q" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.417642 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a59eee1-b0e8-402b-b89e-d7e461672a21-config-data\") pod \"glance-db-sync-jqd9q\" (UID: \"2a59eee1-b0e8-402b-b89e-d7e461672a21\") " pod="openstack/glance-db-sync-jqd9q" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.418845 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a59eee1-b0e8-402b-b89e-d7e461672a21-combined-ca-bundle\") pod \"glance-db-sync-jqd9q\" (UID: 
\"2a59eee1-b0e8-402b-b89e-d7e461672a21\") " pod="openstack/glance-db-sync-jqd9q" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.429350 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj96p\" (UniqueName: \"kubernetes.io/projected/2a59eee1-b0e8-402b-b89e-d7e461672a21-kube-api-access-bj96p\") pod \"glance-db-sync-jqd9q\" (UID: \"2a59eee1-b0e8-402b-b89e-d7e461672a21\") " pod="openstack/glance-db-sync-jqd9q" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.480587 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 18 12:31:30 crc kubenswrapper[4843]: W0318 12:31:30.481679 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e07598e_c70f_4beb_a828_b58cb64c38c0.slice/crio-8eb46bc46516c88c1f4aa58fdb61ab70b87469deab7675bddd29ab7c75641020 WatchSource:0}: Error finding container 8eb46bc46516c88c1f4aa58fdb61ab70b87469deab7675bddd29ab7c75641020: Status 404 returned error can't find the container with id 8eb46bc46516c88c1f4aa58fdb61ab70b87469deab7675bddd29ab7c75641020 Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.546910 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jqd9q" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.811053 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4e07598e-c70f-4beb-a828-b58cb64c38c0","Type":"ContainerStarted","Data":"8eb46bc46516c88c1f4aa58fdb61ab70b87469deab7675bddd29ab7c75641020"} Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.812412 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-s68w5" event={"ID":"7bb0988d-75e1-4300-a2ac-16becc785231","Type":"ContainerDied","Data":"67bfcf3bdc5780253d41b06da2323b395809c5d598b051df48cfd9b9d5d69605"} Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.812439 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67bfcf3bdc5780253d41b06da2323b395809c5d598b051df48cfd9b9d5d69605" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.812487 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-s68w5" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.819834 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9bkwq" event={"ID":"570c96ec-626f-41e2-bf9b-5da8f8d65fa2","Type":"ContainerDied","Data":"ad0cca07d752acfeb428e7f833fd071f40f72bfa7a32f4e682b703f6b0ab5e32"} Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.819896 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad0cca07d752acfeb428e7f833fd071f40f72bfa7a32f4e682b703f6b0ab5e32" Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.819922 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9bkwq" Mar 18 12:31:30 crc kubenswrapper[4843]: W0318 12:31:30.912341 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a59eee1_b0e8_402b_b89e_d7e461672a21.slice/crio-ad6eacb3e6d411e4382dd853b530e9b7e4a11adb74179ec5730a5f7fcb27593b WatchSource:0}: Error finding container ad6eacb3e6d411e4382dd853b530e9b7e4a11adb74179ec5730a5f7fcb27593b: Status 404 returned error can't find the container with id ad6eacb3e6d411e4382dd853b530e9b7e4a11adb74179ec5730a5f7fcb27593b Mar 18 12:31:30 crc kubenswrapper[4843]: I0318 12:31:30.924970 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jqd9q"] Mar 18 12:31:31 crc kubenswrapper[4843]: I0318 12:31:31.849267 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4e07598e-c70f-4beb-a828-b58cb64c38c0","Type":"ContainerStarted","Data":"8d193b3eb1934e4bf5362329121d736a8168aec502159e2ea15f2076bfc2ed9c"} Mar 18 12:31:31 crc kubenswrapper[4843]: I0318 12:31:31.850947 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jqd9q" event={"ID":"2a59eee1-b0e8-402b-b89e-d7e461672a21","Type":"ContainerStarted","Data":"ad6eacb3e6d411e4382dd853b530e9b7e4a11adb74179ec5730a5f7fcb27593b"} Mar 18 12:31:32 crc kubenswrapper[4843]: I0318 12:31:32.860541 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4e07598e-c70f-4beb-a828-b58cb64c38c0","Type":"ContainerStarted","Data":"f6844da624ffbb2813e4f45875596cbc46fdcb34822f248876bfe7142694af46"} Mar 18 12:31:32 crc kubenswrapper[4843]: I0318 12:31:32.860850 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4e07598e-c70f-4beb-a828-b58cb64c38c0","Type":"ContainerStarted","Data":"9851d2c9e20ebe9ab35a5538c957c43055ee116481c5647f0b27e7d0fd70d73c"} Mar 18 12:31:32 crc 
kubenswrapper[4843]: I0318 12:31:32.860860 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4e07598e-c70f-4beb-a828-b58cb64c38c0","Type":"ContainerStarted","Data":"b068d6f239b16b878a122b67d67fcf89c363e43595a7e8d54a2beb78332c7a93"} Mar 18 12:31:33 crc kubenswrapper[4843]: I0318 12:31:33.243759 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-s68w5"] Mar 18 12:31:33 crc kubenswrapper[4843]: I0318 12:31:33.249224 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-s68w5"] Mar 18 12:31:33 crc kubenswrapper[4843]: I0318 12:31:33.870101 4843 generic.go:334] "Generic (PLEG): container finished" podID="257240a5-cc42-4354-9079-66e6de070b34" containerID="9e69e813190bc3b6041d2a54beae48ef9ecf7969785d0b66c5880f22bcd65291" exitCode=0 Mar 18 12:31:33 crc kubenswrapper[4843]: I0318 12:31:33.870175 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"257240a5-cc42-4354-9079-66e6de070b34","Type":"ContainerDied","Data":"9e69e813190bc3b6041d2a54beae48ef9ecf7969785d0b66c5880f22bcd65291"} Mar 18 12:31:33 crc kubenswrapper[4843]: I0318 12:31:33.871871 4843 generic.go:334] "Generic (PLEG): container finished" podID="1c41f082-cf59-42b4-8314-64aace288dd1" containerID="b2949077702fceb64cea3279ee20a1822ebb720ae15697187e10f706bad4d9b4" exitCode=0 Mar 18 12:31:33 crc kubenswrapper[4843]: I0318 12:31:33.871895 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1c41f082-cf59-42b4-8314-64aace288dd1","Type":"ContainerDied","Data":"b2949077702fceb64cea3279ee20a1822ebb720ae15697187e10f706bad4d9b4"} Mar 18 12:31:34 crc kubenswrapper[4843]: I0318 12:31:34.885181 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"257240a5-cc42-4354-9079-66e6de070b34","Type":"ContainerStarted","Data":"560a2e30ff8c9ca0f672bb737f8b8e51458216cac9b16f5096d1bb050b58d5dc"} Mar 18 12:31:34 crc kubenswrapper[4843]: I0318 12:31:34.885733 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:31:34 crc kubenswrapper[4843]: I0318 12:31:34.888973 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1c41f082-cf59-42b4-8314-64aace288dd1","Type":"ContainerStarted","Data":"482d98003089d6422095943abcc3acd42ac94a7de53b7495f1d7c5c9663fb80f"} Mar 18 12:31:34 crc kubenswrapper[4843]: I0318 12:31:34.892795 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 12:31:34 crc kubenswrapper[4843]: I0318 12:31:34.895866 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4e07598e-c70f-4beb-a828-b58cb64c38c0","Type":"ContainerStarted","Data":"92dfacdead495a001882e7db3527ca7364187da78906b102d05a82bf81d7d699"} Mar 18 12:31:34 crc kubenswrapper[4843]: I0318 12:31:34.895894 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4e07598e-c70f-4beb-a828-b58cb64c38c0","Type":"ContainerStarted","Data":"6b9f5b118daace4cda5f853fcf6984d715b1cc457e966be196d75e4ad2ff1077"} Mar 18 12:31:34 crc kubenswrapper[4843]: I0318 12:31:34.895905 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4e07598e-c70f-4beb-a828-b58cb64c38c0","Type":"ContainerStarted","Data":"5f881dcf662bbfd46e85dcdb1ccca64ca530dcd31d972d20e9decb5330798c66"} Mar 18 12:31:34 crc kubenswrapper[4843]: I0318 12:31:34.914366 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.567681744 podStartE2EDuration="59.91434432s" podCreationTimestamp="2026-03-18 12:30:35 +0000 UTC" 
firstStartedPulling="2026-03-18 12:30:49.530005575 +0000 UTC m=+1283.245831099" lastFinishedPulling="2026-03-18 12:30:58.876668151 +0000 UTC m=+1292.592493675" observedRunningTime="2026-03-18 12:31:34.906052104 +0000 UTC m=+1328.621877628" watchObservedRunningTime="2026-03-18 12:31:34.91434432 +0000 UTC m=+1328.630169844"
Mar 18 12:31:34 crc kubenswrapper[4843]: I0318 12:31:34.943454 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=50.102039091 podStartE2EDuration="59.943402416s" podCreationTimestamp="2026-03-18 12:30:35 +0000 UTC" firstStartedPulling="2026-03-18 12:30:49.311966144 +0000 UTC m=+1283.027791668" lastFinishedPulling="2026-03-18 12:30:59.153329469 +0000 UTC m=+1292.869154993" observedRunningTime="2026-03-18 12:31:34.925311321 +0000 UTC m=+1328.641136845" watchObservedRunningTime="2026-03-18 12:31:34.943402416 +0000 UTC m=+1328.659227930"
Mar 18 12:31:34 crc kubenswrapper[4843]: I0318 12:31:34.996694 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb0988d-75e1-4300-a2ac-16becc785231" path="/var/lib/kubelet/pods/7bb0988d-75e1-4300-a2ac-16becc785231/volumes"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.383739 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-s65qf"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.389255 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5x5rj" podUID="a9cc1cd2-018b-40fc-9434-97d649bdd2a8" containerName="ovn-controller" probeResult="failure" output=<
Mar 18 12:31:35 crc kubenswrapper[4843]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 18 12:31:35 crc kubenswrapper[4843]: >
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.397110 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-s65qf"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.625699 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5x5rj-config-vc9fl"]
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.626967 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.629692 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.645430 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5x5rj-config-vc9fl"]
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.712561 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrlzs\" (UniqueName: \"kubernetes.io/projected/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-kube-api-access-rrlzs\") pod \"ovn-controller-5x5rj-config-vc9fl\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") " pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.712616 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-var-run\") pod \"ovn-controller-5x5rj-config-vc9fl\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") " pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.712714 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-var-run-ovn\") pod \"ovn-controller-5x5rj-config-vc9fl\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") " pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.712748 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-var-log-ovn\") pod \"ovn-controller-5x5rj-config-vc9fl\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") " pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.712791 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-additional-scripts\") pod \"ovn-controller-5x5rj-config-vc9fl\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") " pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.712826 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-scripts\") pod \"ovn-controller-5x5rj-config-vc9fl\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") " pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.814075 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-scripts\") pod \"ovn-controller-5x5rj-config-vc9fl\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") " pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.814144 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrlzs\" (UniqueName: \"kubernetes.io/projected/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-kube-api-access-rrlzs\") pod \"ovn-controller-5x5rj-config-vc9fl\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") " pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.814175 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-var-run\") pod \"ovn-controller-5x5rj-config-vc9fl\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") " pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.814272 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-var-run-ovn\") pod \"ovn-controller-5x5rj-config-vc9fl\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") " pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.814315 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-var-log-ovn\") pod \"ovn-controller-5x5rj-config-vc9fl\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") " pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.814374 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-additional-scripts\") pod \"ovn-controller-5x5rj-config-vc9fl\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") " pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.815250 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-additional-scripts\") pod \"ovn-controller-5x5rj-config-vc9fl\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") " pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.815556 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-var-run\") pod \"ovn-controller-5x5rj-config-vc9fl\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") " pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.815740 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-var-log-ovn\") pod \"ovn-controller-5x5rj-config-vc9fl\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") " pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.815800 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-var-run-ovn\") pod \"ovn-controller-5x5rj-config-vc9fl\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") " pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.817208 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-scripts\") pod \"ovn-controller-5x5rj-config-vc9fl\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") " pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.842257 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrlzs\" (UniqueName: \"kubernetes.io/projected/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-kube-api-access-rrlzs\") pod \"ovn-controller-5x5rj-config-vc9fl\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") " pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.908976 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4e07598e-c70f-4beb-a828-b58cb64c38c0","Type":"ContainerStarted","Data":"079348a1aa71e91a29edd564963e2101f9c858040b03fb2120f1286d2ebe79e2"}
Mar 18 12:31:35 crc kubenswrapper[4843]: I0318 12:31:35.948432 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:36 crc kubenswrapper[4843]: I0318 12:31:36.602960 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5x5rj-config-vc9fl"]
Mar 18 12:31:36 crc kubenswrapper[4843]: I0318 12:31:36.939763 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4e07598e-c70f-4beb-a828-b58cb64c38c0","Type":"ContainerStarted","Data":"48856572f09837fb4bfb4b0754b3b75c2e5ab1fab2620ab5efb1c39a1a41fa82"}
Mar 18 12:31:36 crc kubenswrapper[4843]: I0318 12:31:36.940012 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4e07598e-c70f-4beb-a828-b58cb64c38c0","Type":"ContainerStarted","Data":"04c78be35bfc259b8ca0edae1cc09e225e821a13cdd862a87f29c45f2e797378"}
Mar 18 12:31:38 crc kubenswrapper[4843]: I0318 12:31:38.261271 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-hgdnl"]
Mar 18 12:31:38 crc kubenswrapper[4843]: I0318 12:31:38.262545 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hgdnl"
Mar 18 12:31:38 crc kubenswrapper[4843]: I0318 12:31:38.265714 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 18 12:31:38 crc kubenswrapper[4843]: I0318 12:31:38.270536 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hgdnl"]
Mar 18 12:31:38 crc kubenswrapper[4843]: I0318 12:31:38.385632 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsgvt\" (UniqueName: \"kubernetes.io/projected/20b814bf-1f2a-4604-922c-74073fb910d6-kube-api-access-vsgvt\") pod \"root-account-create-update-hgdnl\" (UID: \"20b814bf-1f2a-4604-922c-74073fb910d6\") " pod="openstack/root-account-create-update-hgdnl"
Mar 18 12:31:38 crc kubenswrapper[4843]: I0318 12:31:38.385760 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20b814bf-1f2a-4604-922c-74073fb910d6-operator-scripts\") pod \"root-account-create-update-hgdnl\" (UID: \"20b814bf-1f2a-4604-922c-74073fb910d6\") " pod="openstack/root-account-create-update-hgdnl"
Mar 18 12:31:38 crc kubenswrapper[4843]: I0318 12:31:38.487125 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsgvt\" (UniqueName: \"kubernetes.io/projected/20b814bf-1f2a-4604-922c-74073fb910d6-kube-api-access-vsgvt\") pod \"root-account-create-update-hgdnl\" (UID: \"20b814bf-1f2a-4604-922c-74073fb910d6\") " pod="openstack/root-account-create-update-hgdnl"
Mar 18 12:31:38 crc kubenswrapper[4843]: I0318 12:31:38.487195 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20b814bf-1f2a-4604-922c-74073fb910d6-operator-scripts\") pod \"root-account-create-update-hgdnl\" (UID: \"20b814bf-1f2a-4604-922c-74073fb910d6\") " pod="openstack/root-account-create-update-hgdnl"
Mar 18 12:31:38 crc kubenswrapper[4843]: I0318 12:31:38.488121 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20b814bf-1f2a-4604-922c-74073fb910d6-operator-scripts\") pod \"root-account-create-update-hgdnl\" (UID: \"20b814bf-1f2a-4604-922c-74073fb910d6\") " pod="openstack/root-account-create-update-hgdnl"
Mar 18 12:31:38 crc kubenswrapper[4843]: I0318 12:31:38.508536 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsgvt\" (UniqueName: \"kubernetes.io/projected/20b814bf-1f2a-4604-922c-74073fb910d6-kube-api-access-vsgvt\") pod \"root-account-create-update-hgdnl\" (UID: \"20b814bf-1f2a-4604-922c-74073fb910d6\") " pod="openstack/root-account-create-update-hgdnl"
Mar 18 12:31:38 crc kubenswrapper[4843]: I0318 12:31:38.590806 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hgdnl"
Mar 18 12:31:40 crc kubenswrapper[4843]: I0318 12:31:40.373160 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5x5rj" podUID="a9cc1cd2-018b-40fc-9434-97d649bdd2a8" containerName="ovn-controller" probeResult="failure" output=<
Mar 18 12:31:40 crc kubenswrapper[4843]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 18 12:31:40 crc kubenswrapper[4843]: >
Mar 18 12:31:43 crc kubenswrapper[4843]: I0318 12:31:43.738709 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hgdnl"]
Mar 18 12:31:43 crc kubenswrapper[4843]: I0318 12:31:43.997683 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5x5rj-config-vc9fl" event={"ID":"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21","Type":"ContainerStarted","Data":"c8f84dc934e1494b8be8261626bed21da2f91fd351e535474b8bae3a1b60ade6"}
Mar 18 12:31:43 crc kubenswrapper[4843]: I0318 12:31:43.997981 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5x5rj-config-vc9fl" event={"ID":"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21","Type":"ContainerStarted","Data":"b4aa69d4ffbf4a1779f63ad356c5a1c16aa248e77da5c946ca646b725e1170e5"}
Mar 18 12:31:43 crc kubenswrapper[4843]: I0318 12:31:43.998960 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hgdnl" event={"ID":"20b814bf-1f2a-4604-922c-74073fb910d6","Type":"ContainerStarted","Data":"1eac199cd750bd34a1bcefeec8d11b16af7ce3067ec1b310539de8674c1deafd"}
Mar 18 12:31:43 crc kubenswrapper[4843]: I0318 12:31:43.998995 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hgdnl" event={"ID":"20b814bf-1f2a-4604-922c-74073fb910d6","Type":"ContainerStarted","Data":"afd10779e29c101243ac9f13ccdc772044e7dc9d9f08449956708afbbad55db1"}
Mar 18 12:31:44 crc kubenswrapper[4843]: I0318 12:31:44.012279 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4e07598e-c70f-4beb-a828-b58cb64c38c0","Type":"ContainerStarted","Data":"23bdfdcf469d35442c1ccc55f1b496fd1fad1918395c4c2d32328ede2c73a745"}
Mar 18 12:31:44 crc kubenswrapper[4843]: I0318 12:31:44.012532 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4e07598e-c70f-4beb-a828-b58cb64c38c0","Type":"ContainerStarted","Data":"f5a1807e9ac95e5b3e25f653d66ed5ff86d8170e60b6805a88750a3ad07f44d5"}
Mar 18 12:31:44 crc kubenswrapper[4843]: I0318 12:31:44.012550 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4e07598e-c70f-4beb-a828-b58cb64c38c0","Type":"ContainerStarted","Data":"0c5e346b686556a6e5a61a37d499a0435c77ca67ac3a8ba7e8694efd00652c45"}
Mar 18 12:31:44 crc kubenswrapper[4843]: I0318 12:31:44.016753 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-5x5rj-config-vc9fl" podStartSLOduration=9.01673303 podStartE2EDuration="9.01673303s" podCreationTimestamp="2026-03-18 12:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:44.013976942 +0000 UTC m=+1337.729802486" watchObservedRunningTime="2026-03-18 12:31:44.01673303 +0000 UTC m=+1337.732558554"
Mar 18 12:31:44 crc kubenswrapper[4843]: I0318 12:31:44.040037 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-hgdnl" podStartSLOduration=6.040019083 podStartE2EDuration="6.040019083s" podCreationTimestamp="2026-03-18 12:31:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:44.03150162 +0000 UTC m=+1337.747327144" watchObservedRunningTime="2026-03-18 12:31:44.040019083 +0000 UTC m=+1337.755844597"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.039341 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4e07598e-c70f-4beb-a828-b58cb64c38c0","Type":"ContainerStarted","Data":"ba3339ced9a5ae6db87685f5c8978e051a3ee19726341ababaa0369eebdbba7b"}
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.039813 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4e07598e-c70f-4beb-a828-b58cb64c38c0","Type":"ContainerStarted","Data":"5630f4dd7c7d6e1a91ede9004dde44296a9354e1e255c84425d6529a7fec3e46"}
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.044398 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jqd9q" event={"ID":"2a59eee1-b0e8-402b-b89e-d7e461672a21","Type":"ContainerStarted","Data":"dea13a29256cecf7eddac81e6d6a3347abb34dbb0d61995b271062f9eaa38368"}
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.050526 4843 generic.go:334] "Generic (PLEG): container finished" podID="f64f6299-53d2-4dcc-85e9-7b6a3cec1d21" containerID="c8f84dc934e1494b8be8261626bed21da2f91fd351e535474b8bae3a1b60ade6" exitCode=0
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.050691 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5x5rj-config-vc9fl" event={"ID":"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21","Type":"ContainerDied","Data":"c8f84dc934e1494b8be8261626bed21da2f91fd351e535474b8bae3a1b60ade6"}
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.060815 4843 generic.go:334] "Generic (PLEG): container finished" podID="20b814bf-1f2a-4604-922c-74073fb910d6" containerID="1eac199cd750bd34a1bcefeec8d11b16af7ce3067ec1b310539de8674c1deafd" exitCode=0
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.060876 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hgdnl" event={"ID":"20b814bf-1f2a-4604-922c-74073fb910d6","Type":"ContainerDied","Data":"1eac199cd750bd34a1bcefeec8d11b16af7ce3067ec1b310539de8674c1deafd"}
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.089677 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=27.484956048 podStartE2EDuration="33.089644412s" podCreationTimestamp="2026-03-18 12:31:12 +0000 UTC" firstStartedPulling="2026-03-18 12:31:30.483765704 +0000 UTC m=+1324.199591228" lastFinishedPulling="2026-03-18 12:31:36.088454068 +0000 UTC m=+1329.804279592" observedRunningTime="2026-03-18 12:31:45.083733424 +0000 UTC m=+1338.799558948" watchObservedRunningTime="2026-03-18 12:31:45.089644412 +0000 UTC m=+1338.805469936"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.127065 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-jqd9q" podStartSLOduration=2.645248932 podStartE2EDuration="15.127042955s" podCreationTimestamp="2026-03-18 12:31:30 +0000 UTC" firstStartedPulling="2026-03-18 12:31:30.914604256 +0000 UTC m=+1324.630429790" lastFinishedPulling="2026-03-18 12:31:43.396398279 +0000 UTC m=+1337.112223813" observedRunningTime="2026-03-18 12:31:45.125912713 +0000 UTC m=+1338.841738237" watchObservedRunningTime="2026-03-18 12:31:45.127042955 +0000 UTC m=+1338.842868479"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.387814 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"]
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.413047 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"]
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.413193 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.416995 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.420220 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-5x5rj"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.524637 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfnxb\" (UniqueName: \"kubernetes.io/projected/75459f7a-55b8-42da-9072-357f8e2f0065-kube-api-access-qfnxb\") pod \"dnsmasq-dns-6d5b6d6b67-qkkd9\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.524711 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-qkkd9\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.524937 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-config\") pod \"dnsmasq-dns-6d5b6d6b67-qkkd9\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.525115 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-qkkd9\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.525417 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-qkkd9\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.525565 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-qkkd9\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.627471 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-qkkd9\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.627553 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfnxb\" (UniqueName: \"kubernetes.io/projected/75459f7a-55b8-42da-9072-357f8e2f0065-kube-api-access-qfnxb\") pod \"dnsmasq-dns-6d5b6d6b67-qkkd9\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.627584 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-qkkd9\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.627625 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-config\") pod \"dnsmasq-dns-6d5b6d6b67-qkkd9\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.627703 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-qkkd9\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.627788 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-qkkd9\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.628343 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-qkkd9\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.628724 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-qkkd9\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.628980 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-config\") pod \"dnsmasq-dns-6d5b6d6b67-qkkd9\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.629500 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-qkkd9\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.630035 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-qkkd9\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.654924 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfnxb\" (UniqueName: \"kubernetes.io/projected/75459f7a-55b8-42da-9072-357f8e2f0065-kube-api-access-qfnxb\") pod \"dnsmasq-dns-6d5b6d6b67-qkkd9\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:45 crc kubenswrapper[4843]: I0318 12:31:45.739118 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"
Mar 18 12:31:46 crc kubenswrapper[4843]: I0318 12:31:46.200718 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"]
Mar 18 12:31:46 crc kubenswrapper[4843]: I0318 12:31:46.699287 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:31:46 crc kubenswrapper[4843]: I0318 12:31:46.967812 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.025121 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hgdnl"
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.031226 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.101823 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20b814bf-1f2a-4604-922c-74073fb910d6-operator-scripts\") pod \"20b814bf-1f2a-4604-922c-74073fb910d6\" (UID: \"20b814bf-1f2a-4604-922c-74073fb910d6\") "
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.101886 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsgvt\" (UniqueName: \"kubernetes.io/projected/20b814bf-1f2a-4604-922c-74073fb910d6-kube-api-access-vsgvt\") pod \"20b814bf-1f2a-4604-922c-74073fb910d6\" (UID: \"20b814bf-1f2a-4604-922c-74073fb910d6\") "
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.102104 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5x5rj-config-vc9fl" event={"ID":"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21","Type":"ContainerDied","Data":"b4aa69d4ffbf4a1779f63ad356c5a1c16aa248e77da5c946ca646b725e1170e5"}
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.102152 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4aa69d4ffbf4a1779f63ad356c5a1c16aa248e77da5c946ca646b725e1170e5"
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.102230 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5x5rj-config-vc9fl"
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.103415 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20b814bf-1f2a-4604-922c-74073fb910d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "20b814bf-1f2a-4604-922c-74073fb910d6" (UID: "20b814bf-1f2a-4604-922c-74073fb910d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.112313 4843 generic.go:334] "Generic (PLEG): container finished" podID="75459f7a-55b8-42da-9072-357f8e2f0065" containerID="a2e31fb0ca2c937825765d3a994057c0bb25143c5ab0a5250096f5d6d5939d84" exitCode=0
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.112397 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9" event={"ID":"75459f7a-55b8-42da-9072-357f8e2f0065","Type":"ContainerDied","Data":"a2e31fb0ca2c937825765d3a994057c0bb25143c5ab0a5250096f5d6d5939d84"}
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.112419 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9" event={"ID":"75459f7a-55b8-42da-9072-357f8e2f0065","Type":"ContainerStarted","Data":"83b63442094e3880a0863f15180d7301ce1aed9a8dbb19e57fdafd1958c8801c"}
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.137891 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b814bf-1f2a-4604-922c-74073fb910d6-kube-api-access-vsgvt" (OuterVolumeSpecName: "kube-api-access-vsgvt") pod "20b814bf-1f2a-4604-922c-74073fb910d6" (UID: "20b814bf-1f2a-4604-922c-74073fb910d6"). InnerVolumeSpecName "kube-api-access-vsgvt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.163184 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hgdnl" event={"ID":"20b814bf-1f2a-4604-922c-74073fb910d6","Type":"ContainerDied","Data":"afd10779e29c101243ac9f13ccdc772044e7dc9d9f08449956708afbbad55db1"}
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.163225 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afd10779e29c101243ac9f13ccdc772044e7dc9d9f08449956708afbbad55db1"
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.163287 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hgdnl"
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.203475 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-var-log-ovn\") pod \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") "
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.203573 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrlzs\" (UniqueName: \"kubernetes.io/projected/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-kube-api-access-rrlzs\") pod \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") "
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.203614 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-additional-scripts\") pod \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") "
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.203637 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-var-run-ovn\") pod \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") "
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.203722 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-var-run\") pod \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") "
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.203751 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-scripts\") pod \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\" (UID: \"f64f6299-53d2-4dcc-85e9-7b6a3cec1d21\") "
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.204128 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20b814bf-1f2a-4604-922c-74073fb910d6-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.204142 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsgvt\" (UniqueName: \"kubernetes.io/projected/20b814bf-1f2a-4604-922c-74073fb910d6-kube-api-access-vsgvt\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.204377 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "f64f6299-53d2-4dcc-85e9-7b6a3cec1d21" (UID: "f64f6299-53d2-4dcc-85e9-7b6a3cec1d21"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.206868 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-var-run" (OuterVolumeSpecName: "var-run") pod "f64f6299-53d2-4dcc-85e9-7b6a3cec1d21" (UID: "f64f6299-53d2-4dcc-85e9-7b6a3cec1d21"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.206904 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "f64f6299-53d2-4dcc-85e9-7b6a3cec1d21" (UID: "f64f6299-53d2-4dcc-85e9-7b6a3cec1d21"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.207165 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-kube-api-access-rrlzs" (OuterVolumeSpecName: "kube-api-access-rrlzs") pod "f64f6299-53d2-4dcc-85e9-7b6a3cec1d21" (UID: "f64f6299-53d2-4dcc-85e9-7b6a3cec1d21"). InnerVolumeSpecName "kube-api-access-rrlzs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.210519 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-scripts" (OuterVolumeSpecName: "scripts") pod "f64f6299-53d2-4dcc-85e9-7b6a3cec1d21" (UID: "f64f6299-53d2-4dcc-85e9-7b6a3cec1d21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.212186 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "f64f6299-53d2-4dcc-85e9-7b6a3cec1d21" (UID: "f64f6299-53d2-4dcc-85e9-7b6a3cec1d21"). InnerVolumeSpecName "additional-scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.305392 4843 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.305572 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.305640 4843 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.305719 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrlzs\" (UniqueName: \"kubernetes.io/projected/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-kube-api-access-rrlzs\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.305800 4843 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:47 crc kubenswrapper[4843]: I0318 12:31:47.305850 4843 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:48 crc kubenswrapper[4843]: I0318 12:31:48.173993 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9" event={"ID":"75459f7a-55b8-42da-9072-357f8e2f0065","Type":"ContainerStarted","Data":"1805d281818e0d92552e777f452a5dba15d3386963556aa5dc9824a3a2fd1dba"} Mar 18 12:31:48 crc 
kubenswrapper[4843]: I0318 12:31:48.174311 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9" Mar 18 12:31:48 crc kubenswrapper[4843]: I0318 12:31:48.198038 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9" podStartSLOduration=3.198019617 podStartE2EDuration="3.198019617s" podCreationTimestamp="2026-03-18 12:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:48.195457484 +0000 UTC m=+1341.911283028" watchObservedRunningTime="2026-03-18 12:31:48.198019617 +0000 UTC m=+1341.913845151" Mar 18 12:31:48 crc kubenswrapper[4843]: I0318 12:31:48.234709 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-5x5rj-config-vc9fl"] Mar 18 12:31:48 crc kubenswrapper[4843]: I0318 12:31:48.242941 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-5x5rj-config-vc9fl"] Mar 18 12:31:48 crc kubenswrapper[4843]: I0318 12:31:48.994473 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f64f6299-53d2-4dcc-85e9-7b6a3cec1d21" path="/var/lib/kubelet/pods/f64f6299-53d2-4dcc-85e9-7b6a3cec1d21/volumes" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.057913 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-85z5m"] Mar 18 12:31:49 crc kubenswrapper[4843]: E0318 12:31:49.058249 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b814bf-1f2a-4604-922c-74073fb910d6" containerName="mariadb-account-create-update" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.058265 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b814bf-1f2a-4604-922c-74073fb910d6" containerName="mariadb-account-create-update" Mar 18 12:31:49 crc kubenswrapper[4843]: E0318 12:31:49.058283 4843 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f64f6299-53d2-4dcc-85e9-7b6a3cec1d21" containerName="ovn-config" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.058289 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f64f6299-53d2-4dcc-85e9-7b6a3cec1d21" containerName="ovn-config" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.058443 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f64f6299-53d2-4dcc-85e9-7b6a3cec1d21" containerName="ovn-config" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.058464 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b814bf-1f2a-4604-922c-74073fb910d6" containerName="mariadb-account-create-update" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.059000 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-85z5m" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.069494 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-85z5m"] Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.137749 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cxt8\" (UniqueName: \"kubernetes.io/projected/c2a1442f-5fd6-45ff-9002-5532bf8593d2-kube-api-access-5cxt8\") pod \"cinder-db-create-85z5m\" (UID: \"c2a1442f-5fd6-45ff-9002-5532bf8593d2\") " pod="openstack/cinder-db-create-85z5m" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.138141 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2a1442f-5fd6-45ff-9002-5532bf8593d2-operator-scripts\") pod \"cinder-db-create-85z5m\" (UID: \"c2a1442f-5fd6-45ff-9002-5532bf8593d2\") " pod="openstack/cinder-db-create-85z5m" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.179421 4843 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-e9b5-account-create-update-sbgfm"] Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.181338 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e9b5-account-create-update-sbgfm" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.184483 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.208893 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e9b5-account-create-update-sbgfm"] Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.240990 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2a1442f-5fd6-45ff-9002-5532bf8593d2-operator-scripts\") pod \"cinder-db-create-85z5m\" (UID: \"c2a1442f-5fd6-45ff-9002-5532bf8593d2\") " pod="openstack/cinder-db-create-85z5m" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.241085 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cxt8\" (UniqueName: \"kubernetes.io/projected/c2a1442f-5fd6-45ff-9002-5532bf8593d2-kube-api-access-5cxt8\") pod \"cinder-db-create-85z5m\" (UID: \"c2a1442f-5fd6-45ff-9002-5532bf8593d2\") " pod="openstack/cinder-db-create-85z5m" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.241913 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2a1442f-5fd6-45ff-9002-5532bf8593d2-operator-scripts\") pod \"cinder-db-create-85z5m\" (UID: \"c2a1442f-5fd6-45ff-9002-5532bf8593d2\") " pod="openstack/cinder-db-create-85z5m" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.261366 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-72hc6"] Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.262453 4843 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/barbican-db-create-72hc6" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.279353 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-72hc6"] Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.287412 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cxt8\" (UniqueName: \"kubernetes.io/projected/c2a1442f-5fd6-45ff-9002-5532bf8593d2-kube-api-access-5cxt8\") pod \"cinder-db-create-85z5m\" (UID: \"c2a1442f-5fd6-45ff-9002-5532bf8593d2\") " pod="openstack/cinder-db-create-85z5m" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.332952 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-xswdv"] Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.333906 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xswdv" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.338340 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.338527 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.338660 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lq56r" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.338779 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.344360 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wcpg\" (UniqueName: \"kubernetes.io/projected/9d22873a-9171-47bc-9c90-29e3cd5f79f1-kube-api-access-6wcpg\") pod \"barbican-db-create-72hc6\" (UID: \"9d22873a-9171-47bc-9c90-29e3cd5f79f1\") " 
pod="openstack/barbican-db-create-72hc6" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.344428 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03b52bc5-9622-4d51-9c92-8d75afad43ac-operator-scripts\") pod \"cinder-e9b5-account-create-update-sbgfm\" (UID: \"03b52bc5-9622-4d51-9c92-8d75afad43ac\") " pod="openstack/cinder-e9b5-account-create-update-sbgfm" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.344455 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d22873a-9171-47bc-9c90-29e3cd5f79f1-operator-scripts\") pod \"barbican-db-create-72hc6\" (UID: \"9d22873a-9171-47bc-9c90-29e3cd5f79f1\") " pod="openstack/barbican-db-create-72hc6" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.344497 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wtf5\" (UniqueName: \"kubernetes.io/projected/03b52bc5-9622-4d51-9c92-8d75afad43ac-kube-api-access-4wtf5\") pod \"cinder-e9b5-account-create-update-sbgfm\" (UID: \"03b52bc5-9622-4d51-9c92-8d75afad43ac\") " pod="openstack/cinder-e9b5-account-create-update-sbgfm" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.353237 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xswdv"] Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.372155 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-8d63-account-create-update-pkpgt"] Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.373256 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8d63-account-create-update-pkpgt" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.375331 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-85z5m" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.380188 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.401045 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-96rfj"] Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.402179 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-96rfj" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.408663 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8d63-account-create-update-pkpgt"] Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.426543 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-96rfj"] Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.449270 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmfh2\" (UniqueName: \"kubernetes.io/projected/24bc5f90-1452-42d2-90c3-72bb22f30972-kube-api-access-gmfh2\") pod \"keystone-db-sync-xswdv\" (UID: \"24bc5f90-1452-42d2-90c3-72bb22f30972\") " pod="openstack/keystone-db-sync-xswdv" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.449335 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24bc5f90-1452-42d2-90c3-72bb22f30972-combined-ca-bundle\") pod \"keystone-db-sync-xswdv\" (UID: \"24bc5f90-1452-42d2-90c3-72bb22f30972\") " pod="openstack/keystone-db-sync-xswdv" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.449365 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wcpg\" (UniqueName: \"kubernetes.io/projected/9d22873a-9171-47bc-9c90-29e3cd5f79f1-kube-api-access-6wcpg\") pod 
\"barbican-db-create-72hc6\" (UID: \"9d22873a-9171-47bc-9c90-29e3cd5f79f1\") " pod="openstack/barbican-db-create-72hc6" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.449404 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03b52bc5-9622-4d51-9c92-8d75afad43ac-operator-scripts\") pod \"cinder-e9b5-account-create-update-sbgfm\" (UID: \"03b52bc5-9622-4d51-9c92-8d75afad43ac\") " pod="openstack/cinder-e9b5-account-create-update-sbgfm" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.449422 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d22873a-9171-47bc-9c90-29e3cd5f79f1-operator-scripts\") pod \"barbican-db-create-72hc6\" (UID: \"9d22873a-9171-47bc-9c90-29e3cd5f79f1\") " pod="openstack/barbican-db-create-72hc6" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.449450 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wtf5\" (UniqueName: \"kubernetes.io/projected/03b52bc5-9622-4d51-9c92-8d75afad43ac-kube-api-access-4wtf5\") pod \"cinder-e9b5-account-create-update-sbgfm\" (UID: \"03b52bc5-9622-4d51-9c92-8d75afad43ac\") " pod="openstack/cinder-e9b5-account-create-update-sbgfm" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.449628 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24bc5f90-1452-42d2-90c3-72bb22f30972-config-data\") pod \"keystone-db-sync-xswdv\" (UID: \"24bc5f90-1452-42d2-90c3-72bb22f30972\") " pod="openstack/keystone-db-sync-xswdv" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.450481 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d22873a-9171-47bc-9c90-29e3cd5f79f1-operator-scripts\") pod 
\"barbican-db-create-72hc6\" (UID: \"9d22873a-9171-47bc-9c90-29e3cd5f79f1\") " pod="openstack/barbican-db-create-72hc6" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.451081 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03b52bc5-9622-4d51-9c92-8d75afad43ac-operator-scripts\") pod \"cinder-e9b5-account-create-update-sbgfm\" (UID: \"03b52bc5-9622-4d51-9c92-8d75afad43ac\") " pod="openstack/cinder-e9b5-account-create-update-sbgfm" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.471820 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-30c9-account-create-update-ctzdf"] Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.473262 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-30c9-account-create-update-ctzdf" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.477229 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.486783 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wcpg\" (UniqueName: \"kubernetes.io/projected/9d22873a-9171-47bc-9c90-29e3cd5f79f1-kube-api-access-6wcpg\") pod \"barbican-db-create-72hc6\" (UID: \"9d22873a-9171-47bc-9c90-29e3cd5f79f1\") " pod="openstack/barbican-db-create-72hc6" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.489196 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wtf5\" (UniqueName: \"kubernetes.io/projected/03b52bc5-9622-4d51-9c92-8d75afad43ac-kube-api-access-4wtf5\") pod \"cinder-e9b5-account-create-update-sbgfm\" (UID: \"03b52bc5-9622-4d51-9c92-8d75afad43ac\") " pod="openstack/cinder-e9b5-account-create-update-sbgfm" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.503005 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-30c9-account-create-update-ctzdf"] Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.503410 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e9b5-account-create-update-sbgfm" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.551025 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/353e0d74-8cd5-4bf0-bd10-38f049806e4f-operator-scripts\") pod \"barbican-8d63-account-create-update-pkpgt\" (UID: \"353e0d74-8cd5-4bf0-bd10-38f049806e4f\") " pod="openstack/barbican-8d63-account-create-update-pkpgt" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.551076 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlq9q\" (UniqueName: \"kubernetes.io/projected/353e0d74-8cd5-4bf0-bd10-38f049806e4f-kube-api-access-qlq9q\") pod \"barbican-8d63-account-create-update-pkpgt\" (UID: \"353e0d74-8cd5-4bf0-bd10-38f049806e4f\") " pod="openstack/barbican-8d63-account-create-update-pkpgt" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.551107 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24bc5f90-1452-42d2-90c3-72bb22f30972-config-data\") pod \"keystone-db-sync-xswdv\" (UID: \"24bc5f90-1452-42d2-90c3-72bb22f30972\") " pod="openstack/keystone-db-sync-xswdv" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.551145 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz6lb\" (UniqueName: \"kubernetes.io/projected/c55f04aa-b47f-40b3-99bc-a8a260d421db-kube-api-access-sz6lb\") pod \"neutron-db-create-96rfj\" (UID: \"c55f04aa-b47f-40b3-99bc-a8a260d421db\") " pod="openstack/neutron-db-create-96rfj" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.551172 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmfh2\" (UniqueName: \"kubernetes.io/projected/24bc5f90-1452-42d2-90c3-72bb22f30972-kube-api-access-gmfh2\") pod \"keystone-db-sync-xswdv\" (UID: \"24bc5f90-1452-42d2-90c3-72bb22f30972\") " pod="openstack/keystone-db-sync-xswdv" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.551212 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24bc5f90-1452-42d2-90c3-72bb22f30972-combined-ca-bundle\") pod \"keystone-db-sync-xswdv\" (UID: \"24bc5f90-1452-42d2-90c3-72bb22f30972\") " pod="openstack/keystone-db-sync-xswdv" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.551238 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c55f04aa-b47f-40b3-99bc-a8a260d421db-operator-scripts\") pod \"neutron-db-create-96rfj\" (UID: \"c55f04aa-b47f-40b3-99bc-a8a260d421db\") " pod="openstack/neutron-db-create-96rfj" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.557276 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24bc5f90-1452-42d2-90c3-72bb22f30972-combined-ca-bundle\") pod \"keystone-db-sync-xswdv\" (UID: \"24bc5f90-1452-42d2-90c3-72bb22f30972\") " pod="openstack/keystone-db-sync-xswdv" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.563900 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24bc5f90-1452-42d2-90c3-72bb22f30972-config-data\") pod \"keystone-db-sync-xswdv\" (UID: \"24bc5f90-1452-42d2-90c3-72bb22f30972\") " pod="openstack/keystone-db-sync-xswdv" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.573808 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gmfh2\" (UniqueName: \"kubernetes.io/projected/24bc5f90-1452-42d2-90c3-72bb22f30972-kube-api-access-gmfh2\") pod \"keystone-db-sync-xswdv\" (UID: \"24bc5f90-1452-42d2-90c3-72bb22f30972\") " pod="openstack/keystone-db-sync-xswdv" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.619291 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-72hc6" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.651701 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xswdv" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.652335 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/353e0d74-8cd5-4bf0-bd10-38f049806e4f-operator-scripts\") pod \"barbican-8d63-account-create-update-pkpgt\" (UID: \"353e0d74-8cd5-4bf0-bd10-38f049806e4f\") " pod="openstack/barbican-8d63-account-create-update-pkpgt" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.652387 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlq9q\" (UniqueName: \"kubernetes.io/projected/353e0d74-8cd5-4bf0-bd10-38f049806e4f-kube-api-access-qlq9q\") pod \"barbican-8d63-account-create-update-pkpgt\" (UID: \"353e0d74-8cd5-4bf0-bd10-38f049806e4f\") " pod="openstack/barbican-8d63-account-create-update-pkpgt" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.652580 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz6lb\" (UniqueName: \"kubernetes.io/projected/c55f04aa-b47f-40b3-99bc-a8a260d421db-kube-api-access-sz6lb\") pod \"neutron-db-create-96rfj\" (UID: \"c55f04aa-b47f-40b3-99bc-a8a260d421db\") " pod="openstack/neutron-db-create-96rfj" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.652744 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-w28cg\" (UniqueName: \"kubernetes.io/projected/44130b76-6642-46ec-9e38-38880255a091-kube-api-access-w28cg\") pod \"neutron-30c9-account-create-update-ctzdf\" (UID: \"44130b76-6642-46ec-9e38-38880255a091\") " pod="openstack/neutron-30c9-account-create-update-ctzdf" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.652841 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c55f04aa-b47f-40b3-99bc-a8a260d421db-operator-scripts\") pod \"neutron-db-create-96rfj\" (UID: \"c55f04aa-b47f-40b3-99bc-a8a260d421db\") " pod="openstack/neutron-db-create-96rfj" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.652892 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44130b76-6642-46ec-9e38-38880255a091-operator-scripts\") pod \"neutron-30c9-account-create-update-ctzdf\" (UID: \"44130b76-6642-46ec-9e38-38880255a091\") " pod="openstack/neutron-30c9-account-create-update-ctzdf" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.652957 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/353e0d74-8cd5-4bf0-bd10-38f049806e4f-operator-scripts\") pod \"barbican-8d63-account-create-update-pkpgt\" (UID: \"353e0d74-8cd5-4bf0-bd10-38f049806e4f\") " pod="openstack/barbican-8d63-account-create-update-pkpgt" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.654236 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c55f04aa-b47f-40b3-99bc-a8a260d421db-operator-scripts\") pod \"neutron-db-create-96rfj\" (UID: \"c55f04aa-b47f-40b3-99bc-a8a260d421db\") " pod="openstack/neutron-db-create-96rfj" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.675417 4843 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qlq9q\" (UniqueName: \"kubernetes.io/projected/353e0d74-8cd5-4bf0-bd10-38f049806e4f-kube-api-access-qlq9q\") pod \"barbican-8d63-account-create-update-pkpgt\" (UID: \"353e0d74-8cd5-4bf0-bd10-38f049806e4f\") " pod="openstack/barbican-8d63-account-create-update-pkpgt" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.675422 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz6lb\" (UniqueName: \"kubernetes.io/projected/c55f04aa-b47f-40b3-99bc-a8a260d421db-kube-api-access-sz6lb\") pod \"neutron-db-create-96rfj\" (UID: \"c55f04aa-b47f-40b3-99bc-a8a260d421db\") " pod="openstack/neutron-db-create-96rfj" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.686389 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8d63-account-create-update-pkpgt" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.754520 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w28cg\" (UniqueName: \"kubernetes.io/projected/44130b76-6642-46ec-9e38-38880255a091-kube-api-access-w28cg\") pod \"neutron-30c9-account-create-update-ctzdf\" (UID: \"44130b76-6642-46ec-9e38-38880255a091\") " pod="openstack/neutron-30c9-account-create-update-ctzdf" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.754590 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44130b76-6642-46ec-9e38-38880255a091-operator-scripts\") pod \"neutron-30c9-account-create-update-ctzdf\" (UID: \"44130b76-6642-46ec-9e38-38880255a091\") " pod="openstack/neutron-30c9-account-create-update-ctzdf" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.757708 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/44130b76-6642-46ec-9e38-38880255a091-operator-scripts\") pod \"neutron-30c9-account-create-update-ctzdf\" (UID: \"44130b76-6642-46ec-9e38-38880255a091\") " pod="openstack/neutron-30c9-account-create-update-ctzdf" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.783865 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w28cg\" (UniqueName: \"kubernetes.io/projected/44130b76-6642-46ec-9e38-38880255a091-kube-api-access-w28cg\") pod \"neutron-30c9-account-create-update-ctzdf\" (UID: \"44130b76-6642-46ec-9e38-38880255a091\") " pod="openstack/neutron-30c9-account-create-update-ctzdf" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.886228 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-96rfj" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.896777 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-30c9-account-create-update-ctzdf" Mar 18 12:31:49 crc kubenswrapper[4843]: I0318 12:31:49.910196 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-85z5m"] Mar 18 12:31:49 crc kubenswrapper[4843]: W0318 12:31:49.929635 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2a1442f_5fd6_45ff_9002_5532bf8593d2.slice/crio-1a9d0bc83e74355ab6c1f46f3bee39af9d99232fd0bf5325d1acb67bfed1339f WatchSource:0}: Error finding container 1a9d0bc83e74355ab6c1f46f3bee39af9d99232fd0bf5325d1acb67bfed1339f: Status 404 returned error can't find the container with id 1a9d0bc83e74355ab6c1f46f3bee39af9d99232fd0bf5325d1acb67bfed1339f Mar 18 12:31:50 crc kubenswrapper[4843]: I0318 12:31:50.051457 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e9b5-account-create-update-sbgfm"] Mar 18 12:31:50 crc kubenswrapper[4843]: I0318 12:31:50.174818 4843 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-72hc6"] Mar 18 12:31:50 crc kubenswrapper[4843]: I0318 12:31:50.202979 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-85z5m" event={"ID":"c2a1442f-5fd6-45ff-9002-5532bf8593d2","Type":"ContainerStarted","Data":"0e4adeba34076e823f49c813041006a8d6c5c83bdd6289887901992fb2e5bac0"} Mar 18 12:31:50 crc kubenswrapper[4843]: I0318 12:31:50.203022 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-85z5m" event={"ID":"c2a1442f-5fd6-45ff-9002-5532bf8593d2","Type":"ContainerStarted","Data":"1a9d0bc83e74355ab6c1f46f3bee39af9d99232fd0bf5325d1acb67bfed1339f"} Mar 18 12:31:50 crc kubenswrapper[4843]: I0318 12:31:50.215546 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e9b5-account-create-update-sbgfm" event={"ID":"03b52bc5-9622-4d51-9c92-8d75afad43ac","Type":"ContainerStarted","Data":"ffccf7f3457334eb3918a61acd70ad5b6901e08f4fd82f083762194df2df0c8f"} Mar 18 12:31:50 crc kubenswrapper[4843]: I0318 12:31:50.220290 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-85z5m" podStartSLOduration=1.220269825 podStartE2EDuration="1.220269825s" podCreationTimestamp="2026-03-18 12:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:50.216054875 +0000 UTC m=+1343.931880399" watchObservedRunningTime="2026-03-18 12:31:50.220269825 +0000 UTC m=+1343.936095349" Mar 18 12:31:50 crc kubenswrapper[4843]: I0318 12:31:50.228907 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-72hc6" event={"ID":"9d22873a-9171-47bc-9c90-29e3cd5f79f1","Type":"ContainerStarted","Data":"55b7c5745a07b2dd5e740396ac2b57711b26660fc1ceca0c618db3bafb28e043"} Mar 18 12:31:50 crc kubenswrapper[4843]: I0318 12:31:50.252749 4843 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/cinder-e9b5-account-create-update-sbgfm" podStartSLOduration=1.252702478 podStartE2EDuration="1.252702478s" podCreationTimestamp="2026-03-18 12:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:50.244246027 +0000 UTC m=+1343.960071551" watchObservedRunningTime="2026-03-18 12:31:50.252702478 +0000 UTC m=+1343.968528002" Mar 18 12:31:50 crc kubenswrapper[4843]: I0318 12:31:50.267865 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xswdv"] Mar 18 12:31:50 crc kubenswrapper[4843]: I0318 12:31:50.310690 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8d63-account-create-update-pkpgt"] Mar 18 12:31:50 crc kubenswrapper[4843]: I0318 12:31:50.406583 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-96rfj"] Mar 18 12:31:50 crc kubenswrapper[4843]: I0318 12:31:50.419298 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-30c9-account-create-update-ctzdf"] Mar 18 12:31:50 crc kubenswrapper[4843]: W0318 12:31:50.424977 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc55f04aa_b47f_40b3_99bc_a8a260d421db.slice/crio-512e3b080eb289d938b5e34e86966dafff6ec42f0a5af3b025f901a0cf0c172d WatchSource:0}: Error finding container 512e3b080eb289d938b5e34e86966dafff6ec42f0a5af3b025f901a0cf0c172d: Status 404 returned error can't find the container with id 512e3b080eb289d938b5e34e86966dafff6ec42f0a5af3b025f901a0cf0c172d Mar 18 12:31:50 crc kubenswrapper[4843]: W0318 12:31:50.427914 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44130b76_6642_46ec_9e38_38880255a091.slice/crio-b11527e26ad3cab48bdfdf38a044a09e9c2cc54700b3f726da9f34b7d316ab0e 
WatchSource:0}: Error finding container b11527e26ad3cab48bdfdf38a044a09e9c2cc54700b3f726da9f34b7d316ab0e: Status 404 returned error can't find the container with id b11527e26ad3cab48bdfdf38a044a09e9c2cc54700b3f726da9f34b7d316ab0e Mar 18 12:31:51 crc kubenswrapper[4843]: I0318 12:31:51.238607 4843 generic.go:334] "Generic (PLEG): container finished" podID="353e0d74-8cd5-4bf0-bd10-38f049806e4f" containerID="3c05b3fd28b9d04cd849035d7b9c2053855f68c57ae1f5e1a0d727f95677e162" exitCode=0 Mar 18 12:31:51 crc kubenswrapper[4843]: I0318 12:31:51.238680 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8d63-account-create-update-pkpgt" event={"ID":"353e0d74-8cd5-4bf0-bd10-38f049806e4f","Type":"ContainerDied","Data":"3c05b3fd28b9d04cd849035d7b9c2053855f68c57ae1f5e1a0d727f95677e162"} Mar 18 12:31:51 crc kubenswrapper[4843]: I0318 12:31:51.238705 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8d63-account-create-update-pkpgt" event={"ID":"353e0d74-8cd5-4bf0-bd10-38f049806e4f","Type":"ContainerStarted","Data":"1cd34a48b1ac05eb8e93ab917ba6cbdde5f97963bf8ee23cda5fecddd713a1ce"} Mar 18 12:31:51 crc kubenswrapper[4843]: I0318 12:31:51.240641 4843 generic.go:334] "Generic (PLEG): container finished" podID="c2a1442f-5fd6-45ff-9002-5532bf8593d2" containerID="0e4adeba34076e823f49c813041006a8d6c5c83bdd6289887901992fb2e5bac0" exitCode=0 Mar 18 12:31:51 crc kubenswrapper[4843]: I0318 12:31:51.240723 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-85z5m" event={"ID":"c2a1442f-5fd6-45ff-9002-5532bf8593d2","Type":"ContainerDied","Data":"0e4adeba34076e823f49c813041006a8d6c5c83bdd6289887901992fb2e5bac0"} Mar 18 12:31:51 crc kubenswrapper[4843]: I0318 12:31:51.244436 4843 generic.go:334] "Generic (PLEG): container finished" podID="c55f04aa-b47f-40b3-99bc-a8a260d421db" containerID="7c4900a57462aebe1c9cb7912592a611ce894a40f4fd21b651d028ff7450f3a6" exitCode=0 Mar 18 12:31:51 crc kubenswrapper[4843]: 
I0318 12:31:51.244480 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-96rfj" event={"ID":"c55f04aa-b47f-40b3-99bc-a8a260d421db","Type":"ContainerDied","Data":"7c4900a57462aebe1c9cb7912592a611ce894a40f4fd21b651d028ff7450f3a6"} Mar 18 12:31:51 crc kubenswrapper[4843]: I0318 12:31:51.244523 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-96rfj" event={"ID":"c55f04aa-b47f-40b3-99bc-a8a260d421db","Type":"ContainerStarted","Data":"512e3b080eb289d938b5e34e86966dafff6ec42f0a5af3b025f901a0cf0c172d"} Mar 18 12:31:51 crc kubenswrapper[4843]: I0318 12:31:51.249796 4843 generic.go:334] "Generic (PLEG): container finished" podID="03b52bc5-9622-4d51-9c92-8d75afad43ac" containerID="1de7ffacd682788a02c237fcc2290d042aef019d97619fa06423125a0acc9d3b" exitCode=0 Mar 18 12:31:51 crc kubenswrapper[4843]: I0318 12:31:51.249872 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e9b5-account-create-update-sbgfm" event={"ID":"03b52bc5-9622-4d51-9c92-8d75afad43ac","Type":"ContainerDied","Data":"1de7ffacd682788a02c237fcc2290d042aef019d97619fa06423125a0acc9d3b"} Mar 18 12:31:51 crc kubenswrapper[4843]: I0318 12:31:51.254127 4843 generic.go:334] "Generic (PLEG): container finished" podID="9d22873a-9171-47bc-9c90-29e3cd5f79f1" containerID="8e48cf8306e8d4a6b65cd6884b53685eeb027607eaf50335aa7dea804c8641b7" exitCode=0 Mar 18 12:31:51 crc kubenswrapper[4843]: I0318 12:31:51.254210 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-72hc6" event={"ID":"9d22873a-9171-47bc-9c90-29e3cd5f79f1","Type":"ContainerDied","Data":"8e48cf8306e8d4a6b65cd6884b53685eeb027607eaf50335aa7dea804c8641b7"} Mar 18 12:31:51 crc kubenswrapper[4843]: I0318 12:31:51.257606 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xswdv" 
event={"ID":"24bc5f90-1452-42d2-90c3-72bb22f30972","Type":"ContainerStarted","Data":"dfc303163d8b95a86668967cfda95e51b7dcd03f27f3ff59db61f89656fe651f"} Mar 18 12:31:51 crc kubenswrapper[4843]: I0318 12:31:51.262977 4843 generic.go:334] "Generic (PLEG): container finished" podID="44130b76-6642-46ec-9e38-38880255a091" containerID="0974e2746a352a04f1a2bd01ae9f13a0f5079e16597fa6ef9ad6915378cee0ab" exitCode=0 Mar 18 12:31:51 crc kubenswrapper[4843]: I0318 12:31:51.263048 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-30c9-account-create-update-ctzdf" event={"ID":"44130b76-6642-46ec-9e38-38880255a091","Type":"ContainerDied","Data":"0974e2746a352a04f1a2bd01ae9f13a0f5079e16597fa6ef9ad6915378cee0ab"} Mar 18 12:31:51 crc kubenswrapper[4843]: I0318 12:31:51.263074 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-30c9-account-create-update-ctzdf" event={"ID":"44130b76-6642-46ec-9e38-38880255a091","Type":"ContainerStarted","Data":"b11527e26ad3cab48bdfdf38a044a09e9c2cc54700b3f726da9f34b7d316ab0e"} Mar 18 12:31:52 crc kubenswrapper[4843]: I0318 12:31:52.280889 4843 generic.go:334] "Generic (PLEG): container finished" podID="2a59eee1-b0e8-402b-b89e-d7e461672a21" containerID="dea13a29256cecf7eddac81e6d6a3347abb34dbb0d61995b271062f9eaa38368" exitCode=0 Mar 18 12:31:52 crc kubenswrapper[4843]: I0318 12:31:52.281016 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jqd9q" event={"ID":"2a59eee1-b0e8-402b-b89e-d7e461672a21","Type":"ContainerDied","Data":"dea13a29256cecf7eddac81e6d6a3347abb34dbb0d61995b271062f9eaa38368"} Mar 18 12:31:54 crc kubenswrapper[4843]: I0318 12:31:54.777029 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-30c9-account-create-update-ctzdf" Mar 18 12:31:54 crc kubenswrapper[4843]: I0318 12:31:54.887857 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w28cg\" (UniqueName: \"kubernetes.io/projected/44130b76-6642-46ec-9e38-38880255a091-kube-api-access-w28cg\") pod \"44130b76-6642-46ec-9e38-38880255a091\" (UID: \"44130b76-6642-46ec-9e38-38880255a091\") " Mar 18 12:31:54 crc kubenswrapper[4843]: I0318 12:31:54.887917 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44130b76-6642-46ec-9e38-38880255a091-operator-scripts\") pod \"44130b76-6642-46ec-9e38-38880255a091\" (UID: \"44130b76-6642-46ec-9e38-38880255a091\") " Mar 18 12:31:54 crc kubenswrapper[4843]: I0318 12:31:54.889418 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44130b76-6642-46ec-9e38-38880255a091-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44130b76-6642-46ec-9e38-38880255a091" (UID: "44130b76-6642-46ec-9e38-38880255a091"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:54 crc kubenswrapper[4843]: I0318 12:31:54.892600 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44130b76-6642-46ec-9e38-38880255a091-kube-api-access-w28cg" (OuterVolumeSpecName: "kube-api-access-w28cg") pod "44130b76-6642-46ec-9e38-38880255a091" (UID: "44130b76-6642-46ec-9e38-38880255a091"). InnerVolumeSpecName "kube-api-access-w28cg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:54 crc kubenswrapper[4843]: I0318 12:31:54.939374 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-72hc6" Mar 18 12:31:54 crc kubenswrapper[4843]: I0318 12:31:54.963607 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8d63-account-create-update-pkpgt" Mar 18 12:31:54 crc kubenswrapper[4843]: I0318 12:31:54.972646 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-85z5m" Mar 18 12:31:54 crc kubenswrapper[4843]: I0318 12:31:54.978386 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e9b5-account-create-update-sbgfm" Mar 18 12:31:54 crc kubenswrapper[4843]: I0318 12:31:54.991216 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-96rfj" Mar 18 12:31:54 crc kubenswrapper[4843]: I0318 12:31:54.991803 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w28cg\" (UniqueName: \"kubernetes.io/projected/44130b76-6642-46ec-9e38-38880255a091-kube-api-access-w28cg\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:54 crc kubenswrapper[4843]: I0318 12:31:54.992068 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44130b76-6642-46ec-9e38-38880255a091-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.004573 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jqd9q" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.093899 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wtf5\" (UniqueName: \"kubernetes.io/projected/03b52bc5-9622-4d51-9c92-8d75afad43ac-kube-api-access-4wtf5\") pod \"03b52bc5-9622-4d51-9c92-8d75afad43ac\" (UID: \"03b52bc5-9622-4d51-9c92-8d75afad43ac\") " Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.093983 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cxt8\" (UniqueName: \"kubernetes.io/projected/c2a1442f-5fd6-45ff-9002-5532bf8593d2-kube-api-access-5cxt8\") pod \"c2a1442f-5fd6-45ff-9002-5532bf8593d2\" (UID: \"c2a1442f-5fd6-45ff-9002-5532bf8593d2\") " Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.094024 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c55f04aa-b47f-40b3-99bc-a8a260d421db-operator-scripts\") pod \"c55f04aa-b47f-40b3-99bc-a8a260d421db\" (UID: \"c55f04aa-b47f-40b3-99bc-a8a260d421db\") " Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.094067 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wcpg\" (UniqueName: \"kubernetes.io/projected/9d22873a-9171-47bc-9c90-29e3cd5f79f1-kube-api-access-6wcpg\") pod \"9d22873a-9171-47bc-9c90-29e3cd5f79f1\" (UID: \"9d22873a-9171-47bc-9c90-29e3cd5f79f1\") " Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.094089 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/353e0d74-8cd5-4bf0-bd10-38f049806e4f-operator-scripts\") pod \"353e0d74-8cd5-4bf0-bd10-38f049806e4f\" (UID: \"353e0d74-8cd5-4bf0-bd10-38f049806e4f\") " Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.094122 4843 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d22873a-9171-47bc-9c90-29e3cd5f79f1-operator-scripts\") pod \"9d22873a-9171-47bc-9c90-29e3cd5f79f1\" (UID: \"9d22873a-9171-47bc-9c90-29e3cd5f79f1\") " Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.094260 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2a1442f-5fd6-45ff-9002-5532bf8593d2-operator-scripts\") pod \"c2a1442f-5fd6-45ff-9002-5532bf8593d2\" (UID: \"c2a1442f-5fd6-45ff-9002-5532bf8593d2\") " Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.094299 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz6lb\" (UniqueName: \"kubernetes.io/projected/c55f04aa-b47f-40b3-99bc-a8a260d421db-kube-api-access-sz6lb\") pod \"c55f04aa-b47f-40b3-99bc-a8a260d421db\" (UID: \"c55f04aa-b47f-40b3-99bc-a8a260d421db\") " Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.094350 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlq9q\" (UniqueName: \"kubernetes.io/projected/353e0d74-8cd5-4bf0-bd10-38f049806e4f-kube-api-access-qlq9q\") pod \"353e0d74-8cd5-4bf0-bd10-38f049806e4f\" (UID: \"353e0d74-8cd5-4bf0-bd10-38f049806e4f\") " Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.094397 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03b52bc5-9622-4d51-9c92-8d75afad43ac-operator-scripts\") pod \"03b52bc5-9622-4d51-9c92-8d75afad43ac\" (UID: \"03b52bc5-9622-4d51-9c92-8d75afad43ac\") " Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.095290 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2a1442f-5fd6-45ff-9002-5532bf8593d2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"c2a1442f-5fd6-45ff-9002-5532bf8593d2" (UID: "c2a1442f-5fd6-45ff-9002-5532bf8593d2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.095322 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/353e0d74-8cd5-4bf0-bd10-38f049806e4f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "353e0d74-8cd5-4bf0-bd10-38f049806e4f" (UID: "353e0d74-8cd5-4bf0-bd10-38f049806e4f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.095322 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c55f04aa-b47f-40b3-99bc-a8a260d421db-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c55f04aa-b47f-40b3-99bc-a8a260d421db" (UID: "c55f04aa-b47f-40b3-99bc-a8a260d421db"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.095343 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03b52bc5-9622-4d51-9c92-8d75afad43ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03b52bc5-9622-4d51-9c92-8d75afad43ac" (UID: "03b52bc5-9622-4d51-9c92-8d75afad43ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.095561 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d22873a-9171-47bc-9c90-29e3cd5f79f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d22873a-9171-47bc-9c90-29e3cd5f79f1" (UID: "9d22873a-9171-47bc-9c90-29e3cd5f79f1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.097935 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/353e0d74-8cd5-4bf0-bd10-38f049806e4f-kube-api-access-qlq9q" (OuterVolumeSpecName: "kube-api-access-qlq9q") pod "353e0d74-8cd5-4bf0-bd10-38f049806e4f" (UID: "353e0d74-8cd5-4bf0-bd10-38f049806e4f"). InnerVolumeSpecName "kube-api-access-qlq9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.098665 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a1442f-5fd6-45ff-9002-5532bf8593d2-kube-api-access-5cxt8" (OuterVolumeSpecName: "kube-api-access-5cxt8") pod "c2a1442f-5fd6-45ff-9002-5532bf8593d2" (UID: "c2a1442f-5fd6-45ff-9002-5532bf8593d2"). InnerVolumeSpecName "kube-api-access-5cxt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.098699 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c55f04aa-b47f-40b3-99bc-a8a260d421db-kube-api-access-sz6lb" (OuterVolumeSpecName: "kube-api-access-sz6lb") pod "c55f04aa-b47f-40b3-99bc-a8a260d421db" (UID: "c55f04aa-b47f-40b3-99bc-a8a260d421db"). InnerVolumeSpecName "kube-api-access-sz6lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.099308 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d22873a-9171-47bc-9c90-29e3cd5f79f1-kube-api-access-6wcpg" (OuterVolumeSpecName: "kube-api-access-6wcpg") pod "9d22873a-9171-47bc-9c90-29e3cd5f79f1" (UID: "9d22873a-9171-47bc-9c90-29e3cd5f79f1"). InnerVolumeSpecName "kube-api-access-6wcpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.099815 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03b52bc5-9622-4d51-9c92-8d75afad43ac-kube-api-access-4wtf5" (OuterVolumeSpecName: "kube-api-access-4wtf5") pod "03b52bc5-9622-4d51-9c92-8d75afad43ac" (UID: "03b52bc5-9622-4d51-9c92-8d75afad43ac"). InnerVolumeSpecName "kube-api-access-4wtf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.196465 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a59eee1-b0e8-402b-b89e-d7e461672a21-config-data\") pod \"2a59eee1-b0e8-402b-b89e-d7e461672a21\" (UID: \"2a59eee1-b0e8-402b-b89e-d7e461672a21\") " Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.196574 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj96p\" (UniqueName: \"kubernetes.io/projected/2a59eee1-b0e8-402b-b89e-d7e461672a21-kube-api-access-bj96p\") pod \"2a59eee1-b0e8-402b-b89e-d7e461672a21\" (UID: \"2a59eee1-b0e8-402b-b89e-d7e461672a21\") " Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.196678 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a59eee1-b0e8-402b-b89e-d7e461672a21-combined-ca-bundle\") pod \"2a59eee1-b0e8-402b-b89e-d7e461672a21\" (UID: \"2a59eee1-b0e8-402b-b89e-d7e461672a21\") " Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.196850 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2a59eee1-b0e8-402b-b89e-d7e461672a21-db-sync-config-data\") pod \"2a59eee1-b0e8-402b-b89e-d7e461672a21\" (UID: \"2a59eee1-b0e8-402b-b89e-d7e461672a21\") " Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 
12:31:55.197249 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wcpg\" (UniqueName: \"kubernetes.io/projected/9d22873a-9171-47bc-9c90-29e3cd5f79f1-kube-api-access-6wcpg\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.197271 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/353e0d74-8cd5-4bf0-bd10-38f049806e4f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.197286 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d22873a-9171-47bc-9c90-29e3cd5f79f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.197300 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2a1442f-5fd6-45ff-9002-5532bf8593d2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.197312 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz6lb\" (UniqueName: \"kubernetes.io/projected/c55f04aa-b47f-40b3-99bc-a8a260d421db-kube-api-access-sz6lb\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.197325 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlq9q\" (UniqueName: \"kubernetes.io/projected/353e0d74-8cd5-4bf0-bd10-38f049806e4f-kube-api-access-qlq9q\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.197337 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03b52bc5-9622-4d51-9c92-8d75afad43ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.197348 4843 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-4wtf5\" (UniqueName: \"kubernetes.io/projected/03b52bc5-9622-4d51-9c92-8d75afad43ac-kube-api-access-4wtf5\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.197359 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cxt8\" (UniqueName: \"kubernetes.io/projected/c2a1442f-5fd6-45ff-9002-5532bf8593d2-kube-api-access-5cxt8\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.197370 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c55f04aa-b47f-40b3-99bc-a8a260d421db-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.201209 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a59eee1-b0e8-402b-b89e-d7e461672a21-kube-api-access-bj96p" (OuterVolumeSpecName: "kube-api-access-bj96p") pod "2a59eee1-b0e8-402b-b89e-d7e461672a21" (UID: "2a59eee1-b0e8-402b-b89e-d7e461672a21"). InnerVolumeSpecName "kube-api-access-bj96p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.201821 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a59eee1-b0e8-402b-b89e-d7e461672a21-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2a59eee1-b0e8-402b-b89e-d7e461672a21" (UID: "2a59eee1-b0e8-402b-b89e-d7e461672a21"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.219437 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a59eee1-b0e8-402b-b89e-d7e461672a21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a59eee1-b0e8-402b-b89e-d7e461672a21" (UID: "2a59eee1-b0e8-402b-b89e-d7e461672a21"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.252797 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a59eee1-b0e8-402b-b89e-d7e461672a21-config-data" (OuterVolumeSpecName: "config-data") pod "2a59eee1-b0e8-402b-b89e-d7e461672a21" (UID: "2a59eee1-b0e8-402b-b89e-d7e461672a21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.299124 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj96p\" (UniqueName: \"kubernetes.io/projected/2a59eee1-b0e8-402b-b89e-d7e461672a21-kube-api-access-bj96p\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.299181 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a59eee1-b0e8-402b-b89e-d7e461672a21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.299199 4843 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2a59eee1-b0e8-402b-b89e-d7e461672a21-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.299219 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a59eee1-b0e8-402b-b89e-d7e461672a21-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.312314 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e9b5-account-create-update-sbgfm" event={"ID":"03b52bc5-9622-4d51-9c92-8d75afad43ac","Type":"ContainerDied","Data":"ffccf7f3457334eb3918a61acd70ad5b6901e08f4fd82f083762194df2df0c8f"} Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.312362 
4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffccf7f3457334eb3918a61acd70ad5b6901e08f4fd82f083762194df2df0c8f" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.312419 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e9b5-account-create-update-sbgfm" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.315198 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-72hc6" event={"ID":"9d22873a-9171-47bc-9c90-29e3cd5f79f1","Type":"ContainerDied","Data":"55b7c5745a07b2dd5e740396ac2b57711b26660fc1ceca0c618db3bafb28e043"} Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.315260 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55b7c5745a07b2dd5e740396ac2b57711b26660fc1ceca0c618db3bafb28e043" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.315352 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-72hc6" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.318752 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-30c9-account-create-update-ctzdf" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.318768 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-30c9-account-create-update-ctzdf" event={"ID":"44130b76-6642-46ec-9e38-38880255a091","Type":"ContainerDied","Data":"b11527e26ad3cab48bdfdf38a044a09e9c2cc54700b3f726da9f34b7d316ab0e"} Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.318851 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b11527e26ad3cab48bdfdf38a044a09e9c2cc54700b3f726da9f34b7d316ab0e" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.320829 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8d63-account-create-update-pkpgt" event={"ID":"353e0d74-8cd5-4bf0-bd10-38f049806e4f","Type":"ContainerDied","Data":"1cd34a48b1ac05eb8e93ab917ba6cbdde5f97963bf8ee23cda5fecddd713a1ce"} Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.320871 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cd34a48b1ac05eb8e93ab917ba6cbdde5f97963bf8ee23cda5fecddd713a1ce" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.320932 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-8d63-account-create-update-pkpgt" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.324419 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jqd9q" event={"ID":"2a59eee1-b0e8-402b-b89e-d7e461672a21","Type":"ContainerDied","Data":"ad6eacb3e6d411e4382dd853b530e9b7e4a11adb74179ec5730a5f7fcb27593b"} Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.324460 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad6eacb3e6d411e4382dd853b530e9b7e4a11adb74179ec5730a5f7fcb27593b" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.324475 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jqd9q" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.327972 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-85z5m" event={"ID":"c2a1442f-5fd6-45ff-9002-5532bf8593d2","Type":"ContainerDied","Data":"1a9d0bc83e74355ab6c1f46f3bee39af9d99232fd0bf5325d1acb67bfed1339f"} Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.328015 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a9d0bc83e74355ab6c1f46f3bee39af9d99232fd0bf5325d1acb67bfed1339f" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.329358 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-85z5m" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.331764 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-96rfj" event={"ID":"c55f04aa-b47f-40b3-99bc-a8a260d421db","Type":"ContainerDied","Data":"512e3b080eb289d938b5e34e86966dafff6ec42f0a5af3b025f901a0cf0c172d"} Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.331804 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="512e3b080eb289d938b5e34e86966dafff6ec42f0a5af3b025f901a0cf0c172d" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.331781 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-96rfj" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.740806 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9" Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.802059 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-rhmgx"] Mar 18 12:31:55 crc kubenswrapper[4843]: I0318 12:31:55.802459 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" podUID="0cfdef9a-e628-4679-9bac-c3efe39b5a41" containerName="dnsmasq-dns" containerID="cri-o://d885470a42a8c4aef1e7dd2897a5cc93b1a08a58322a56f930810ccae974ef1e" gracePeriod=10 Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.341310 4843 generic.go:334] "Generic (PLEG): container finished" podID="0cfdef9a-e628-4679-9bac-c3efe39b5a41" containerID="d885470a42a8c4aef1e7dd2897a5cc93b1a08a58322a56f930810ccae974ef1e" exitCode=0 Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.341376 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" 
event={"ID":"0cfdef9a-e628-4679-9bac-c3efe39b5a41","Type":"ContainerDied","Data":"d885470a42a8c4aef1e7dd2897a5cc93b1a08a58322a56f930810ccae974ef1e"} Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.341404 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" event={"ID":"0cfdef9a-e628-4679-9bac-c3efe39b5a41","Type":"ContainerDied","Data":"61271786b96714d7ec6a68b627c5591ab469e63d77dc1d06792adc71c9db152f"} Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.341419 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61271786b96714d7ec6a68b627c5591ab469e63d77dc1d06792adc71c9db152f" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.342661 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xswdv" event={"ID":"24bc5f90-1452-42d2-90c3-72bb22f30972","Type":"ContainerStarted","Data":"b2ff41297aa76480541413c952e384c624a076e36b9c789c975b0e5552cee218"} Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.366040 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.401805 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-xswdv" podStartSLOduration=3.128150021 podStartE2EDuration="7.401780883s" podCreationTimestamp="2026-03-18 12:31:49 +0000 UTC" firstStartedPulling="2026-03-18 12:31:50.274778275 +0000 UTC m=+1343.990603799" lastFinishedPulling="2026-03-18 12:31:54.548409097 +0000 UTC m=+1348.264234661" observedRunningTime="2026-03-18 12:31:56.38304681 +0000 UTC m=+1350.098872334" watchObservedRunningTime="2026-03-18 12:31:56.401780883 +0000 UTC m=+1350.117606397" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.497346 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-dq2hl"] Mar 18 12:31:56 crc kubenswrapper[4843]: E0318 12:31:56.497905 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a1442f-5fd6-45ff-9002-5532bf8593d2" containerName="mariadb-database-create" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.497920 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a1442f-5fd6-45ff-9002-5532bf8593d2" containerName="mariadb-database-create" Mar 18 12:31:56 crc kubenswrapper[4843]: E0318 12:31:56.497935 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cfdef9a-e628-4679-9bac-c3efe39b5a41" containerName="dnsmasq-dns" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.497942 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cfdef9a-e628-4679-9bac-c3efe39b5a41" containerName="dnsmasq-dns" Mar 18 12:31:56 crc kubenswrapper[4843]: E0318 12:31:56.497954 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a59eee1-b0e8-402b-b89e-d7e461672a21" containerName="glance-db-sync" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.497960 4843 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2a59eee1-b0e8-402b-b89e-d7e461672a21" containerName="glance-db-sync" Mar 18 12:31:56 crc kubenswrapper[4843]: E0318 12:31:56.497975 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b52bc5-9622-4d51-9c92-8d75afad43ac" containerName="mariadb-account-create-update" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.497981 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b52bc5-9622-4d51-9c92-8d75afad43ac" containerName="mariadb-account-create-update" Mar 18 12:31:56 crc kubenswrapper[4843]: E0318 12:31:56.497991 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55f04aa-b47f-40b3-99bc-a8a260d421db" containerName="mariadb-database-create" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.497997 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55f04aa-b47f-40b3-99bc-a8a260d421db" containerName="mariadb-database-create" Mar 18 12:31:56 crc kubenswrapper[4843]: E0318 12:31:56.498010 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d22873a-9171-47bc-9c90-29e3cd5f79f1" containerName="mariadb-database-create" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.498016 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d22873a-9171-47bc-9c90-29e3cd5f79f1" containerName="mariadb-database-create" Mar 18 12:31:56 crc kubenswrapper[4843]: E0318 12:31:56.498028 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="353e0d74-8cd5-4bf0-bd10-38f049806e4f" containerName="mariadb-account-create-update" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.498034 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="353e0d74-8cd5-4bf0-bd10-38f049806e4f" containerName="mariadb-account-create-update" Mar 18 12:31:56 crc kubenswrapper[4843]: E0318 12:31:56.498047 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cfdef9a-e628-4679-9bac-c3efe39b5a41" containerName="init" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 
12:31:56.498052 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cfdef9a-e628-4679-9bac-c3efe39b5a41" containerName="init" Mar 18 12:31:56 crc kubenswrapper[4843]: E0318 12:31:56.498058 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44130b76-6642-46ec-9e38-38880255a091" containerName="mariadb-account-create-update" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.498064 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="44130b76-6642-46ec-9e38-38880255a091" containerName="mariadb-account-create-update" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.498216 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="353e0d74-8cd5-4bf0-bd10-38f049806e4f" containerName="mariadb-account-create-update" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.498225 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a1442f-5fd6-45ff-9002-5532bf8593d2" containerName="mariadb-database-create" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.498232 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="c55f04aa-b47f-40b3-99bc-a8a260d421db" containerName="mariadb-database-create" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.498245 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="44130b76-6642-46ec-9e38-38880255a091" containerName="mariadb-account-create-update" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.498252 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d22873a-9171-47bc-9c90-29e3cd5f79f1" containerName="mariadb-database-create" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.498260 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a59eee1-b0e8-402b-b89e-d7e461672a21" containerName="glance-db-sync" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.498273 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="03b52bc5-9622-4d51-9c92-8d75afad43ac" 
containerName="mariadb-account-create-update" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.498281 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cfdef9a-e628-4679-9bac-c3efe39b5a41" containerName="dnsmasq-dns" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.499080 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.524263 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-dns-svc\") pod \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\" (UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.524319 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-ovsdbserver-nb\") pod \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\" (UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.524425 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l99p5\" (UniqueName: \"kubernetes.io/projected/0cfdef9a-e628-4679-9bac-c3efe39b5a41-kube-api-access-l99p5\") pod \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\" (UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.524457 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-ovsdbserver-sb\") pod \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\" (UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.524475 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-config\") pod \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\" (UID: \"0cfdef9a-e628-4679-9bac-c3efe39b5a41\") " Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.544267 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-dq2hl"] Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.546097 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cfdef9a-e628-4679-9bac-c3efe39b5a41-kube-api-access-l99p5" (OuterVolumeSpecName: "kube-api-access-l99p5") pod "0cfdef9a-e628-4679-9bac-c3efe39b5a41" (UID: "0cfdef9a-e628-4679-9bac-c3efe39b5a41"). InnerVolumeSpecName "kube-api-access-l99p5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.627932 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-config\") pod \"dnsmasq-dns-895cf5cf-dq2hl\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.627997 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-dq2hl\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.628024 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-dns-svc\") pod \"dnsmasq-dns-895cf5cf-dq2hl\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 
12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.628114 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-dq2hl\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.628142 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7nmz\" (UniqueName: \"kubernetes.io/projected/3ec25e55-6573-4241-a6ed-060c1a28a410-kube-api-access-h7nmz\") pod \"dnsmasq-dns-895cf5cf-dq2hl\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.628168 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-dq2hl\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.628229 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l99p5\" (UniqueName: \"kubernetes.io/projected/0cfdef9a-e628-4679-9bac-c3efe39b5a41-kube-api-access-l99p5\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.662546 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0cfdef9a-e628-4679-9bac-c3efe39b5a41" (UID: "0cfdef9a-e628-4679-9bac-c3efe39b5a41"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.682336 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0cfdef9a-e628-4679-9bac-c3efe39b5a41" (UID: "0cfdef9a-e628-4679-9bac-c3efe39b5a41"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.682744 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-config" (OuterVolumeSpecName: "config") pod "0cfdef9a-e628-4679-9bac-c3efe39b5a41" (UID: "0cfdef9a-e628-4679-9bac-c3efe39b5a41"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.695001 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0cfdef9a-e628-4679-9bac-c3efe39b5a41" (UID: "0cfdef9a-e628-4679-9bac-c3efe39b5a41"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.729286 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-dq2hl\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.729329 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7nmz\" (UniqueName: \"kubernetes.io/projected/3ec25e55-6573-4241-a6ed-060c1a28a410-kube-api-access-h7nmz\") pod \"dnsmasq-dns-895cf5cf-dq2hl\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.729360 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-dq2hl\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.729418 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-config\") pod \"dnsmasq-dns-895cf5cf-dq2hl\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.729443 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-dq2hl\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 
12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.729460 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-dns-svc\") pod \"dnsmasq-dns-895cf5cf-dq2hl\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.729533 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.729545 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.729555 4843 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.729563 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0cfdef9a-e628-4679-9bac-c3efe39b5a41-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.731835 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-dq2hl\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.733068 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-dns-svc\") pod \"dnsmasq-dns-895cf5cf-dq2hl\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.733207 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-dq2hl\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.733832 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-dq2hl\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.733927 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-config\") pod \"dnsmasq-dns-895cf5cf-dq2hl\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:31:56 crc kubenswrapper[4843]: I0318 12:31:56.746536 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7nmz\" (UniqueName: \"kubernetes.io/projected/3ec25e55-6573-4241-a6ed-060c1a28a410-kube-api-access-h7nmz\") pod \"dnsmasq-dns-895cf5cf-dq2hl\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:31:57 crc kubenswrapper[4843]: I0318 12:31:57.024692 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:31:57 crc kubenswrapper[4843]: I0318 12:31:57.348174 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-dq2hl"] Mar 18 12:31:57 crc kubenswrapper[4843]: I0318 12:31:57.350814 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-rhmgx" Mar 18 12:31:57 crc kubenswrapper[4843]: W0318 12:31:57.358130 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ec25e55_6573_4241_a6ed_060c1a28a410.slice/crio-60bef15b741a4b414d0f2f93261f27f31334348de2983adb7c72e4c5def088d0 WatchSource:0}: Error finding container 60bef15b741a4b414d0f2f93261f27f31334348de2983adb7c72e4c5def088d0: Status 404 returned error can't find the container with id 60bef15b741a4b414d0f2f93261f27f31334348de2983adb7c72e4c5def088d0 Mar 18 12:31:57 crc kubenswrapper[4843]: I0318 12:31:57.379922 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-rhmgx"] Mar 18 12:31:57 crc kubenswrapper[4843]: I0318 12:31:57.390156 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-rhmgx"] Mar 18 12:31:58 crc kubenswrapper[4843]: I0318 12:31:58.367391 4843 generic.go:334] "Generic (PLEG): container finished" podID="3ec25e55-6573-4241-a6ed-060c1a28a410" containerID="e32c24b34f82d3dc52ff27ef27e8f819c746186e2ba40452861258700cc88809" exitCode=0 Mar 18 12:31:58 crc kubenswrapper[4843]: I0318 12:31:58.367645 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" event={"ID":"3ec25e55-6573-4241-a6ed-060c1a28a410","Type":"ContainerDied","Data":"e32c24b34f82d3dc52ff27ef27e8f819c746186e2ba40452861258700cc88809"} Mar 18 12:31:58 crc kubenswrapper[4843]: I0318 12:31:58.367768 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" event={"ID":"3ec25e55-6573-4241-a6ed-060c1a28a410","Type":"ContainerStarted","Data":"60bef15b741a4b414d0f2f93261f27f31334348de2983adb7c72e4c5def088d0"} Mar 18 12:31:58 crc kubenswrapper[4843]: I0318 12:31:58.999101 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cfdef9a-e628-4679-9bac-c3efe39b5a41" path="/var/lib/kubelet/pods/0cfdef9a-e628-4679-9bac-c3efe39b5a41/volumes" Mar 18 12:31:59 crc kubenswrapper[4843]: I0318 12:31:59.382777 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" event={"ID":"3ec25e55-6573-4241-a6ed-060c1a28a410","Type":"ContainerStarted","Data":"2064f2c2e10c8533561f927473953c169a52c91ac553f96e1461c3dcceed54b0"} Mar 18 12:31:59 crc kubenswrapper[4843]: I0318 12:31:59.383736 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:31:59 crc kubenswrapper[4843]: I0318 12:31:59.387207 4843 generic.go:334] "Generic (PLEG): container finished" podID="24bc5f90-1452-42d2-90c3-72bb22f30972" containerID="b2ff41297aa76480541413c952e384c624a076e36b9c789c975b0e5552cee218" exitCode=0 Mar 18 12:31:59 crc kubenswrapper[4843]: I0318 12:31:59.387286 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xswdv" event={"ID":"24bc5f90-1452-42d2-90c3-72bb22f30972","Type":"ContainerDied","Data":"b2ff41297aa76480541413c952e384c624a076e36b9c789c975b0e5552cee218"} Mar 18 12:31:59 crc kubenswrapper[4843]: I0318 12:31:59.427066 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" podStartSLOduration=3.427038454 podStartE2EDuration="3.427038454s" podCreationTimestamp="2026-03-18 12:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:59.419028396 +0000 UTC m=+1353.134853970" 
watchObservedRunningTime="2026-03-18 12:31:59.427038454 +0000 UTC m=+1353.142864018" Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.129709 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563952-gfrqt"] Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.131025 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563952-gfrqt" Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.132985 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.139736 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.140000 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.141261 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563952-gfrqt"] Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.214629 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vfpk\" (UniqueName: \"kubernetes.io/projected/f1fb0226-598c-4497-8d1f-8711a07f13a6-kube-api-access-5vfpk\") pod \"auto-csr-approver-29563952-gfrqt\" (UID: \"f1fb0226-598c-4497-8d1f-8711a07f13a6\") " pod="openshift-infra/auto-csr-approver-29563952-gfrqt" Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.316416 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vfpk\" (UniqueName: \"kubernetes.io/projected/f1fb0226-598c-4497-8d1f-8711a07f13a6-kube-api-access-5vfpk\") pod \"auto-csr-approver-29563952-gfrqt\" (UID: \"f1fb0226-598c-4497-8d1f-8711a07f13a6\") " 
pod="openshift-infra/auto-csr-approver-29563952-gfrqt" Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.355310 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vfpk\" (UniqueName: \"kubernetes.io/projected/f1fb0226-598c-4497-8d1f-8711a07f13a6-kube-api-access-5vfpk\") pod \"auto-csr-approver-29563952-gfrqt\" (UID: \"f1fb0226-598c-4497-8d1f-8711a07f13a6\") " pod="openshift-infra/auto-csr-approver-29563952-gfrqt" Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.449847 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563952-gfrqt" Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.756969 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xswdv" Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.827591 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmfh2\" (UniqueName: \"kubernetes.io/projected/24bc5f90-1452-42d2-90c3-72bb22f30972-kube-api-access-gmfh2\") pod \"24bc5f90-1452-42d2-90c3-72bb22f30972\" (UID: \"24bc5f90-1452-42d2-90c3-72bb22f30972\") " Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.827759 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24bc5f90-1452-42d2-90c3-72bb22f30972-config-data\") pod \"24bc5f90-1452-42d2-90c3-72bb22f30972\" (UID: \"24bc5f90-1452-42d2-90c3-72bb22f30972\") " Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.827968 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24bc5f90-1452-42d2-90c3-72bb22f30972-combined-ca-bundle\") pod \"24bc5f90-1452-42d2-90c3-72bb22f30972\" (UID: \"24bc5f90-1452-42d2-90c3-72bb22f30972\") " Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.833031 4843 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24bc5f90-1452-42d2-90c3-72bb22f30972-kube-api-access-gmfh2" (OuterVolumeSpecName: "kube-api-access-gmfh2") pod "24bc5f90-1452-42d2-90c3-72bb22f30972" (UID: "24bc5f90-1452-42d2-90c3-72bb22f30972"). InnerVolumeSpecName "kube-api-access-gmfh2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.850516 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bc5f90-1452-42d2-90c3-72bb22f30972-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24bc5f90-1452-42d2-90c3-72bb22f30972" (UID: "24bc5f90-1452-42d2-90c3-72bb22f30972"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.871741 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bc5f90-1452-42d2-90c3-72bb22f30972-config-data" (OuterVolumeSpecName: "config-data") pod "24bc5f90-1452-42d2-90c3-72bb22f30972" (UID: "24bc5f90-1452-42d2-90c3-72bb22f30972"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.907446 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563952-gfrqt"]
Mar 18 12:32:00 crc kubenswrapper[4843]: W0318 12:32:00.913110 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1fb0226_598c_4497_8d1f_8711a07f13a6.slice/crio-99fee4c5b1731b1975a353811a2b4c8fd83203dcc57526658c45f1b6b4fce789 WatchSource:0}: Error finding container 99fee4c5b1731b1975a353811a2b4c8fd83203dcc57526658c45f1b6b4fce789: Status 404 returned error can't find the container with id 99fee4c5b1731b1975a353811a2b4c8fd83203dcc57526658c45f1b6b4fce789
Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.929765 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24bc5f90-1452-42d2-90c3-72bb22f30972-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.929994 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmfh2\" (UniqueName: \"kubernetes.io/projected/24bc5f90-1452-42d2-90c3-72bb22f30972-kube-api-access-gmfh2\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:00 crc kubenswrapper[4843]: I0318 12:32:00.930063 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24bc5f90-1452-42d2-90c3-72bb22f30972-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.405084 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563952-gfrqt" event={"ID":"f1fb0226-598c-4497-8d1f-8711a07f13a6","Type":"ContainerStarted","Data":"99fee4c5b1731b1975a353811a2b4c8fd83203dcc57526658c45f1b6b4fce789"}
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.406863 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xswdv" event={"ID":"24bc5f90-1452-42d2-90c3-72bb22f30972","Type":"ContainerDied","Data":"dfc303163d8b95a86668967cfda95e51b7dcd03f27f3ff59db61f89656fe651f"}
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.406888 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfc303163d8b95a86668967cfda95e51b7dcd03f27f3ff59db61f89656fe651f"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.406902 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xswdv"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.646673 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-dq2hl"]
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.647449 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" podUID="3ec25e55-6573-4241-a6ed-060c1a28a410" containerName="dnsmasq-dns" containerID="cri-o://2064f2c2e10c8533561f927473953c169a52c91ac553f96e1461c3dcceed54b0" gracePeriod=10
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.673937 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-gzntp"]
Mar 18 12:32:01 crc kubenswrapper[4843]: E0318 12:32:01.674302 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24bc5f90-1452-42d2-90c3-72bb22f30972" containerName="keystone-db-sync"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.674322 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="24bc5f90-1452-42d2-90c3-72bb22f30972" containerName="keystone-db-sync"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.674553 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="24bc5f90-1452-42d2-90c3-72bb22f30972" containerName="keystone-db-sync"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.675599 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.712953 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-n2lkt"]
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.725856 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.745914 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.746081 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lq56r"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.746171 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.746567 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.746683 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.749936 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-gzntp\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.749986 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-gzntp\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.750035 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-gzntp\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.750084 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-config\") pod \"dnsmasq-dns-6c9c9f998c-gzntp\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.750108 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-gzntp\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.750277 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vjtj\" (UniqueName: \"kubernetes.io/projected/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-kube-api-access-9vjtj\") pod \"dnsmasq-dns-6c9c9f998c-gzntp\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.798734 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-gzntp"]
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.814715 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n2lkt"]
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.851631 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vjtj\" (UniqueName: \"kubernetes.io/projected/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-kube-api-access-9vjtj\") pod \"dnsmasq-dns-6c9c9f998c-gzntp\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.857473 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-combined-ca-bundle\") pod \"keystone-bootstrap-n2lkt\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.857536 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-credential-keys\") pod \"keystone-bootstrap-n2lkt\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.857580 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-gzntp\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.857612 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-gzntp\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.857679 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-gzntp\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.857730 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-config-data\") pod \"keystone-bootstrap-n2lkt\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.857749 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-config\") pod \"dnsmasq-dns-6c9c9f998c-gzntp\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.857763 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-fernet-keys\") pod \"keystone-bootstrap-n2lkt\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.857780 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-scripts\") pod \"keystone-bootstrap-n2lkt\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.857799 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t79c2\" (UniqueName: \"kubernetes.io/projected/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-kube-api-access-t79c2\") pod \"keystone-bootstrap-n2lkt\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.857821 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-gzntp\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.858917 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-gzntp\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.863367 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-gzntp\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.863969 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-gzntp\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.864557 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-gzntp\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.870626 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-config\") pod \"dnsmasq-dns-6c9c9f998c-gzntp\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.881189 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8d86ff97c-mjh75"]
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.884934 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8d86ff97c-mjh75"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.890090 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.892445 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-jsv7z"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.891113 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.896131 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.896721 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vjtj\" (UniqueName: \"kubernetes.io/projected/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-kube-api-access-9vjtj\") pod \"dnsmasq-dns-6c9c9f998c-gzntp\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.904157 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8d86ff97c-mjh75"]
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.920537 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-9tcmj"]
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.921551 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.928303 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-gkdfj"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.928496 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.928603 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.961464 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-combined-ca-bundle\") pod \"cinder-db-sync-9tcmj\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") " pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.961641 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s42ct\" (UniqueName: \"kubernetes.io/projected/e4b581ed-0818-4c7e-be7e-4c2121d784e5-kube-api-access-s42ct\") pod \"horizon-8d86ff97c-mjh75\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " pod="openstack/horizon-8d86ff97c-mjh75"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.961729 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-config-data\") pod \"cinder-db-sync-9tcmj\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") " pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.961810 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-db-sync-config-data\") pod \"cinder-db-sync-9tcmj\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") " pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.967829 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4b581ed-0818-4c7e-be7e-4c2121d784e5-config-data\") pod \"horizon-8d86ff97c-mjh75\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " pod="openstack/horizon-8d86ff97c-mjh75"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.968016 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-config-data\") pod \"keystone-bootstrap-n2lkt\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.968060 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-fernet-keys\") pod \"keystone-bootstrap-n2lkt\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.968082 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-scripts\") pod \"keystone-bootstrap-n2lkt\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.968111 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t79c2\" (UniqueName: \"kubernetes.io/projected/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-kube-api-access-t79c2\") pod \"keystone-bootstrap-n2lkt\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.968239 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-scripts\") pod \"cinder-db-sync-9tcmj\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") " pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.968280 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4b581ed-0818-4c7e-be7e-4c2121d784e5-scripts\") pod \"horizon-8d86ff97c-mjh75\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " pod="openstack/horizon-8d86ff97c-mjh75"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.968332 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4b581ed-0818-4c7e-be7e-4c2121d784e5-logs\") pod \"horizon-8d86ff97c-mjh75\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " pod="openstack/horizon-8d86ff97c-mjh75"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.968361 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-combined-ca-bundle\") pod \"keystone-bootstrap-n2lkt\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.968412 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16d32f17-ac20-4f6d-8e00-db5fdafdc210-etc-machine-id\") pod \"cinder-db-sync-9tcmj\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") " pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.968435 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d7rs\" (UniqueName: \"kubernetes.io/projected/16d32f17-ac20-4f6d-8e00-db5fdafdc210-kube-api-access-9d7rs\") pod \"cinder-db-sync-9tcmj\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") " pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.968480 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e4b581ed-0818-4c7e-be7e-4c2121d784e5-horizon-secret-key\") pod \"horizon-8d86ff97c-mjh75\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " pod="openstack/horizon-8d86ff97c-mjh75"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.968498 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-credential-keys\") pod \"keystone-bootstrap-n2lkt\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.975290 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-credential-keys\") pod \"keystone-bootstrap-n2lkt\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.976215 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-config-data\") pod \"keystone-bootstrap-n2lkt\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.986874 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9tcmj"]
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.993704 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-fernet-keys\") pod \"keystone-bootstrap-n2lkt\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:01 crc kubenswrapper[4843]: I0318 12:32:01.994918 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-scripts\") pod \"keystone-bootstrap-n2lkt\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.007516 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-combined-ca-bundle\") pod \"keystone-bootstrap-n2lkt\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.010420 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t79c2\" (UniqueName: \"kubernetes.io/projected/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-kube-api-access-t79c2\") pod \"keystone-bootstrap-n2lkt\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.019032 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.071439 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e4b581ed-0818-4c7e-be7e-4c2121d784e5-horizon-secret-key\") pod \"horizon-8d86ff97c-mjh75\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " pod="openstack/horizon-8d86ff97c-mjh75"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.071508 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-combined-ca-bundle\") pod \"cinder-db-sync-9tcmj\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") " pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.071541 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s42ct\" (UniqueName: \"kubernetes.io/projected/e4b581ed-0818-4c7e-be7e-4c2121d784e5-kube-api-access-s42ct\") pod \"horizon-8d86ff97c-mjh75\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " pod="openstack/horizon-8d86ff97c-mjh75"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.071563 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-config-data\") pod \"cinder-db-sync-9tcmj\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") " pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.071587 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-db-sync-config-data\") pod \"cinder-db-sync-9tcmj\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") " pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.071699 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4b581ed-0818-4c7e-be7e-4c2121d784e5-config-data\") pod \"horizon-8d86ff97c-mjh75\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " pod="openstack/horizon-8d86ff97c-mjh75"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.071762 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-scripts\") pod \"cinder-db-sync-9tcmj\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") " pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.071795 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4b581ed-0818-4c7e-be7e-4c2121d784e5-scripts\") pod \"horizon-8d86ff97c-mjh75\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " pod="openstack/horizon-8d86ff97c-mjh75"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.071824 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4b581ed-0818-4c7e-be7e-4c2121d784e5-logs\") pod \"horizon-8d86ff97c-mjh75\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " pod="openstack/horizon-8d86ff97c-mjh75"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.071860 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16d32f17-ac20-4f6d-8e00-db5fdafdc210-etc-machine-id\") pod \"cinder-db-sync-9tcmj\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") " pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.071882 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d7rs\" (UniqueName: \"kubernetes.io/projected/16d32f17-ac20-4f6d-8e00-db5fdafdc210-kube-api-access-9d7rs\") pod \"cinder-db-sync-9tcmj\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") " pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.074116 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4b581ed-0818-4c7e-be7e-4c2121d784e5-logs\") pod \"horizon-8d86ff97c-mjh75\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " pod="openstack/horizon-8d86ff97c-mjh75"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.074957 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4b581ed-0818-4c7e-be7e-4c2121d784e5-scripts\") pod \"horizon-8d86ff97c-mjh75\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " pod="openstack/horizon-8d86ff97c-mjh75"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.075019 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16d32f17-ac20-4f6d-8e00-db5fdafdc210-etc-machine-id\") pod \"cinder-db-sync-9tcmj\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") " pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.079426 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-combined-ca-bundle\") pod \"cinder-db-sync-9tcmj\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") " pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.081222 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4b581ed-0818-4c7e-be7e-4c2121d784e5-config-data\") pod \"horizon-8d86ff97c-mjh75\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " pod="openstack/horizon-8d86ff97c-mjh75"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.082845 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e4b581ed-0818-4c7e-be7e-4c2121d784e5-horizon-secret-key\") pod \"horizon-8d86ff97c-mjh75\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " pod="openstack/horizon-8d86ff97c-mjh75"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.090811 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-db-sync-config-data\") pod \"cinder-db-sync-9tcmj\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") " pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.091753 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-config-data\") pod \"cinder-db-sync-9tcmj\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") " pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.099947 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-sq4hg"]
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.101385 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-sq4hg"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.104487 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-sq4hg"]
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.104496 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n2lkt"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.107486 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-46w9n"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.107719 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.111761 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s42ct\" (UniqueName: \"kubernetes.io/projected/e4b581ed-0818-4c7e-be7e-4c2121d784e5-kube-api-access-s42ct\") pod \"horizon-8d86ff97c-mjh75\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " pod="openstack/horizon-8d86ff97c-mjh75"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.116275 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-scripts\") pod \"cinder-db-sync-9tcmj\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") " pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.119464 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d7rs\" (UniqueName: \"kubernetes.io/projected/16d32f17-ac20-4f6d-8e00-db5fdafdc210-kube-api-access-9d7rs\") pod \"cinder-db-sync-9tcmj\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") " pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.142730 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-f8f6d7945-z6kt6"]
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.144426 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f8f6d7945-z6kt6"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.175456 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f8f6d7945-z6kt6"]
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.179427 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6c6l\" (UniqueName: \"kubernetes.io/projected/9b82175f-cf5a-4d25-81c2-2c70df039edd-kube-api-access-b6c6l\") pod \"barbican-db-sync-sq4hg\" (UID: \"9b82175f-cf5a-4d25-81c2-2c70df039edd\") " pod="openstack/barbican-db-sync-sq4hg"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.179498 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls6fv\" (UniqueName: \"kubernetes.io/projected/4f65306a-1421-4fa8-a629-dcaa746da646-kube-api-access-ls6fv\") pod \"horizon-f8f6d7945-z6kt6\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " pod="openstack/horizon-f8f6d7945-z6kt6"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.179555 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4f65306a-1421-4fa8-a629-dcaa746da646-horizon-secret-key\") pod \"horizon-f8f6d7945-z6kt6\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " pod="openstack/horizon-f8f6d7945-z6kt6"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.179640 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9b82175f-cf5a-4d25-81c2-2c70df039edd-db-sync-config-data\") pod \"barbican-db-sync-sq4hg\" (UID: \"9b82175f-cf5a-4d25-81c2-2c70df039edd\") " pod="openstack/barbican-db-sync-sq4hg"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.179726 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f65306a-1421-4fa8-a629-dcaa746da646-scripts\") pod \"horizon-f8f6d7945-z6kt6\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " pod="openstack/horizon-f8f6d7945-z6kt6"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.179770 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f65306a-1421-4fa8-a629-dcaa746da646-config-data\") pod \"horizon-f8f6d7945-z6kt6\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " pod="openstack/horizon-f8f6d7945-z6kt6"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.180019 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f65306a-1421-4fa8-a629-dcaa746da646-logs\") pod \"horizon-f8f6d7945-z6kt6\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " pod="openstack/horizon-f8f6d7945-z6kt6"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.180112 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b82175f-cf5a-4d25-81c2-2c70df039edd-combined-ca-bundle\") pod \"barbican-db-sync-sq4hg\" (UID: \"9b82175f-cf5a-4d25-81c2-2c70df039edd\") " pod="openstack/barbican-db-sync-sq4hg"
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.188866 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.190814 4843 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.211055 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.211128 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.236739 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-p7qr8"] Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.237929 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p7qr8" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.246162 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.246269 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-cbstj" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.247708 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.257311 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.258792 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8d86ff97c-mjh75" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.279015 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-p7qr8"] Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.281098 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t7lx\" (UniqueName: \"kubernetes.io/projected/e8648504-98a7-406e-9838-c8dc2d64ffe9-kube-api-access-4t7lx\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.281207 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b82175f-cf5a-4d25-81c2-2c70df039edd-combined-ca-bundle\") pod \"barbican-db-sync-sq4hg\" (UID: \"9b82175f-cf5a-4d25-81c2-2c70df039edd\") " pod="openstack/barbican-db-sync-sq4hg" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.281293 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.281378 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-scripts\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.281455 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6c6l\" (UniqueName: 
\"kubernetes.io/projected/9b82175f-cf5a-4d25-81c2-2c70df039edd-kube-api-access-b6c6l\") pod \"barbican-db-sync-sq4hg\" (UID: \"9b82175f-cf5a-4d25-81c2-2c70df039edd\") " pod="openstack/barbican-db-sync-sq4hg" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.281565 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls6fv\" (UniqueName: \"kubernetes.io/projected/4f65306a-1421-4fa8-a629-dcaa746da646-kube-api-access-ls6fv\") pod \"horizon-f8f6d7945-z6kt6\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " pod="openstack/horizon-f8f6d7945-z6kt6" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.281752 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4f65306a-1421-4fa8-a629-dcaa746da646-horizon-secret-key\") pod \"horizon-f8f6d7945-z6kt6\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " pod="openstack/horizon-f8f6d7945-z6kt6" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.281825 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3cca079-c7f7-4ec6-a16b-30ba392d0e57-config\") pod \"neutron-db-sync-p7qr8\" (UID: \"d3cca079-c7f7-4ec6-a16b-30ba392d0e57\") " pod="openstack/neutron-db-sync-p7qr8" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.281909 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8648504-98a7-406e-9838-c8dc2d64ffe9-run-httpd\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.281991 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9b82175f-cf5a-4d25-81c2-2c70df039edd-db-sync-config-data\") pod 
\"barbican-db-sync-sq4hg\" (UID: \"9b82175f-cf5a-4d25-81c2-2c70df039edd\") " pod="openstack/barbican-db-sync-sq4hg" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.282085 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhjnw\" (UniqueName: \"kubernetes.io/projected/d3cca079-c7f7-4ec6-a16b-30ba392d0e57-kube-api-access-xhjnw\") pod \"neutron-db-sync-p7qr8\" (UID: \"d3cca079-c7f7-4ec6-a16b-30ba392d0e57\") " pod="openstack/neutron-db-sync-p7qr8" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.282158 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8648504-98a7-406e-9838-c8dc2d64ffe9-log-httpd\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.282253 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f65306a-1421-4fa8-a629-dcaa746da646-scripts\") pod \"horizon-f8f6d7945-z6kt6\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " pod="openstack/horizon-f8f6d7945-z6kt6" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.282327 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.282412 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-config-data\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " 
pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.282477 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f65306a-1421-4fa8-a629-dcaa746da646-config-data\") pod \"horizon-f8f6d7945-z6kt6\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " pod="openstack/horizon-f8f6d7945-z6kt6" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.282556 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3cca079-c7f7-4ec6-a16b-30ba392d0e57-combined-ca-bundle\") pod \"neutron-db-sync-p7qr8\" (UID: \"d3cca079-c7f7-4ec6-a16b-30ba392d0e57\") " pod="openstack/neutron-db-sync-p7qr8" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.282632 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f65306a-1421-4fa8-a629-dcaa746da646-logs\") pod \"horizon-f8f6d7945-z6kt6\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " pod="openstack/horizon-f8f6d7945-z6kt6" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.283031 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f65306a-1421-4fa8-a629-dcaa746da646-logs\") pod \"horizon-f8f6d7945-z6kt6\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " pod="openstack/horizon-f8f6d7945-z6kt6" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.288502 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9b82175f-cf5a-4d25-81c2-2c70df039edd-db-sync-config-data\") pod \"barbican-db-sync-sq4hg\" (UID: \"9b82175f-cf5a-4d25-81c2-2c70df039edd\") " pod="openstack/barbican-db-sync-sq4hg" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.288621 4843 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f65306a-1421-4fa8-a629-dcaa746da646-scripts\") pod \"horizon-f8f6d7945-z6kt6\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " pod="openstack/horizon-f8f6d7945-z6kt6" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.289784 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f65306a-1421-4fa8-a629-dcaa746da646-config-data\") pod \"horizon-f8f6d7945-z6kt6\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " pod="openstack/horizon-f8f6d7945-z6kt6" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.289836 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-gzntp"] Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.292450 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b82175f-cf5a-4d25-81c2-2c70df039edd-combined-ca-bundle\") pod \"barbican-db-sync-sq4hg\" (UID: \"9b82175f-cf5a-4d25-81c2-2c70df039edd\") " pod="openstack/barbican-db-sync-sq4hg" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.299191 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4f65306a-1421-4fa8-a629-dcaa746da646-horizon-secret-key\") pod \"horizon-f8f6d7945-z6kt6\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " pod="openstack/horizon-f8f6d7945-z6kt6" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.302208 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-pz9j4"] Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.303450 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-pz9j4" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.314716 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-pz9j4"] Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.325224 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.325403 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.326035 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-65hnn" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.338271 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls6fv\" (UniqueName: \"kubernetes.io/projected/4f65306a-1421-4fa8-a629-dcaa746da646-kube-api-access-ls6fv\") pod \"horizon-f8f6d7945-z6kt6\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " pod="openstack/horizon-f8f6d7945-z6kt6" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.339124 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6c6l\" (UniqueName: \"kubernetes.io/projected/9b82175f-cf5a-4d25-81c2-2c70df039edd-kube-api-access-b6c6l\") pod \"barbican-db-sync-sq4hg\" (UID: \"9b82175f-cf5a-4d25-81c2-2c70df039edd\") " pod="openstack/barbican-db-sync-sq4hg" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.387216 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3cca079-c7f7-4ec6-a16b-30ba392d0e57-config\") pod \"neutron-db-sync-p7qr8\" (UID: \"d3cca079-c7f7-4ec6-a16b-30ba392d0e57\") " pod="openstack/neutron-db-sync-p7qr8" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.387247 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8648504-98a7-406e-9838-c8dc2d64ffe9-run-httpd\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.387294 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhjnw\" (UniqueName: \"kubernetes.io/projected/d3cca079-c7f7-4ec6-a16b-30ba392d0e57-kube-api-access-xhjnw\") pod \"neutron-db-sync-p7qr8\" (UID: \"d3cca079-c7f7-4ec6-a16b-30ba392d0e57\") " pod="openstack/neutron-db-sync-p7qr8" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.387313 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8648504-98a7-406e-9838-c8dc2d64ffe9-log-httpd\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.387347 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1c39df-268b-4d85-a616-32c282f9a19b-combined-ca-bundle\") pod \"placement-db-sync-pz9j4\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " pod="openstack/placement-db-sync-pz9j4" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.387373 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a1c39df-268b-4d85-a616-32c282f9a19b-logs\") pod \"placement-db-sync-pz9j4\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " pod="openstack/placement-db-sync-pz9j4" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.387400 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a1c39df-268b-4d85-a616-32c282f9a19b-config-data\") pod 
\"placement-db-sync-pz9j4\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " pod="openstack/placement-db-sync-pz9j4" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.387429 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pp2l\" (UniqueName: \"kubernetes.io/projected/2a1c39df-268b-4d85-a616-32c282f9a19b-kube-api-access-6pp2l\") pod \"placement-db-sync-pz9j4\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " pod="openstack/placement-db-sync-pz9j4" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.387449 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.387467 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-config-data\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.387487 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3cca079-c7f7-4ec6-a16b-30ba392d0e57-combined-ca-bundle\") pod \"neutron-db-sync-p7qr8\" (UID: \"d3cca079-c7f7-4ec6-a16b-30ba392d0e57\") " pod="openstack/neutron-db-sync-p7qr8" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.387538 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t7lx\" (UniqueName: \"kubernetes.io/projected/e8648504-98a7-406e-9838-c8dc2d64ffe9-kube-api-access-4t7lx\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 
12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.387557 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1c39df-268b-4d85-a616-32c282f9a19b-scripts\") pod \"placement-db-sync-pz9j4\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " pod="openstack/placement-db-sync-pz9j4" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.387591 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.387605 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-scripts\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.392998 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-9tcmj" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.394434 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8648504-98a7-406e-9838-c8dc2d64ffe9-run-httpd\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.401535 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8648504-98a7-406e-9838-c8dc2d64ffe9-log-httpd\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.404806 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.406589 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3cca079-c7f7-4ec6-a16b-30ba392d0e57-config\") pod \"neutron-db-sync-p7qr8\" (UID: \"d3cca079-c7f7-4ec6-a16b-30ba392d0e57\") " pod="openstack/neutron-db-sync-p7qr8" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.412493 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-scripts\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.412553 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.413943 
4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.421757 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-hnqhb"] Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.424726 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-config-data\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.425138 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.425541 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.425907 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.426004 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fshgw" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.427530 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhjnw\" (UniqueName: \"kubernetes.io/projected/d3cca079-c7f7-4ec6-a16b-30ba392d0e57-kube-api-access-xhjnw\") pod \"neutron-db-sync-p7qr8\" (UID: \"d3cca079-c7f7-4ec6-a16b-30ba392d0e57\") " pod="openstack/neutron-db-sync-p7qr8" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.431491 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.446867 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.447440 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-sq4hg" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.452251 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t7lx\" (UniqueName: \"kubernetes.io/projected/e8648504-98a7-406e-9838-c8dc2d64ffe9-kube-api-access-4t7lx\") pod \"ceilometer-0\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") " pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.456413 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3cca079-c7f7-4ec6-a16b-30ba392d0e57-combined-ca-bundle\") pod \"neutron-db-sync-p7qr8\" (UID: \"d3cca079-c7f7-4ec6-a16b-30ba392d0e57\") " pod="openstack/neutron-db-sync-p7qr8" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.475197 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f8f6d7945-z6kt6" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.499471 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.499644 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1c39df-268b-4d85-a616-32c282f9a19b-combined-ca-bundle\") pod \"placement-db-sync-pz9j4\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " pod="openstack/placement-db-sync-pz9j4" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.499697 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a1c39df-268b-4d85-a616-32c282f9a19b-logs\") pod \"placement-db-sync-pz9j4\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " pod="openstack/placement-db-sync-pz9j4" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.499719 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a1c39df-268b-4d85-a616-32c282f9a19b-config-data\") pod \"placement-db-sync-pz9j4\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " pod="openstack/placement-db-sync-pz9j4" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.499741 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pp2l\" (UniqueName: \"kubernetes.io/projected/2a1c39df-268b-4d85-a616-32c282f9a19b-kube-api-access-6pp2l\") pod \"placement-db-sync-pz9j4\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " pod="openstack/placement-db-sync-pz9j4" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.499801 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1c39df-268b-4d85-a616-32c282f9a19b-scripts\") pod 
\"placement-db-sync-pz9j4\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " pod="openstack/placement-db-sync-pz9j4" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.508696 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a1c39df-268b-4d85-a616-32c282f9a19b-logs\") pod \"placement-db-sync-pz9j4\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " pod="openstack/placement-db-sync-pz9j4" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.573669 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.574687 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1c39df-268b-4d85-a616-32c282f9a19b-scripts\") pod \"placement-db-sync-pz9j4\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " pod="openstack/placement-db-sync-pz9j4" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.583887 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pp2l\" (UniqueName: \"kubernetes.io/projected/2a1c39df-268b-4d85-a616-32c282f9a19b-kube-api-access-6pp2l\") pod \"placement-db-sync-pz9j4\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " pod="openstack/placement-db-sync-pz9j4" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.584593 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1c39df-268b-4d85-a616-32c282f9a19b-combined-ca-bundle\") pod \"placement-db-sync-pz9j4\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " pod="openstack/placement-db-sync-pz9j4" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.588486 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-hnqhb"] Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.589086 4843 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p7qr8" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.589897 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a1c39df-268b-4d85-a616-32c282f9a19b-config-data\") pod \"placement-db-sync-pz9j4\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " pod="openstack/placement-db-sync-pz9j4" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.594070 4843 generic.go:334] "Generic (PLEG): container finished" podID="3ec25e55-6573-4241-a6ed-060c1a28a410" containerID="2064f2c2e10c8533561f927473953c169a52c91ac553f96e1461c3dcceed54b0" exitCode=0 Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.594121 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" event={"ID":"3ec25e55-6573-4241-a6ed-060c1a28a410","Type":"ContainerDied","Data":"2064f2c2e10c8533561f927473953c169a52c91ac553f96e1461c3dcceed54b0"} Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.597185 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-pz9j4" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.614236 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-hnqhb\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") " pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.624590 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3468d8b-e825-4b34-8096-20ab2ed4cccb-logs\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.624612 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-hnqhb\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") " pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.624670 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3468d8b-e825-4b34-8096-20ab2ed4cccb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.624695 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-config\") pod \"dnsmasq-dns-57c957c4ff-hnqhb\" (UID: 
\"d70be5ea-b453-4cb8-8228-d46629c4ac42\") " pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.624725 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.624753 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8qv5\" (UniqueName: \"kubernetes.io/projected/d70be5ea-b453-4cb8-8228-d46629c4ac42-kube-api-access-g8qv5\") pod \"dnsmasq-dns-57c957c4ff-hnqhb\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") " pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.624814 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-hnqhb\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") " pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.624887 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6z89\" (UniqueName: \"kubernetes.io/projected/e3468d8b-e825-4b34-8096-20ab2ed4cccb-kube-api-access-n6z89\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.624912 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-config-data\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.624979 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-hnqhb\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") " pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.625015 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.625045 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-scripts\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.625069 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.726414 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6z89\" (UniqueName: 
\"kubernetes.io/projected/e3468d8b-e825-4b34-8096-20ab2ed4cccb-kube-api-access-n6z89\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.729842 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-config-data\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.729965 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-hnqhb\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") " pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.730023 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.730064 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-scripts\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.730091 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.730177 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-hnqhb\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") " pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.730202 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3468d8b-e825-4b34-8096-20ab2ed4cccb-logs\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.730217 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-hnqhb\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") " pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.730254 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3468d8b-e825-4b34-8096-20ab2ed4cccb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.730276 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-config\") pod \"dnsmasq-dns-57c957c4ff-hnqhb\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") " 
pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.730305 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.730333 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8qv5\" (UniqueName: \"kubernetes.io/projected/d70be5ea-b453-4cb8-8228-d46629c4ac42-kube-api-access-g8qv5\") pod \"dnsmasq-dns-57c957c4ff-hnqhb\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") " pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.730385 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-hnqhb\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") " pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.730986 4843 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.731257 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-hnqhb\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") " 
pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.732244 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-hnqhb\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") " pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.732381 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-config\") pod \"dnsmasq-dns-57c957c4ff-hnqhb\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") " pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.732595 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3468d8b-e825-4b34-8096-20ab2ed4cccb-logs\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.732827 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3468d8b-e825-4b34-8096-20ab2ed4cccb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.733230 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-hnqhb\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") " pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.737048 4843 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-scripts\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.740165 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-config-data\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.741928 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.742402 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.743965 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-hnqhb\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") " pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.755980 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6z89\" (UniqueName: 
\"kubernetes.io/projected/e3468d8b-e825-4b34-8096-20ab2ed4cccb-kube-api-access-n6z89\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.759499 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8qv5\" (UniqueName: \"kubernetes.io/projected/d70be5ea-b453-4cb8-8228-d46629c4ac42-kube-api-access-g8qv5\") pod \"dnsmasq-dns-57c957c4ff-hnqhb\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") " pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.780712 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.931572 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.936493 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:02 crc kubenswrapper[4843]: I0318 12:32:02.968074 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.039464 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.039519 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7nmz\" (UniqueName: \"kubernetes.io/projected/3ec25e55-6573-4241-a6ed-060c1a28a410-kube-api-access-h7nmz\") pod \"3ec25e55-6573-4241-a6ed-060c1a28a410\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.039584 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-config\") pod \"3ec25e55-6573-4241-a6ed-060c1a28a410\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.039620 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-ovsdbserver-sb\") pod \"3ec25e55-6573-4241-a6ed-060c1a28a410\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.039742 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-dns-swift-storage-0\") pod \"3ec25e55-6573-4241-a6ed-060c1a28a410\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.039847 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-ovsdbserver-nb\") pod \"3ec25e55-6573-4241-a6ed-060c1a28a410\" (UID: 
\"3ec25e55-6573-4241-a6ed-060c1a28a410\") " Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.039885 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-dns-svc\") pod \"3ec25e55-6573-4241-a6ed-060c1a28a410\" (UID: \"3ec25e55-6573-4241-a6ed-060c1a28a410\") " Mar 18 12:32:03 crc kubenswrapper[4843]: E0318 12:32:03.040026 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ec25e55-6573-4241-a6ed-060c1a28a410" containerName="init" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.040043 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ec25e55-6573-4241-a6ed-060c1a28a410" containerName="init" Mar 18 12:32:03 crc kubenswrapper[4843]: E0318 12:32:03.040065 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ec25e55-6573-4241-a6ed-060c1a28a410" containerName="dnsmasq-dns" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.040073 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ec25e55-6573-4241-a6ed-060c1a28a410" containerName="dnsmasq-dns" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.040290 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ec25e55-6573-4241-a6ed-060c1a28a410" containerName="dnsmasq-dns" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.041430 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.045052 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.045298 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.047702 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.065470 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ec25e55-6573-4241-a6ed-060c1a28a410-kube-api-access-h7nmz" (OuterVolumeSpecName: "kube-api-access-h7nmz") pod "3ec25e55-6573-4241-a6ed-060c1a28a410" (UID: "3ec25e55-6573-4241-a6ed-060c1a28a410"). InnerVolumeSpecName "kube-api-access-h7nmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.114095 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-gzntp"] Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.134404 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8d86ff97c-mjh75"] Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.144897 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3ec25e55-6573-4241-a6ed-060c1a28a410" (UID: "3ec25e55-6573-4241-a6ed-060c1a28a410"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.161439 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7nmz\" (UniqueName: \"kubernetes.io/projected/3ec25e55-6573-4241-a6ed-060c1a28a410-kube-api-access-h7nmz\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.167108 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-config" (OuterVolumeSpecName: "config") pod "3ec25e55-6573-4241-a6ed-060c1a28a410" (UID: "3ec25e55-6573-4241-a6ed-060c1a28a410"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.184704 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3ec25e55-6573-4241-a6ed-060c1a28a410" (UID: "3ec25e55-6573-4241-a6ed-060c1a28a410"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.189527 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3ec25e55-6573-4241-a6ed-060c1a28a410" (UID: "3ec25e55-6573-4241-a6ed-060c1a28a410"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.248710 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3ec25e55-6573-4241-a6ed-060c1a28a410" (UID: "3ec25e55-6573-4241-a6ed-060c1a28a410"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.263431 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.263516 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-scripts\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.263536 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/055dfae7-20bf-447a-8824-ea61a90b10ee-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.263620 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.263666 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhbbf\" (UniqueName: \"kubernetes.io/projected/055dfae7-20bf-447a-8824-ea61a90b10ee-kube-api-access-qhbbf\") pod 
\"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.263687 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-config-data\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.263711 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.263743 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/055dfae7-20bf-447a-8824-ea61a90b10ee-logs\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.263793 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.263804 4843 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.263813 4843 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.263821 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.263829 4843 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ec25e55-6573-4241-a6ed-060c1a28a410-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.278663 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n2lkt"] Mar 18 12:32:03 crc kubenswrapper[4843]: W0318 12:32:03.287517 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e699d0f_06ee_45ca_827a_2a8c41f9b9a4.slice/crio-34e796aa1d950eaff6fc0b82f633aa6dbc0d0f5a750d0c011a0da68fbac479d3 WatchSource:0}: Error finding container 34e796aa1d950eaff6fc0b82f633aa6dbc0d0f5a750d0c011a0da68fbac479d3: Status 404 returned error can't find the container with id 34e796aa1d950eaff6fc0b82f633aa6dbc0d0f5a750d0c011a0da68fbac479d3 Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.347131 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9tcmj"] Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.370552 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.371202 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhbbf\" (UniqueName: \"kubernetes.io/projected/055dfae7-20bf-447a-8824-ea61a90b10ee-kube-api-access-qhbbf\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.371330 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-config-data\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.371571 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.371826 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/055dfae7-20bf-447a-8824-ea61a90b10ee-logs\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.372112 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.372958 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/055dfae7-20bf-447a-8824-ea61a90b10ee-logs\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.373838 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-scripts\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.373926 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/055dfae7-20bf-447a-8824-ea61a90b10ee-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.374307 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/055dfae7-20bf-447a-8824-ea61a90b10ee-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.374415 4843 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.378987 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.379848 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.380302 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-scripts\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.382953 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.389948 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhbbf\" (UniqueName: \"kubernetes.io/projected/055dfae7-20bf-447a-8824-ea61a90b10ee-kube-api-access-qhbbf\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.431400 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " 
pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.513926 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f8f6d7945-z6kt6"] Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.527838 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-sq4hg"] Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.605489 4843 generic.go:334] "Generic (PLEG): container finished" podID="f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235" containerID="cf2a10a5e65ab23dd7aea799b1ca07305bcfd7dc8329420eef6e17ee69a6b51d" exitCode=0 Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.605554 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp" event={"ID":"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235","Type":"ContainerDied","Data":"cf2a10a5e65ab23dd7aea799b1ca07305bcfd7dc8329420eef6e17ee69a6b51d"} Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.605615 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp" event={"ID":"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235","Type":"ContainerStarted","Data":"772dfdc9fb108591005d425e7d0275e30a30396ee09d04ddda02e381b89441f8"} Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.609770 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n2lkt" event={"ID":"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4","Type":"ContainerStarted","Data":"34e796aa1d950eaff6fc0b82f633aa6dbc0d0f5a750d0c011a0da68fbac479d3"} Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.611384 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-sq4hg" event={"ID":"9b82175f-cf5a-4d25-81c2-2c70df039edd","Type":"ContainerStarted","Data":"8dc50bff7dd634e7bb09e35b2e94496f37f7251df705183e08e3612c5f909bad"} Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.613797 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" event={"ID":"3ec25e55-6573-4241-a6ed-060c1a28a410","Type":"ContainerDied","Data":"60bef15b741a4b414d0f2f93261f27f31334348de2983adb7c72e4c5def088d0"} Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.613841 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-dq2hl" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.613873 4843 scope.go:117] "RemoveContainer" containerID="2064f2c2e10c8533561f927473953c169a52c91ac553f96e1461c3dcceed54b0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.625093 4843 generic.go:334] "Generic (PLEG): container finished" podID="f1fb0226-598c-4497-8d1f-8711a07f13a6" containerID="3ce515a7b798c870df4df626ac4d5f2aed5b96f7feee3f62285749a525f1b613" exitCode=0 Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.625161 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563952-gfrqt" event={"ID":"f1fb0226-598c-4497-8d1f-8711a07f13a6","Type":"ContainerDied","Data":"3ce515a7b798c870df4df626ac4d5f2aed5b96f7feee3f62285749a525f1b613"} Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.629179 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8d86ff97c-mjh75" event={"ID":"e4b581ed-0818-4c7e-be7e-4c2121d784e5","Type":"ContainerStarted","Data":"f30c56c2548ad48deedccb9419ff2f4f8c5b12ed96dde270c71c03c467b6fd9e"} Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.636374 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f8f6d7945-z6kt6" event={"ID":"4f65306a-1421-4fa8-a629-dcaa746da646","Type":"ContainerStarted","Data":"0c861735d78b6cbc921c8baa3d67d82b53d53c3dbc3f9fd00c051e289d2d5b1a"} Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.655858 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9tcmj" 
event={"ID":"16d32f17-ac20-4f6d-8e00-db5fdafdc210","Type":"ContainerStarted","Data":"c103a0b7fa3bbccda7c1c164879728370b02eee075a0492c6d29041a097e382d"} Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.699607 4843 scope.go:117] "RemoveContainer" containerID="e32c24b34f82d3dc52ff27ef27e8f819c746186e2ba40452861258700cc88809" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.703730 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.715904 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-dq2hl"] Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.723130 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-dq2hl"] Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.749565 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-pz9j4"] Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.767732 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-p7qr8"] Mar 18 12:32:03 crc kubenswrapper[4843]: W0318 12:32:03.785224 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a1c39df_268b_4d85_a616_32c282f9a19b.slice/crio-1d6cc21293afda571fac6e7d1069d90b08969cdb4bbbfab720d26c4c01d8eb61 WatchSource:0}: Error finding container 1d6cc21293afda571fac6e7d1069d90b08969cdb4bbbfab720d26c4c01d8eb61: Status 404 returned error can't find the container with id 1d6cc21293afda571fac6e7d1069d90b08969cdb4bbbfab720d26c4c01d8eb61 Mar 18 12:32:03 crc kubenswrapper[4843]: W0318 12:32:03.790205 4843 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3cca079_c7f7_4ec6_a16b_30ba392d0e57.slice/crio-3e71ed1214610bba08c17ec88635bd0b6da68cb724be3d85eb59af26992b9122 WatchSource:0}: Error finding container 3e71ed1214610bba08c17ec88635bd0b6da68cb724be3d85eb59af26992b9122: Status 404 returned error can't find the container with id 3e71ed1214610bba08c17ec88635bd0b6da68cb724be3d85eb59af26992b9122 Mar 18 12:32:03 crc kubenswrapper[4843]: W0318 12:32:03.791084 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8648504_98a7_406e_9838_c8dc2d64ffe9.slice/crio-38ecf85f7b8822ac071c770370295fdbc0474891f992705a24fb353be8594af7 WatchSource:0}: Error finding container 38ecf85f7b8822ac071c770370295fdbc0474891f992705a24fb353be8594af7: Status 404 returned error can't find the container with id 38ecf85f7b8822ac071c770370295fdbc0474891f992705a24fb353be8594af7 Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.795667 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:32:03 crc kubenswrapper[4843]: I0318 12:32:03.803383 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-hnqhb"] Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.006445 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp" Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.087998 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.199309 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vjtj\" (UniqueName: \"kubernetes.io/projected/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-kube-api-access-9vjtj\") pod \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.199404 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-ovsdbserver-sb\") pod \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.199428 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-ovsdbserver-nb\") pod \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.199449 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-config\") pod \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.199467 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-dns-swift-storage-0\") pod \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\" (UID: 
\"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.199497 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-dns-svc\") pod \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\" (UID: \"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235\") " Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.203516 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-kube-api-access-9vjtj" (OuterVolumeSpecName: "kube-api-access-9vjtj") pod "f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235" (UID: "f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235"). InnerVolumeSpecName "kube-api-access-9vjtj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.226803 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235" (UID: "f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.228107 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235" (UID: "f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.228266 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235" (UID: "f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.236158 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235" (UID: "f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.255078 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-config" (OuterVolumeSpecName: "config") pod "f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235" (UID: "f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.302880 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vjtj\" (UniqueName: \"kubernetes.io/projected/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-kube-api-access-9vjtj\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.302909 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.302918 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.302927 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.302936 4843 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.302964 4843 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.353706 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:32:04 crc kubenswrapper[4843]: W0318 12:32:04.426727 4843 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod055dfae7_20bf_447a_8824_ea61a90b10ee.slice/crio-560e46dff000fbf710aeb4cb0a8d763ffd66b7f023d0557957f3cef2f3c2072b WatchSource:0}: Error finding container 560e46dff000fbf710aeb4cb0a8d763ffd66b7f023d0557957f3cef2f3c2072b: Status 404 returned error can't find the container with id 560e46dff000fbf710aeb4cb0a8d763ffd66b7f023d0557957f3cef2f3c2072b Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.669627 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n2lkt" event={"ID":"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4","Type":"ContainerStarted","Data":"d59a8d024fb419c81992a96af82ccabddd6bd91e85d89a50af97553cf048110b"} Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.673910 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"055dfae7-20bf-447a-8824-ea61a90b10ee","Type":"ContainerStarted","Data":"560e46dff000fbf710aeb4cb0a8d763ffd66b7f023d0557957f3cef2f3c2072b"} Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.686330 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pz9j4" event={"ID":"2a1c39df-268b-4d85-a616-32c282f9a19b","Type":"ContainerStarted","Data":"1d6cc21293afda571fac6e7d1069d90b08969cdb4bbbfab720d26c4c01d8eb61"} Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.693489 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-n2lkt" podStartSLOduration=3.69347409 podStartE2EDuration="3.69347409s" podCreationTimestamp="2026-03-18 12:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:04.688252311 +0000 UTC m=+1358.404077835" watchObservedRunningTime="2026-03-18 12:32:04.69347409 +0000 UTC m=+1358.409299614" Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.695025 4843 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e3468d8b-e825-4b34-8096-20ab2ed4cccb","Type":"ContainerStarted","Data":"e357e51490bbaf73770424518e395fcb33fa71a0572740e071442a34f60343ce"} Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.711874 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp" event={"ID":"f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235","Type":"ContainerDied","Data":"772dfdc9fb108591005d425e7d0275e30a30396ee09d04ddda02e381b89441f8"} Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.711923 4843 scope.go:117] "RemoveContainer" containerID="cf2a10a5e65ab23dd7aea799b1ca07305bcfd7dc8329420eef6e17ee69a6b51d" Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.711966 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-gzntp" Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.729929 4843 generic.go:334] "Generic (PLEG): container finished" podID="d70be5ea-b453-4cb8-8228-d46629c4ac42" containerID="8e33d0394cbad0aaf1482b3bb8d8021b8f7bdd40b8156e856dc4587d95d1f08b" exitCode=0 Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.729981 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" event={"ID":"d70be5ea-b453-4cb8-8228-d46629c4ac42","Type":"ContainerDied","Data":"8e33d0394cbad0aaf1482b3bb8d8021b8f7bdd40b8156e856dc4587d95d1f08b"} Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.730114 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" event={"ID":"d70be5ea-b453-4cb8-8228-d46629c4ac42","Type":"ContainerStarted","Data":"d34be7ed83abfd94c7a5b1a83c5aef8662d1fad7bff657c17b085a7ed6e33ecd"} Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.793828 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p7qr8" 
event={"ID":"d3cca079-c7f7-4ec6-a16b-30ba392d0e57","Type":"ContainerStarted","Data":"13bd3fef3bec19cff1dcfd6380c20c71e6ea6cb88b1ca2611be4adc3e5038eb6"} Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.794175 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p7qr8" event={"ID":"d3cca079-c7f7-4ec6-a16b-30ba392d0e57","Type":"ContainerStarted","Data":"3e71ed1214610bba08c17ec88635bd0b6da68cb724be3d85eb59af26992b9122"} Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.822805 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8648504-98a7-406e-9838-c8dc2d64ffe9","Type":"ContainerStarted","Data":"38ecf85f7b8822ac071c770370295fdbc0474891f992705a24fb353be8594af7"} Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.824849 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-gzntp"] Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.833025 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-gzntp"] Mar 18 12:32:04 crc kubenswrapper[4843]: I0318 12:32:04.842011 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-p7qr8" podStartSLOduration=2.841992322 podStartE2EDuration="2.841992322s" podCreationTimestamp="2026-03-18 12:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:04.825638507 +0000 UTC m=+1358.541464031" watchObservedRunningTime="2026-03-18 12:32:04.841992322 +0000 UTC m=+1358.557817846" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.013415 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ec25e55-6573-4241-a6ed-060c1a28a410" path="/var/lib/kubelet/pods/3ec25e55-6573-4241-a6ed-060c1a28a410/volumes" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.014317 4843 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235" path="/var/lib/kubelet/pods/f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235/volumes" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.268049 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563952-gfrqt" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.368048 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.424434 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8d86ff97c-mjh75"] Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.434003 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vfpk\" (UniqueName: \"kubernetes.io/projected/f1fb0226-598c-4497-8d1f-8711a07f13a6-kube-api-access-5vfpk\") pod \"f1fb0226-598c-4497-8d1f-8711a07f13a6\" (UID: \"f1fb0226-598c-4497-8d1f-8711a07f13a6\") " Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.451495 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.458899 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1fb0226-598c-4497-8d1f-8711a07f13a6-kube-api-access-5vfpk" (OuterVolumeSpecName: "kube-api-access-5vfpk") pod "f1fb0226-598c-4497-8d1f-8711a07f13a6" (UID: "f1fb0226-598c-4497-8d1f-8711a07f13a6"). InnerVolumeSpecName "kube-api-access-5vfpk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.475845 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-644695cbfc-54s69"] Mar 18 12:32:05 crc kubenswrapper[4843]: E0318 12:32:05.477033 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1fb0226-598c-4497-8d1f-8711a07f13a6" containerName="oc" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.477057 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1fb0226-598c-4497-8d1f-8711a07f13a6" containerName="oc" Mar 18 12:32:05 crc kubenswrapper[4843]: E0318 12:32:05.477092 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235" containerName="init" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.477102 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235" containerName="init" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.477344 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ebe83b-bdbc-44e0-b1c1-a64c2f1d1235" containerName="init" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.477364 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1fb0226-598c-4497-8d1f-8711a07f13a6" containerName="oc" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.479046 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-644695cbfc-54s69" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.508063 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-644695cbfc-54s69"] Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.537546 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vfpk\" (UniqueName: \"kubernetes.io/projected/f1fb0226-598c-4497-8d1f-8711a07f13a6-kube-api-access-5vfpk\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.541788 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.638726 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9f84b91-a59a-4850-8928-658cee6c95d9-config-data\") pod \"horizon-644695cbfc-54s69\" (UID: \"b9f84b91-a59a-4850-8928-658cee6c95d9\") " pod="openstack/horizon-644695cbfc-54s69" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.638781 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9f84b91-a59a-4850-8928-658cee6c95d9-scripts\") pod \"horizon-644695cbfc-54s69\" (UID: \"b9f84b91-a59a-4850-8928-658cee6c95d9\") " pod="openstack/horizon-644695cbfc-54s69" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.638802 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9f84b91-a59a-4850-8928-658cee6c95d9-horizon-secret-key\") pod \"horizon-644695cbfc-54s69\" (UID: \"b9f84b91-a59a-4850-8928-658cee6c95d9\") " pod="openstack/horizon-644695cbfc-54s69" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.638834 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9f84b91-a59a-4850-8928-658cee6c95d9-logs\") pod \"horizon-644695cbfc-54s69\" (UID: \"b9f84b91-a59a-4850-8928-658cee6c95d9\") " pod="openstack/horizon-644695cbfc-54s69" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.638870 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5lnv\" (UniqueName: \"kubernetes.io/projected/b9f84b91-a59a-4850-8928-658cee6c95d9-kube-api-access-l5lnv\") pod \"horizon-644695cbfc-54s69\" (UID: \"b9f84b91-a59a-4850-8928-658cee6c95d9\") " pod="openstack/horizon-644695cbfc-54s69" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.742750 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9f84b91-a59a-4850-8928-658cee6c95d9-config-data\") pod \"horizon-644695cbfc-54s69\" (UID: \"b9f84b91-a59a-4850-8928-658cee6c95d9\") " pod="openstack/horizon-644695cbfc-54s69" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.743766 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9f84b91-a59a-4850-8928-658cee6c95d9-config-data\") pod \"horizon-644695cbfc-54s69\" (UID: \"b9f84b91-a59a-4850-8928-658cee6c95d9\") " pod="openstack/horizon-644695cbfc-54s69" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.743923 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9f84b91-a59a-4850-8928-658cee6c95d9-scripts\") pod \"horizon-644695cbfc-54s69\" (UID: \"b9f84b91-a59a-4850-8928-658cee6c95d9\") " pod="openstack/horizon-644695cbfc-54s69" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.744041 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b9f84b91-a59a-4850-8928-658cee6c95d9-scripts\") pod \"horizon-644695cbfc-54s69\" (UID: \"b9f84b91-a59a-4850-8928-658cee6c95d9\") " pod="openstack/horizon-644695cbfc-54s69" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.744173 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9f84b91-a59a-4850-8928-658cee6c95d9-horizon-secret-key\") pod \"horizon-644695cbfc-54s69\" (UID: \"b9f84b91-a59a-4850-8928-658cee6c95d9\") " pod="openstack/horizon-644695cbfc-54s69" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.744645 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9f84b91-a59a-4850-8928-658cee6c95d9-logs\") pod \"horizon-644695cbfc-54s69\" (UID: \"b9f84b91-a59a-4850-8928-658cee6c95d9\") " pod="openstack/horizon-644695cbfc-54s69" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.744832 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5lnv\" (UniqueName: \"kubernetes.io/projected/b9f84b91-a59a-4850-8928-658cee6c95d9-kube-api-access-l5lnv\") pod \"horizon-644695cbfc-54s69\" (UID: \"b9f84b91-a59a-4850-8928-658cee6c95d9\") " pod="openstack/horizon-644695cbfc-54s69" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.745018 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9f84b91-a59a-4850-8928-658cee6c95d9-logs\") pod \"horizon-644695cbfc-54s69\" (UID: \"b9f84b91-a59a-4850-8928-658cee6c95d9\") " pod="openstack/horizon-644695cbfc-54s69" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.750098 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9f84b91-a59a-4850-8928-658cee6c95d9-horizon-secret-key\") pod \"horizon-644695cbfc-54s69\" (UID: 
\"b9f84b91-a59a-4850-8928-658cee6c95d9\") " pod="openstack/horizon-644695cbfc-54s69" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.762207 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5lnv\" (UniqueName: \"kubernetes.io/projected/b9f84b91-a59a-4850-8928-658cee6c95d9-kube-api-access-l5lnv\") pod \"horizon-644695cbfc-54s69\" (UID: \"b9f84b91-a59a-4850-8928-658cee6c95d9\") " pod="openstack/horizon-644695cbfc-54s69" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.848783 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-644695cbfc-54s69" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.865673 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563952-gfrqt" event={"ID":"f1fb0226-598c-4497-8d1f-8711a07f13a6","Type":"ContainerDied","Data":"99fee4c5b1731b1975a353811a2b4c8fd83203dcc57526658c45f1b6b4fce789"} Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.865719 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99fee4c5b1731b1975a353811a2b4c8fd83203dcc57526658c45f1b6b4fce789" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.865827 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563952-gfrqt" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.894165 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e3468d8b-e825-4b34-8096-20ab2ed4cccb","Type":"ContainerStarted","Data":"89bd04bc26003a405893212660bbbe31f8bf553f3e5fbf7afdb1be3b4b0813aa"} Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.901736 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" event={"ID":"d70be5ea-b453-4cb8-8228-d46629c4ac42","Type":"ContainerStarted","Data":"0520631e5f41d20b6d8791ff85c855d49835049d6868f59e3f4c669cc7543ccd"} Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.902928 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.916965 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"055dfae7-20bf-447a-8824-ea61a90b10ee","Type":"ContainerStarted","Data":"03162de42dfd503424d31b8ea267a38ff5514238637c5419c795569846155212"} Mar 18 12:32:05 crc kubenswrapper[4843]: I0318 12:32:05.937323 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" podStartSLOduration=3.93729991 podStartE2EDuration="3.93729991s" podCreationTimestamp="2026-03-18 12:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:05.920829521 +0000 UTC m=+1359.636655045" watchObservedRunningTime="2026-03-18 12:32:05.93729991 +0000 UTC m=+1359.653125444" Mar 18 12:32:06 crc kubenswrapper[4843]: I0318 12:32:06.328845 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563946-9cgbr"] Mar 18 12:32:06 crc kubenswrapper[4843]: I0318 
12:32:06.335763 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563946-9cgbr"] Mar 18 12:32:06 crc kubenswrapper[4843]: I0318 12:32:06.425042 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-644695cbfc-54s69"] Mar 18 12:32:06 crc kubenswrapper[4843]: W0318 12:32:06.447962 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9f84b91_a59a_4850_8928_658cee6c95d9.slice/crio-03598ddca039fa0263a84cf73e3875a7137dcf47a1232152045ac3c4fac7ff14 WatchSource:0}: Error finding container 03598ddca039fa0263a84cf73e3875a7137dcf47a1232152045ac3c4fac7ff14: Status 404 returned error can't find the container with id 03598ddca039fa0263a84cf73e3875a7137dcf47a1232152045ac3c4fac7ff14 Mar 18 12:32:06 crc kubenswrapper[4843]: I0318 12:32:06.929611 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"055dfae7-20bf-447a-8824-ea61a90b10ee","Type":"ContainerStarted","Data":"39c5bf6032b417d9d4f64f005e6f9614e08285231098d330fa386ba2f2c7a93b"} Mar 18 12:32:06 crc kubenswrapper[4843]: I0318 12:32:06.930450 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="055dfae7-20bf-447a-8824-ea61a90b10ee" containerName="glance-httpd" containerID="cri-o://39c5bf6032b417d9d4f64f005e6f9614e08285231098d330fa386ba2f2c7a93b" gracePeriod=30 Mar 18 12:32:06 crc kubenswrapper[4843]: I0318 12:32:06.929736 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="055dfae7-20bf-447a-8824-ea61a90b10ee" containerName="glance-log" containerID="cri-o://03162de42dfd503424d31b8ea267a38ff5514238637c5419c795569846155212" gracePeriod=30 Mar 18 12:32:06 crc kubenswrapper[4843]: I0318 12:32:06.934512 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"e3468d8b-e825-4b34-8096-20ab2ed4cccb","Type":"ContainerStarted","Data":"137cf20677da6ead9e8437dec8234c8d4f2177e59b3c861cb90b0956a7b729b5"} Mar 18 12:32:06 crc kubenswrapper[4843]: I0318 12:32:06.934594 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e3468d8b-e825-4b34-8096-20ab2ed4cccb" containerName="glance-log" containerID="cri-o://89bd04bc26003a405893212660bbbe31f8bf553f3e5fbf7afdb1be3b4b0813aa" gracePeriod=30 Mar 18 12:32:06 crc kubenswrapper[4843]: I0318 12:32:06.934742 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e3468d8b-e825-4b34-8096-20ab2ed4cccb" containerName="glance-httpd" containerID="cri-o://137cf20677da6ead9e8437dec8234c8d4f2177e59b3c861cb90b0956a7b729b5" gracePeriod=30 Mar 18 12:32:06 crc kubenswrapper[4843]: I0318 12:32:06.937100 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-644695cbfc-54s69" event={"ID":"b9f84b91-a59a-4850-8928-658cee6c95d9","Type":"ContainerStarted","Data":"03598ddca039fa0263a84cf73e3875a7137dcf47a1232152045ac3c4fac7ff14"} Mar 18 12:32:06 crc kubenswrapper[4843]: I0318 12:32:06.952430 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.952409997 podStartE2EDuration="5.952409997s" podCreationTimestamp="2026-03-18 12:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:06.947208009 +0000 UTC m=+1360.663033533" watchObservedRunningTime="2026-03-18 12:32:06.952409997 +0000 UTC m=+1360.668235521" Mar 18 12:32:06 crc kubenswrapper[4843]: I0318 12:32:06.982710 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" 
podStartSLOduration=4.982689408 podStartE2EDuration="4.982689408s" podCreationTimestamp="2026-03-18 12:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:06.976390159 +0000 UTC m=+1360.692215683" watchObservedRunningTime="2026-03-18 12:32:06.982689408 +0000 UTC m=+1360.698514932" Mar 18 12:32:07 crc kubenswrapper[4843]: I0318 12:32:07.014566 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d61bcdf2-7d5a-4d36-ac50-896209b2d468" path="/var/lib/kubelet/pods/d61bcdf2-7d5a-4d36-ac50-896209b2d468/volumes" Mar 18 12:32:07 crc kubenswrapper[4843]: I0318 12:32:07.949900 4843 generic.go:334] "Generic (PLEG): container finished" podID="055dfae7-20bf-447a-8824-ea61a90b10ee" containerID="39c5bf6032b417d9d4f64f005e6f9614e08285231098d330fa386ba2f2c7a93b" exitCode=0 Mar 18 12:32:07 crc kubenswrapper[4843]: I0318 12:32:07.949935 4843 generic.go:334] "Generic (PLEG): container finished" podID="055dfae7-20bf-447a-8824-ea61a90b10ee" containerID="03162de42dfd503424d31b8ea267a38ff5514238637c5419c795569846155212" exitCode=143 Mar 18 12:32:07 crc kubenswrapper[4843]: I0318 12:32:07.950009 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"055dfae7-20bf-447a-8824-ea61a90b10ee","Type":"ContainerDied","Data":"39c5bf6032b417d9d4f64f005e6f9614e08285231098d330fa386ba2f2c7a93b"} Mar 18 12:32:07 crc kubenswrapper[4843]: I0318 12:32:07.950041 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"055dfae7-20bf-447a-8824-ea61a90b10ee","Type":"ContainerDied","Data":"03162de42dfd503424d31b8ea267a38ff5514238637c5419c795569846155212"} Mar 18 12:32:07 crc kubenswrapper[4843]: I0318 12:32:07.954080 4843 generic.go:334] "Generic (PLEG): container finished" podID="e3468d8b-e825-4b34-8096-20ab2ed4cccb" 
containerID="137cf20677da6ead9e8437dec8234c8d4f2177e59b3c861cb90b0956a7b729b5" exitCode=0 Mar 18 12:32:07 crc kubenswrapper[4843]: I0318 12:32:07.954113 4843 generic.go:334] "Generic (PLEG): container finished" podID="e3468d8b-e825-4b34-8096-20ab2ed4cccb" containerID="89bd04bc26003a405893212660bbbe31f8bf553f3e5fbf7afdb1be3b4b0813aa" exitCode=143 Mar 18 12:32:07 crc kubenswrapper[4843]: I0318 12:32:07.955413 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e3468d8b-e825-4b34-8096-20ab2ed4cccb","Type":"ContainerDied","Data":"137cf20677da6ead9e8437dec8234c8d4f2177e59b3c861cb90b0956a7b729b5"} Mar 18 12:32:07 crc kubenswrapper[4843]: I0318 12:32:07.955445 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e3468d8b-e825-4b34-8096-20ab2ed4cccb","Type":"ContainerDied","Data":"89bd04bc26003a405893212660bbbe31f8bf553f3e5fbf7afdb1be3b4b0813aa"} Mar 18 12:32:08 crc kubenswrapper[4843]: I0318 12:32:08.973857 4843 generic.go:334] "Generic (PLEG): container finished" podID="2e699d0f-06ee-45ca-827a-2a8c41f9b9a4" containerID="d59a8d024fb419c81992a96af82ccabddd6bd91e85d89a50af97553cf048110b" exitCode=0 Mar 18 12:32:08 crc kubenswrapper[4843]: I0318 12:32:08.974012 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n2lkt" event={"ID":"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4","Type":"ContainerDied","Data":"d59a8d024fb419c81992a96af82ccabddd6bd91e85d89a50af97553cf048110b"} Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.665518 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f8f6d7945-z6kt6"] Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.712315 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7dc96fcc9b-lt524"] Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.713830 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.720160 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.753797 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7dc96fcc9b-lt524"] Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.785762 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-644695cbfc-54s69"] Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.814562 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5bf5d4bdcb-8xfkn"] Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.816270 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.830437 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5bf5d4bdcb-8xfkn"] Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.861434 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-horizon-secret-key\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.861499 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-config-data\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.861571 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-combined-ca-bundle\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.861642 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-scripts\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.861681 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-horizon-tls-certs\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.861705 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mckf7\" (UniqueName: \"kubernetes.io/projected/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-kube-api-access-mckf7\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.861728 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-logs\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.963172 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-combined-ca-bundle\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.963542 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-scripts\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.963565 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-horizon-tls-certs\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.963587 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz448\" (UniqueName: \"kubernetes.io/projected/0199f761-6d2f-4921-8060-6960a0141f0a-kube-api-access-bz448\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.963608 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mckf7\" (UniqueName: \"kubernetes.io/projected/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-kube-api-access-mckf7\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.963633 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0199f761-6d2f-4921-8060-6960a0141f0a-combined-ca-bundle\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.963675 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-logs\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.963704 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0199f761-6d2f-4921-8060-6960a0141f0a-horizon-secret-key\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.963736 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0199f761-6d2f-4921-8060-6960a0141f0a-horizon-tls-certs\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.963762 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-horizon-secret-key\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.963784 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/0199f761-6d2f-4921-8060-6960a0141f0a-scripts\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.963810 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-config-data\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.963851 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0199f761-6d2f-4921-8060-6960a0141f0a-config-data\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.963871 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0199f761-6d2f-4921-8060-6960a0141f0a-logs\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.964598 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-logs\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.965002 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-scripts\") pod \"horizon-7dc96fcc9b-lt524\" (UID: 
\"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.966111 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-config-data\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.969167 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-horizon-tls-certs\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.969618 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-horizon-secret-key\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.971024 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-combined-ca-bundle\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:10 crc kubenswrapper[4843]: I0318 12:32:10.985060 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mckf7\" (UniqueName: \"kubernetes.io/projected/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-kube-api-access-mckf7\") pod \"horizon-7dc96fcc9b-lt524\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:11 crc 
kubenswrapper[4843]: I0318 12:32:11.042237 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:32:11 crc kubenswrapper[4843]: I0318 12:32:11.070902 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0199f761-6d2f-4921-8060-6960a0141f0a-horizon-secret-key\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:11 crc kubenswrapper[4843]: I0318 12:32:11.070990 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0199f761-6d2f-4921-8060-6960a0141f0a-horizon-tls-certs\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:11 crc kubenswrapper[4843]: I0318 12:32:11.071062 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0199f761-6d2f-4921-8060-6960a0141f0a-scripts\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:11 crc kubenswrapper[4843]: I0318 12:32:11.071218 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0199f761-6d2f-4921-8060-6960a0141f0a-config-data\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:11 crc kubenswrapper[4843]: I0318 12:32:11.071252 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0199f761-6d2f-4921-8060-6960a0141f0a-logs\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " 
pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:11 crc kubenswrapper[4843]: I0318 12:32:11.075973 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz448\" (UniqueName: \"kubernetes.io/projected/0199f761-6d2f-4921-8060-6960a0141f0a-kube-api-access-bz448\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:11 crc kubenswrapper[4843]: I0318 12:32:11.076062 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0199f761-6d2f-4921-8060-6960a0141f0a-combined-ca-bundle\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:11 crc kubenswrapper[4843]: I0318 12:32:11.077093 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0199f761-6d2f-4921-8060-6960a0141f0a-logs\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:11 crc kubenswrapper[4843]: I0318 12:32:11.079228 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0199f761-6d2f-4921-8060-6960a0141f0a-config-data\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:11 crc kubenswrapper[4843]: I0318 12:32:11.079785 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0199f761-6d2f-4921-8060-6960a0141f0a-scripts\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:11 crc kubenswrapper[4843]: I0318 12:32:11.079897 4843 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0199f761-6d2f-4921-8060-6960a0141f0a-combined-ca-bundle\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:11 crc kubenswrapper[4843]: I0318 12:32:11.095876 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0199f761-6d2f-4921-8060-6960a0141f0a-horizon-tls-certs\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:11 crc kubenswrapper[4843]: I0318 12:32:11.104961 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz448\" (UniqueName: \"kubernetes.io/projected/0199f761-6d2f-4921-8060-6960a0141f0a-kube-api-access-bz448\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:11 crc kubenswrapper[4843]: I0318 12:32:11.116334 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0199f761-6d2f-4921-8060-6960a0141f0a-horizon-secret-key\") pod \"horizon-5bf5d4bdcb-8xfkn\" (UID: \"0199f761-6d2f-4921-8060-6960a0141f0a\") " pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:11 crc kubenswrapper[4843]: I0318 12:32:11.150623 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:32:12 crc kubenswrapper[4843]: I0318 12:32:12.937833 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" Mar 18 12:32:13 crc kubenswrapper[4843]: I0318 12:32:13.027089 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"] Mar 18 12:32:13 crc kubenswrapper[4843]: I0318 12:32:13.027309 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9" podUID="75459f7a-55b8-42da-9072-357f8e2f0065" containerName="dnsmasq-dns" containerID="cri-o://1805d281818e0d92552e777f452a5dba15d3386963556aa5dc9824a3a2fd1dba" gracePeriod=10 Mar 18 12:32:14 crc kubenswrapper[4843]: I0318 12:32:14.047026 4843 generic.go:334] "Generic (PLEG): container finished" podID="75459f7a-55b8-42da-9072-357f8e2f0065" containerID="1805d281818e0d92552e777f452a5dba15d3386963556aa5dc9824a3a2fd1dba" exitCode=0 Mar 18 12:32:14 crc kubenswrapper[4843]: I0318 12:32:14.047077 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9" event={"ID":"75459f7a-55b8-42da-9072-357f8e2f0065","Type":"ContainerDied","Data":"1805d281818e0d92552e777f452a5dba15d3386963556aa5dc9824a3a2fd1dba"} Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.556414 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n2lkt" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.577945 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.590873 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.664582 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/055dfae7-20bf-447a-8824-ea61a90b10ee-httpd-run\") pod \"055dfae7-20bf-447a-8824-ea61a90b10ee\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.664630 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-combined-ca-bundle\") pod \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.664699 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-combined-ca-bundle\") pod \"055dfae7-20bf-447a-8824-ea61a90b10ee\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.664737 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-scripts\") pod \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.664787 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-scripts\") pod \"055dfae7-20bf-447a-8824-ea61a90b10ee\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.664817 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-scripts\") pod \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.664836 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-credential-keys\") pod \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.664864 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-config-data\") pod \"055dfae7-20bf-447a-8824-ea61a90b10ee\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.664898 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/055dfae7-20bf-447a-8824-ea61a90b10ee-logs\") pod \"055dfae7-20bf-447a-8824-ea61a90b10ee\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.664940 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-fernet-keys\") pod \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.664967 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-config-data\") pod \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.665008 4843 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-combined-ca-bundle\") pod \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.665027 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.665061 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"055dfae7-20bf-447a-8824-ea61a90b10ee\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.665092 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhbbf\" (UniqueName: \"kubernetes.io/projected/055dfae7-20bf-447a-8824-ea61a90b10ee-kube-api-access-qhbbf\") pod \"055dfae7-20bf-447a-8824-ea61a90b10ee\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.665119 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-public-tls-certs\") pod \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.665136 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-internal-tls-certs\") pod \"055dfae7-20bf-447a-8824-ea61a90b10ee\" (UID: \"055dfae7-20bf-447a-8824-ea61a90b10ee\") " Mar 18 12:32:15 
crc kubenswrapper[4843]: I0318 12:32:15.665142 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/055dfae7-20bf-447a-8824-ea61a90b10ee-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "055dfae7-20bf-447a-8824-ea61a90b10ee" (UID: "055dfae7-20bf-447a-8824-ea61a90b10ee"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.665156 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t79c2\" (UniqueName: \"kubernetes.io/projected/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-kube-api-access-t79c2\") pod \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.665281 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6z89\" (UniqueName: \"kubernetes.io/projected/e3468d8b-e825-4b34-8096-20ab2ed4cccb-kube-api-access-n6z89\") pod \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.665332 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3468d8b-e825-4b34-8096-20ab2ed4cccb-httpd-run\") pod \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.665367 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-config-data\") pod \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\" (UID: \"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.665400 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/e3468d8b-e825-4b34-8096-20ab2ed4cccb-logs\") pod \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\" (UID: \"e3468d8b-e825-4b34-8096-20ab2ed4cccb\") " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.666553 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/055dfae7-20bf-447a-8824-ea61a90b10ee-logs" (OuterVolumeSpecName: "logs") pod "055dfae7-20bf-447a-8824-ea61a90b10ee" (UID: "055dfae7-20bf-447a-8824-ea61a90b10ee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.668247 4843 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/055dfae7-20bf-447a-8824-ea61a90b10ee-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.668437 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/055dfae7-20bf-447a-8824-ea61a90b10ee-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.673048 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3468d8b-e825-4b34-8096-20ab2ed4cccb-logs" (OuterVolumeSpecName: "logs") pod "e3468d8b-e825-4b34-8096-20ab2ed4cccb" (UID: "e3468d8b-e825-4b34-8096-20ab2ed4cccb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.673581 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3468d8b-e825-4b34-8096-20ab2ed4cccb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e3468d8b-e825-4b34-8096-20ab2ed4cccb" (UID: "e3468d8b-e825-4b34-8096-20ab2ed4cccb"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.681615 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2e699d0f-06ee-45ca-827a-2a8c41f9b9a4" (UID: "2e699d0f-06ee-45ca-827a-2a8c41f9b9a4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.681745 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2e699d0f-06ee-45ca-827a-2a8c41f9b9a4" (UID: "2e699d0f-06ee-45ca-827a-2a8c41f9b9a4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.681875 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/055dfae7-20bf-447a-8824-ea61a90b10ee-kube-api-access-qhbbf" (OuterVolumeSpecName: "kube-api-access-qhbbf") pod "055dfae7-20bf-447a-8824-ea61a90b10ee" (UID: "055dfae7-20bf-447a-8824-ea61a90b10ee"). InnerVolumeSpecName "kube-api-access-qhbbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.681937 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-scripts" (OuterVolumeSpecName: "scripts") pod "2e699d0f-06ee-45ca-827a-2a8c41f9b9a4" (UID: "2e699d0f-06ee-45ca-827a-2a8c41f9b9a4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.682163 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-scripts" (OuterVolumeSpecName: "scripts") pod "e3468d8b-e825-4b34-8096-20ab2ed4cccb" (UID: "e3468d8b-e825-4b34-8096-20ab2ed4cccb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.682838 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "055dfae7-20bf-447a-8824-ea61a90b10ee" (UID: "055dfae7-20bf-447a-8824-ea61a90b10ee"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.700317 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "e3468d8b-e825-4b34-8096-20ab2ed4cccb" (UID: "e3468d8b-e825-4b34-8096-20ab2ed4cccb"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.701842 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-scripts" (OuterVolumeSpecName: "scripts") pod "055dfae7-20bf-447a-8824-ea61a90b10ee" (UID: "055dfae7-20bf-447a-8824-ea61a90b10ee"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.701929 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3468d8b-e825-4b34-8096-20ab2ed4cccb-kube-api-access-n6z89" (OuterVolumeSpecName: "kube-api-access-n6z89") pod "e3468d8b-e825-4b34-8096-20ab2ed4cccb" (UID: "e3468d8b-e825-4b34-8096-20ab2ed4cccb"). InnerVolumeSpecName "kube-api-access-n6z89". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.702060 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-kube-api-access-t79c2" (OuterVolumeSpecName: "kube-api-access-t79c2") pod "2e699d0f-06ee-45ca-827a-2a8c41f9b9a4" (UID: "2e699d0f-06ee-45ca-827a-2a8c41f9b9a4"). InnerVolumeSpecName "kube-api-access-t79c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.712357 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-config-data" (OuterVolumeSpecName: "config-data") pod "2e699d0f-06ee-45ca-827a-2a8c41f9b9a4" (UID: "2e699d0f-06ee-45ca-827a-2a8c41f9b9a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.725040 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3468d8b-e825-4b34-8096-20ab2ed4cccb" (UID: "e3468d8b-e825-4b34-8096-20ab2ed4cccb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.751466 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e3468d8b-e825-4b34-8096-20ab2ed4cccb" (UID: "e3468d8b-e825-4b34-8096-20ab2ed4cccb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.754370 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e699d0f-06ee-45ca-827a-2a8c41f9b9a4" (UID: "2e699d0f-06ee-45ca-827a-2a8c41f9b9a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.756613 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-config-data" (OuterVolumeSpecName: "config-data") pod "e3468d8b-e825-4b34-8096-20ab2ed4cccb" (UID: "e3468d8b-e825-4b34-8096-20ab2ed4cccb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.767814 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "055dfae7-20bf-447a-8824-ea61a90b10ee" (UID: "055dfae7-20bf-447a-8824-ea61a90b10ee"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.769924 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.769953 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.769963 4843 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.769974 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.769981 4843 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.769991 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.769999 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.770026 4843 reconciler_common.go:286] "operationExecutor.UnmountDevice started 
for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.770039 4843 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.770050 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhbbf\" (UniqueName: \"kubernetes.io/projected/055dfae7-20bf-447a-8824-ea61a90b10ee-kube-api-access-qhbbf\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.770059 4843 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3468d8b-e825-4b34-8096-20ab2ed4cccb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.770068 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t79c2\" (UniqueName: \"kubernetes.io/projected/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-kube-api-access-t79c2\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.770076 4843 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.770086 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6z89\" (UniqueName: \"kubernetes.io/projected/e3468d8b-e825-4b34-8096-20ab2ed4cccb-kube-api-access-n6z89\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.770094 4843 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3468d8b-e825-4b34-8096-20ab2ed4cccb-httpd-run\") on node \"crc\" 
DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.770102 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.770110 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3468d8b-e825-4b34-8096-20ab2ed4cccb-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.770117 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.773634 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "055dfae7-20bf-447a-8824-ea61a90b10ee" (UID: "055dfae7-20bf-447a-8824-ea61a90b10ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.780803 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-config-data" (OuterVolumeSpecName: "config-data") pod "055dfae7-20bf-447a-8824-ea61a90b10ee" (UID: "055dfae7-20bf-447a-8824-ea61a90b10ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.789125 4843 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.790339 4843 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.872223 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.872259 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/055dfae7-20bf-447a-8824-ea61a90b10ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.872273 4843 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:15 crc kubenswrapper[4843]: I0318 12:32:15.872282 4843 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.072506 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-n2lkt" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.072514 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n2lkt" event={"ID":"2e699d0f-06ee-45ca-827a-2a8c41f9b9a4","Type":"ContainerDied","Data":"34e796aa1d950eaff6fc0b82f633aa6dbc0d0f5a750d0c011a0da68fbac479d3"} Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.072617 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34e796aa1d950eaff6fc0b82f633aa6dbc0d0f5a750d0c011a0da68fbac479d3" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.077033 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"055dfae7-20bf-447a-8824-ea61a90b10ee","Type":"ContainerDied","Data":"560e46dff000fbf710aeb4cb0a8d763ffd66b7f023d0557957f3cef2f3c2072b"} Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.077086 4843 scope.go:117] "RemoveContainer" containerID="39c5bf6032b417d9d4f64f005e6f9614e08285231098d330fa386ba2f2c7a93b" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.077249 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.086496 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e3468d8b-e825-4b34-8096-20ab2ed4cccb","Type":"ContainerDied","Data":"e357e51490bbaf73770424518e395fcb33fa71a0572740e071442a34f60343ce"} Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.086595 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.139018 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.151525 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.163217 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:32:16 crc kubenswrapper[4843]: E0318 12:32:16.182431 4843 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3468d8b_e825_4b34_8096_20ab2ed4cccb.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e699d0f_06ee_45ca_827a_2a8c41f9b9a4.slice\": RecentStats: unable to find data in memory cache]" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.189606 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.203157 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:32:16 crc kubenswrapper[4843]: E0318 12:32:16.203668 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="055dfae7-20bf-447a-8824-ea61a90b10ee" containerName="glance-log" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.203688 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="055dfae7-20bf-447a-8824-ea61a90b10ee" containerName="glance-log" Mar 18 12:32:16 crc kubenswrapper[4843]: E0318 12:32:16.203700 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3468d8b-e825-4b34-8096-20ab2ed4cccb" containerName="glance-log" Mar 18 12:32:16 
crc kubenswrapper[4843]: I0318 12:32:16.203706 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3468d8b-e825-4b34-8096-20ab2ed4cccb" containerName="glance-log" Mar 18 12:32:16 crc kubenswrapper[4843]: E0318 12:32:16.203733 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="055dfae7-20bf-447a-8824-ea61a90b10ee" containerName="glance-httpd" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.203739 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="055dfae7-20bf-447a-8824-ea61a90b10ee" containerName="glance-httpd" Mar 18 12:32:16 crc kubenswrapper[4843]: E0318 12:32:16.203758 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e699d0f-06ee-45ca-827a-2a8c41f9b9a4" containerName="keystone-bootstrap" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.203764 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e699d0f-06ee-45ca-827a-2a8c41f9b9a4" containerName="keystone-bootstrap" Mar 18 12:32:16 crc kubenswrapper[4843]: E0318 12:32:16.203773 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3468d8b-e825-4b34-8096-20ab2ed4cccb" containerName="glance-httpd" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.203781 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3468d8b-e825-4b34-8096-20ab2ed4cccb" containerName="glance-httpd" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.203949 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="055dfae7-20bf-447a-8824-ea61a90b10ee" containerName="glance-httpd" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.203971 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3468d8b-e825-4b34-8096-20ab2ed4cccb" containerName="glance-log" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.203978 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="055dfae7-20bf-447a-8824-ea61a90b10ee" containerName="glance-log" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 
12:32:16.203987 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3468d8b-e825-4b34-8096-20ab2ed4cccb" containerName="glance-httpd" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.204001 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e699d0f-06ee-45ca-827a-2a8c41f9b9a4" containerName="keystone-bootstrap" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.204924 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.208224 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.208523 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.209687 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.210986 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fshgw" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.214114 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.218508 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.222353 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.222593 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.234593 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.245220 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.277808 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.277848 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ed72136-75da-4a94-a22a-c2511b6bb45a-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.277904 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc 
kubenswrapper[4843]: I0318 12:32:16.277927 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.278117 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d89l5\" (UniqueName: \"kubernetes.io/projected/8ed72136-75da-4a94-a22a-c2511b6bb45a-kube-api-access-d89l5\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.278268 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.278545 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.278668 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ed72136-75da-4a94-a22a-c2511b6bb45a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" 
Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.379947 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.380012 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ed72136-75da-4a94-a22a-c2511b6bb45a-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.380067 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/faa05482-2d1f-43b3-93a2-94178afbc8a7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.380096 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.380119 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.380155 4843 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.380180 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d89l5\" (UniqueName: \"kubernetes.io/projected/8ed72136-75da-4a94-a22a-c2511b6bb45a-kube-api-access-d89l5\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.380210 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwpgt\" (UniqueName: \"kubernetes.io/projected/faa05482-2d1f-43b3-93a2-94178afbc8a7-kube-api-access-qwpgt\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.380249 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.380299 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faa05482-2d1f-43b3-93a2-94178afbc8a7-logs\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.380332 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.380347 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.380387 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.380408 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-config-data\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.380429 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-scripts\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.380457 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ed72136-75da-4a94-a22a-c2511b6bb45a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.380517 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ed72136-75da-4a94-a22a-c2511b6bb45a-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.380788 4843 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.380979 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ed72136-75da-4a94-a22a-c2511b6bb45a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.386030 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.388001 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.388195 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.391250 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.401202 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d89l5\" (UniqueName: \"kubernetes.io/projected/8ed72136-75da-4a94-a22a-c2511b6bb45a-kube-api-access-d89l5\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.410971 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.482817 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/faa05482-2d1f-43b3-93a2-94178afbc8a7-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.482889 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.482928 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwpgt\" (UniqueName: \"kubernetes.io/projected/faa05482-2d1f-43b3-93a2-94178afbc8a7-kube-api-access-qwpgt\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.483010 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faa05482-2d1f-43b3-93a2-94178afbc8a7-logs\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.483054 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.483076 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.483130 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-config-data\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.483161 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-scripts\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.483339 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/faa05482-2d1f-43b3-93a2-94178afbc8a7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.483605 4843 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.483674 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faa05482-2d1f-43b3-93a2-94178afbc8a7-logs\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 
12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.488270 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-config-data\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.490779 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.491107 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-scripts\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.493760 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.501072 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwpgt\" (UniqueName: \"kubernetes.io/projected/faa05482-2d1f-43b3-93a2-94178afbc8a7-kube-api-access-qwpgt\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.512662 4843 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") " pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.523689 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.540282 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.639973 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-n2lkt"] Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.649592 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-n2lkt"] Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.730230 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-9g2xs"] Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.731465 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.734670 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lq56r" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.734690 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.734943 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.735182 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.737025 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.743076 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9g2xs"] Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.791671 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhcnz\" (UniqueName: \"kubernetes.io/projected/d7dbbfca-f6b9-4421-8662-64ce08dade2d-kube-api-access-fhcnz\") pod \"keystone-bootstrap-9g2xs\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.791755 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-config-data\") pod \"keystone-bootstrap-9g2xs\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.791788 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-credential-keys\") pod \"keystone-bootstrap-9g2xs\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.791814 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-fernet-keys\") pod \"keystone-bootstrap-9g2xs\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.791943 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-scripts\") pod \"keystone-bootstrap-9g2xs\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.792258 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-combined-ca-bundle\") pod \"keystone-bootstrap-9g2xs\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.894156 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-combined-ca-bundle\") pod \"keystone-bootstrap-9g2xs\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.894259 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhcnz\" (UniqueName: 
\"kubernetes.io/projected/d7dbbfca-f6b9-4421-8662-64ce08dade2d-kube-api-access-fhcnz\") pod \"keystone-bootstrap-9g2xs\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.894296 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-config-data\") pod \"keystone-bootstrap-9g2xs\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.894325 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-credential-keys\") pod \"keystone-bootstrap-9g2xs\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.894350 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-fernet-keys\") pod \"keystone-bootstrap-9g2xs\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.894375 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-scripts\") pod \"keystone-bootstrap-9g2xs\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.899192 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-combined-ca-bundle\") pod \"keystone-bootstrap-9g2xs\" (UID: 
\"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.902428 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-scripts\") pod \"keystone-bootstrap-9g2xs\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.902626 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-config-data\") pod \"keystone-bootstrap-9g2xs\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.904752 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-credential-keys\") pod \"keystone-bootstrap-9g2xs\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.905318 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-fernet-keys\") pod \"keystone-bootstrap-9g2xs\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:16 crc kubenswrapper[4843]: I0318 12:32:16.912758 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhcnz\" (UniqueName: \"kubernetes.io/projected/d7dbbfca-f6b9-4421-8662-64ce08dade2d-kube-api-access-fhcnz\") pod \"keystone-bootstrap-9g2xs\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:17 crc kubenswrapper[4843]: I0318 12:32:17.000090 4843 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="055dfae7-20bf-447a-8824-ea61a90b10ee" path="/var/lib/kubelet/pods/055dfae7-20bf-447a-8824-ea61a90b10ee/volumes" Mar 18 12:32:17 crc kubenswrapper[4843]: I0318 12:32:17.001022 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e699d0f-06ee-45ca-827a-2a8c41f9b9a4" path="/var/lib/kubelet/pods/2e699d0f-06ee-45ca-827a-2a8c41f9b9a4/volumes" Mar 18 12:32:17 crc kubenswrapper[4843]: I0318 12:32:17.001858 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3468d8b-e825-4b34-8096-20ab2ed4cccb" path="/var/lib/kubelet/pods/e3468d8b-e825-4b34-8096-20ab2ed4cccb/volumes" Mar 18 12:32:17 crc kubenswrapper[4843]: I0318 12:32:17.049588 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:20 crc kubenswrapper[4843]: I0318 12:32:20.740270 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9" podUID="75459f7a-55b8-42da-9072-357f8e2f0065" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Mar 18 12:32:21 crc kubenswrapper[4843]: I0318 12:32:21.130508 4843 generic.go:334] "Generic (PLEG): container finished" podID="d3cca079-c7f7-4ec6-a16b-30ba392d0e57" containerID="13bd3fef3bec19cff1dcfd6380c20c71e6ea6cb88b1ca2611be4adc3e5038eb6" exitCode=0 Mar 18 12:32:21 crc kubenswrapper[4843]: I0318 12:32:21.130554 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p7qr8" event={"ID":"d3cca079-c7f7-4ec6-a16b-30ba392d0e57","Type":"ContainerDied","Data":"13bd3fef3bec19cff1dcfd6380c20c71e6ea6cb88b1ca2611be4adc3e5038eb6"} Mar 18 12:32:25 crc kubenswrapper[4843]: I0318 12:32:25.630751 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7dc96fcc9b-lt524"] Mar 18 12:32:25 crc kubenswrapper[4843]: I0318 12:32:25.741962 4843 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9" podUID="75459f7a-55b8-42da-9072-357f8e2f0065" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Mar 18 12:32:30 crc kubenswrapper[4843]: E0318 12:32:30.290752 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 18 12:32:30 crc kubenswrapper[4843]: E0318 12:32:30.291258 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n658h68fh5b6h5c5h9ch75h5ffh5c5h84h55fh64dh5b7h58ch668h8bh57bh55bh57dh64h646h6bhf6h56bhbbhfh5f7hc8h599h5d8h5fbhffh668q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ls6fv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,Sec
urityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-f8f6d7945-z6kt6_openstack(4f65306a-1421-4fa8-a629-dcaa746da646): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:32:30 crc kubenswrapper[4843]: E0318 12:32:30.293212 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-f8f6d7945-z6kt6" podUID="4f65306a-1421-4fa8-a629-dcaa746da646" Mar 18 12:32:30 crc kubenswrapper[4843]: E0318 12:32:30.304901 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 18 12:32:30 crc kubenswrapper[4843]: E0318 12:32:30.305102 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n56fh66bh657hc5h558h87h67h54ch558h6ch667h66fh697h5b9h569h657h64ch57h597h69h66bh687h66dh85h544hb4h58fh548h588h5c8h5f7h54bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l5lnv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-644695cbfc-54s69_openstack(b9f84b91-a59a-4850-8928-658cee6c95d9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:32:30 crc kubenswrapper[4843]: E0318 
12:32:30.307724 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-644695cbfc-54s69" podUID="b9f84b91-a59a-4850-8928-658cee6c95d9" Mar 18 12:32:30 crc kubenswrapper[4843]: E0318 12:32:30.316119 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 18 12:32:30 crc kubenswrapper[4843]: E0318 12:32:30.316329 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndh58h69h65ch78hb8h5f6h6fh654h5cfh5f8h9dh585h55dh675h99h65h5b7h9fh96h587h5chdbh54dh5dbh658h598hdh55hc4h87h74q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s42ct,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-8d86ff97c-mjh75_openstack(e4b581ed-0818-4c7e-be7e-4c2121d784e5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:32:30 crc kubenswrapper[4843]: E0318 12:32:30.318965 
4843 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-8d86ff97c-mjh75" podUID="e4b581ed-0818-4c7e-be7e-4c2121d784e5" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.430494 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.435746 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p7qr8" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.711926 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3cca079-c7f7-4ec6-a16b-30ba392d0e57-config\") pod \"d3cca079-c7f7-4ec6-a16b-30ba392d0e57\" (UID: \"d3cca079-c7f7-4ec6-a16b-30ba392d0e57\") " Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.712030 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-ovsdbserver-sb\") pod \"75459f7a-55b8-42da-9072-357f8e2f0065\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.712053 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3cca079-c7f7-4ec6-a16b-30ba392d0e57-combined-ca-bundle\") pod \"d3cca079-c7f7-4ec6-a16b-30ba392d0e57\" (UID: \"d3cca079-c7f7-4ec6-a16b-30ba392d0e57\") " Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.712073 4843 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-ovsdbserver-nb\") pod \"75459f7a-55b8-42da-9072-357f8e2f0065\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.712094 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-dns-swift-storage-0\") pod \"75459f7a-55b8-42da-9072-357f8e2f0065\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.712125 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-dns-svc\") pod \"75459f7a-55b8-42da-9072-357f8e2f0065\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.712141 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfnxb\" (UniqueName: \"kubernetes.io/projected/75459f7a-55b8-42da-9072-357f8e2f0065-kube-api-access-qfnxb\") pod \"75459f7a-55b8-42da-9072-357f8e2f0065\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.712160 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-config\") pod \"75459f7a-55b8-42da-9072-357f8e2f0065\" (UID: \"75459f7a-55b8-42da-9072-357f8e2f0065\") " Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.712207 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhjnw\" (UniqueName: \"kubernetes.io/projected/d3cca079-c7f7-4ec6-a16b-30ba392d0e57-kube-api-access-xhjnw\") pod \"d3cca079-c7f7-4ec6-a16b-30ba392d0e57\" (UID: 
\"d3cca079-c7f7-4ec6-a16b-30ba392d0e57\") " Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.722069 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3cca079-c7f7-4ec6-a16b-30ba392d0e57-kube-api-access-xhjnw" (OuterVolumeSpecName: "kube-api-access-xhjnw") pod "d3cca079-c7f7-4ec6-a16b-30ba392d0e57" (UID: "d3cca079-c7f7-4ec6-a16b-30ba392d0e57"). InnerVolumeSpecName "kube-api-access-xhjnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.722465 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75459f7a-55b8-42da-9072-357f8e2f0065-kube-api-access-qfnxb" (OuterVolumeSpecName: "kube-api-access-qfnxb") pod "75459f7a-55b8-42da-9072-357f8e2f0065" (UID: "75459f7a-55b8-42da-9072-357f8e2f0065"). InnerVolumeSpecName "kube-api-access-qfnxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.738483 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3cca079-c7f7-4ec6-a16b-30ba392d0e57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3cca079-c7f7-4ec6-a16b-30ba392d0e57" (UID: "d3cca079-c7f7-4ec6-a16b-30ba392d0e57"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.742771 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9" podUID="75459f7a-55b8-42da-9072-357f8e2f0065" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.743256 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.755165 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3cca079-c7f7-4ec6-a16b-30ba392d0e57-config" (OuterVolumeSpecName: "config") pod "d3cca079-c7f7-4ec6-a16b-30ba392d0e57" (UID: "d3cca079-c7f7-4ec6-a16b-30ba392d0e57"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.765839 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-config" (OuterVolumeSpecName: "config") pod "75459f7a-55b8-42da-9072-357f8e2f0065" (UID: "75459f7a-55b8-42da-9072-357f8e2f0065"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.768100 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "75459f7a-55b8-42da-9072-357f8e2f0065" (UID: "75459f7a-55b8-42da-9072-357f8e2f0065"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.777675 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "75459f7a-55b8-42da-9072-357f8e2f0065" (UID: "75459f7a-55b8-42da-9072-357f8e2f0065"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.778771 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75459f7a-55b8-42da-9072-357f8e2f0065" (UID: "75459f7a-55b8-42da-9072-357f8e2f0065"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.784726 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "75459f7a-55b8-42da-9072-357f8e2f0065" (UID: "75459f7a-55b8-42da-9072-357f8e2f0065"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.813850 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhjnw\" (UniqueName: \"kubernetes.io/projected/d3cca079-c7f7-4ec6-a16b-30ba392d0e57-kube-api-access-xhjnw\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.813884 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3cca079-c7f7-4ec6-a16b-30ba392d0e57-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.813898 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.813907 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3cca079-c7f7-4ec6-a16b-30ba392d0e57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.813915 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.813923 4843 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.813931 4843 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.813940 4843 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfnxb\" (UniqueName: \"kubernetes.io/projected/75459f7a-55b8-42da-9072-357f8e2f0065-kube-api-access-qfnxb\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:30 crc kubenswrapper[4843]: I0318 12:32:30.813948 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75459f7a-55b8-42da-9072-357f8e2f0065-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.244819 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p7qr8" event={"ID":"d3cca079-c7f7-4ec6-a16b-30ba392d0e57","Type":"ContainerDied","Data":"3e71ed1214610bba08c17ec88635bd0b6da68cb724be3d85eb59af26992b9122"} Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.245171 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e71ed1214610bba08c17ec88635bd0b6da68cb724be3d85eb59af26992b9122" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.245096 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p7qr8" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.249981 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.250133 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-qkkd9" event={"ID":"75459f7a-55b8-42da-9072-357f8e2f0065","Type":"ContainerDied","Data":"83b63442094e3880a0863f15180d7301ce1aed9a8dbb19e57fdafd1958c8801c"} Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.371093 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"] Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.385016 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-qkkd9"] Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.735367 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-vcl96"] Mar 18 12:32:31 crc kubenswrapper[4843]: E0318 12:32:31.736010 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75459f7a-55b8-42da-9072-357f8e2f0065" containerName="init" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.736027 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="75459f7a-55b8-42da-9072-357f8e2f0065" containerName="init" Mar 18 12:32:31 crc kubenswrapper[4843]: E0318 12:32:31.736062 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75459f7a-55b8-42da-9072-357f8e2f0065" containerName="dnsmasq-dns" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.736069 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="75459f7a-55b8-42da-9072-357f8e2f0065" containerName="dnsmasq-dns" Mar 18 12:32:31 crc kubenswrapper[4843]: E0318 12:32:31.736080 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3cca079-c7f7-4ec6-a16b-30ba392d0e57" containerName="neutron-db-sync" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.736087 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3cca079-c7f7-4ec6-a16b-30ba392d0e57" 
containerName="neutron-db-sync" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.736258 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="75459f7a-55b8-42da-9072-357f8e2f0065" containerName="dnsmasq-dns" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.736280 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3cca079-c7f7-4ec6-a16b-30ba392d0e57" containerName="neutron-db-sync" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.737362 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.759712 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-vcl96"] Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.861430 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-55f88f498d-gppxz"] Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.863034 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55f88f498d-gppxz" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.866963 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.867259 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-cbstj" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.867336 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.867931 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.873267 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-vcl96\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.873393 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-vcl96\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.873453 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4mzv\" (UniqueName: \"kubernetes.io/projected/e264861b-2898-4550-845c-7842781c5650-kube-api-access-h4mzv\") pod \"dnsmasq-dns-5ccc5c4795-vcl96\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:31 crc 
kubenswrapper[4843]: I0318 12:32:31.873483 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-config\") pod \"dnsmasq-dns-5ccc5c4795-vcl96\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.873518 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-vcl96\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.873623 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-vcl96\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.880915 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55f88f498d-gppxz"] Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.975433 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-vcl96\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.975483 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-combined-ca-bundle\") pod \"neutron-55f88f498d-gppxz\" (UID: \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") " pod="openstack/neutron-55f88f498d-gppxz" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.975533 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sts2\" (UniqueName: \"kubernetes.io/projected/653a3476-1fa6-4afc-a036-df13d5a0c6e6-kube-api-access-4sts2\") pod \"neutron-55f88f498d-gppxz\" (UID: \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") " pod="openstack/neutron-55f88f498d-gppxz" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.975563 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-vcl96\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.975581 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-httpd-config\") pod \"neutron-55f88f498d-gppxz\" (UID: \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") " pod="openstack/neutron-55f88f498d-gppxz" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.975624 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-config\") pod \"neutron-55f88f498d-gppxz\" (UID: \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") " pod="openstack/neutron-55f88f498d-gppxz" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.975768 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-vcl96\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.975810 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4mzv\" (UniqueName: \"kubernetes.io/projected/e264861b-2898-4550-845c-7842781c5650-kube-api-access-h4mzv\") pod \"dnsmasq-dns-5ccc5c4795-vcl96\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.975829 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-config\") pod \"dnsmasq-dns-5ccc5c4795-vcl96\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.975854 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-vcl96\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.975877 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-ovndb-tls-certs\") pod \"neutron-55f88f498d-gppxz\" (UID: \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") " pod="openstack/neutron-55f88f498d-gppxz" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.976277 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-vcl96\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.976283 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-vcl96\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.976629 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-config\") pod \"dnsmasq-dns-5ccc5c4795-vcl96\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.976876 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-vcl96\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:31 crc kubenswrapper[4843]: I0318 12:32:31.977968 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-vcl96\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.018072 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4mzv\" (UniqueName: \"kubernetes.io/projected/e264861b-2898-4550-845c-7842781c5650-kube-api-access-h4mzv\") pod 
\"dnsmasq-dns-5ccc5c4795-vcl96\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") " pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.066434 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.077030 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sts2\" (UniqueName: \"kubernetes.io/projected/653a3476-1fa6-4afc-a036-df13d5a0c6e6-kube-api-access-4sts2\") pod \"neutron-55f88f498d-gppxz\" (UID: \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") " pod="openstack/neutron-55f88f498d-gppxz" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.077114 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-httpd-config\") pod \"neutron-55f88f498d-gppxz\" (UID: \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") " pod="openstack/neutron-55f88f498d-gppxz" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.077150 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-config\") pod \"neutron-55f88f498d-gppxz\" (UID: \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") " pod="openstack/neutron-55f88f498d-gppxz" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.077275 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-ovndb-tls-certs\") pod \"neutron-55f88f498d-gppxz\" (UID: \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") " pod="openstack/neutron-55f88f498d-gppxz" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.077359 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-combined-ca-bundle\") pod \"neutron-55f88f498d-gppxz\" (UID: \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") " pod="openstack/neutron-55f88f498d-gppxz" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.081496 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-ovndb-tls-certs\") pod \"neutron-55f88f498d-gppxz\" (UID: \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") " pod="openstack/neutron-55f88f498d-gppxz" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.081638 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-config\") pod \"neutron-55f88f498d-gppxz\" (UID: \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") " pod="openstack/neutron-55f88f498d-gppxz" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.082001 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-httpd-config\") pod \"neutron-55f88f498d-gppxz\" (UID: \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") " pod="openstack/neutron-55f88f498d-gppxz" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.082747 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-combined-ca-bundle\") pod \"neutron-55f88f498d-gppxz\" (UID: \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") " pod="openstack/neutron-55f88f498d-gppxz" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.093266 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sts2\" (UniqueName: \"kubernetes.io/projected/653a3476-1fa6-4afc-a036-df13d5a0c6e6-kube-api-access-4sts2\") pod \"neutron-55f88f498d-gppxz\" (UID: 
\"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") " pod="openstack/neutron-55f88f498d-gppxz" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.190889 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-55f88f498d-gppxz" Mar 18 12:32:32 crc kubenswrapper[4843]: E0318 12:32:32.218821 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 18 12:32:32 crc kubenswrapper[4843]: E0318 12:32:32.219229 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath
:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9d7rs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-9tcmj_openstack(16d32f17-ac20-4f6d-8e00-db5fdafdc210): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:32:32 crc kubenswrapper[4843]: E0318 12:32:32.220426 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-9tcmj" podUID="16d32f17-ac20-4f6d-8e00-db5fdafdc210" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.222947 4843 scope.go:117] "RemoveContainer" containerID="03162de42dfd503424d31b8ea267a38ff5514238637c5419c795569846155212" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.253093 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f8f6d7945-z6kt6" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.265682 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f8f6d7945-z6kt6" event={"ID":"4f65306a-1421-4fa8-a629-dcaa746da646","Type":"ContainerDied","Data":"0c861735d78b6cbc921c8baa3d67d82b53d53c3dbc3f9fd00c051e289d2d5b1a"} Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.266079 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f8f6d7945-z6kt6" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.271147 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dc96fcc9b-lt524" event={"ID":"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434","Type":"ContainerStarted","Data":"c6e05db2a8d042c71ae1d32e7d0214f89e5655e3fb6937991c21daa07e7c0fdc"} Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.271802 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8d86ff97c-mjh75" Mar 18 12:32:32 crc kubenswrapper[4843]: E0318 12:32:32.278031 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-9tcmj" podUID="16d32f17-ac20-4f6d-8e00-db5fdafdc210" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.285566 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-644695cbfc-54s69" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.341211 4843 scope.go:117] "RemoveContainer" containerID="137cf20677da6ead9e8437dec8234c8d4f2177e59b3c861cb90b0956a7b729b5" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.384101 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f65306a-1421-4fa8-a629-dcaa746da646-scripts\") pod \"4f65306a-1421-4fa8-a629-dcaa746da646\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.384154 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls6fv\" (UniqueName: \"kubernetes.io/projected/4f65306a-1421-4fa8-a629-dcaa746da646-kube-api-access-ls6fv\") pod \"4f65306a-1421-4fa8-a629-dcaa746da646\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.384187 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4b581ed-0818-4c7e-be7e-4c2121d784e5-config-data\") pod \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.384260 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e4b581ed-0818-4c7e-be7e-4c2121d784e5-horizon-secret-key\") pod \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.384283 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f65306a-1421-4fa8-a629-dcaa746da646-logs\") pod \"4f65306a-1421-4fa8-a629-dcaa746da646\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " Mar 
18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.384339 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f65306a-1421-4fa8-a629-dcaa746da646-config-data\") pod \"4f65306a-1421-4fa8-a629-dcaa746da646\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.384366 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4b581ed-0818-4c7e-be7e-4c2121d784e5-logs\") pod \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.384428 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s42ct\" (UniqueName: \"kubernetes.io/projected/e4b581ed-0818-4c7e-be7e-4c2121d784e5-kube-api-access-s42ct\") pod \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.384456 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4b581ed-0818-4c7e-be7e-4c2121d784e5-scripts\") pod \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\" (UID: \"e4b581ed-0818-4c7e-be7e-4c2121d784e5\") " Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.384524 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4f65306a-1421-4fa8-a629-dcaa746da646-horizon-secret-key\") pod \"4f65306a-1421-4fa8-a629-dcaa746da646\" (UID: \"4f65306a-1421-4fa8-a629-dcaa746da646\") " Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.385037 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f65306a-1421-4fa8-a629-dcaa746da646-config-data" (OuterVolumeSpecName: "config-data") 
pod "4f65306a-1421-4fa8-a629-dcaa746da646" (UID: "4f65306a-1421-4fa8-a629-dcaa746da646"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.385045 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4b581ed-0818-4c7e-be7e-4c2121d784e5-config-data" (OuterVolumeSpecName: "config-data") pod "e4b581ed-0818-4c7e-be7e-4c2121d784e5" (UID: "e4b581ed-0818-4c7e-be7e-4c2121d784e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.385087 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f65306a-1421-4fa8-a629-dcaa746da646-scripts" (OuterVolumeSpecName: "scripts") pod "4f65306a-1421-4fa8-a629-dcaa746da646" (UID: "4f65306a-1421-4fa8-a629-dcaa746da646"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.385405 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f65306a-1421-4fa8-a629-dcaa746da646-logs" (OuterVolumeSpecName: "logs") pod "4f65306a-1421-4fa8-a629-dcaa746da646" (UID: "4f65306a-1421-4fa8-a629-dcaa746da646"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.387128 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4b581ed-0818-4c7e-be7e-4c2121d784e5-scripts" (OuterVolumeSpecName: "scripts") pod "e4b581ed-0818-4c7e-be7e-4c2121d784e5" (UID: "e4b581ed-0818-4c7e-be7e-4c2121d784e5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.387803 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4b581ed-0818-4c7e-be7e-4c2121d784e5-logs" (OuterVolumeSpecName: "logs") pod "e4b581ed-0818-4c7e-be7e-4c2121d784e5" (UID: "e4b581ed-0818-4c7e-be7e-4c2121d784e5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.388674 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f65306a-1421-4fa8-a629-dcaa746da646-kube-api-access-ls6fv" (OuterVolumeSpecName: "kube-api-access-ls6fv") pod "4f65306a-1421-4fa8-a629-dcaa746da646" (UID: "4f65306a-1421-4fa8-a629-dcaa746da646"). InnerVolumeSpecName "kube-api-access-ls6fv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.396101 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b581ed-0818-4c7e-be7e-4c2121d784e5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e4b581ed-0818-4c7e-be7e-4c2121d784e5" (UID: "e4b581ed-0818-4c7e-be7e-4c2121d784e5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.396319 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f65306a-1421-4fa8-a629-dcaa746da646-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4f65306a-1421-4fa8-a629-dcaa746da646" (UID: "4f65306a-1421-4fa8-a629-dcaa746da646"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.396853 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4b581ed-0818-4c7e-be7e-4c2121d784e5-kube-api-access-s42ct" (OuterVolumeSpecName: "kube-api-access-s42ct") pod "e4b581ed-0818-4c7e-be7e-4c2121d784e5" (UID: "e4b581ed-0818-4c7e-be7e-4c2121d784e5"). InnerVolumeSpecName "kube-api-access-s42ct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.487344 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9f84b91-a59a-4850-8928-658cee6c95d9-logs\") pod \"b9f84b91-a59a-4850-8928-658cee6c95d9\" (UID: \"b9f84b91-a59a-4850-8928-658cee6c95d9\") " Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.487459 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9f84b91-a59a-4850-8928-658cee6c95d9-horizon-secret-key\") pod \"b9f84b91-a59a-4850-8928-658cee6c95d9\" (UID: \"b9f84b91-a59a-4850-8928-658cee6c95d9\") " Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.487518 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9f84b91-a59a-4850-8928-658cee6c95d9-config-data\") pod \"b9f84b91-a59a-4850-8928-658cee6c95d9\" (UID: \"b9f84b91-a59a-4850-8928-658cee6c95d9\") " Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.487544 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9f84b91-a59a-4850-8928-658cee6c95d9-scripts\") pod \"b9f84b91-a59a-4850-8928-658cee6c95d9\" (UID: \"b9f84b91-a59a-4850-8928-658cee6c95d9\") " Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.487610 4843 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-l5lnv\" (UniqueName: \"kubernetes.io/projected/b9f84b91-a59a-4850-8928-658cee6c95d9-kube-api-access-l5lnv\") pod \"b9f84b91-a59a-4850-8928-658cee6c95d9\" (UID: \"b9f84b91-a59a-4850-8928-658cee6c95d9\") " Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.488002 4843 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e4b581ed-0818-4c7e-be7e-4c2121d784e5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.488015 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f65306a-1421-4fa8-a629-dcaa746da646-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.488024 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f65306a-1421-4fa8-a629-dcaa746da646-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.488032 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4b581ed-0818-4c7e-be7e-4c2121d784e5-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.488040 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s42ct\" (UniqueName: \"kubernetes.io/projected/e4b581ed-0818-4c7e-be7e-4c2121d784e5-kube-api-access-s42ct\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.488051 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4b581ed-0818-4c7e-be7e-4c2121d784e5-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.488059 4843 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/4f65306a-1421-4fa8-a629-dcaa746da646-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.488069 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f65306a-1421-4fa8-a629-dcaa746da646-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.488078 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4b581ed-0818-4c7e-be7e-4c2121d784e5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.488086 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls6fv\" (UniqueName: \"kubernetes.io/projected/4f65306a-1421-4fa8-a629-dcaa746da646-kube-api-access-ls6fv\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.490643 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9f84b91-a59a-4850-8928-658cee6c95d9-logs" (OuterVolumeSpecName: "logs") pod "b9f84b91-a59a-4850-8928-658cee6c95d9" (UID: "b9f84b91-a59a-4850-8928-658cee6c95d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.491878 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9f84b91-a59a-4850-8928-658cee6c95d9-scripts" (OuterVolumeSpecName: "scripts") pod "b9f84b91-a59a-4850-8928-658cee6c95d9" (UID: "b9f84b91-a59a-4850-8928-658cee6c95d9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.492112 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9f84b91-a59a-4850-8928-658cee6c95d9-config-data" (OuterVolumeSpecName: "config-data") pod "b9f84b91-a59a-4850-8928-658cee6c95d9" (UID: "b9f84b91-a59a-4850-8928-658cee6c95d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.495912 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9f84b91-a59a-4850-8928-658cee6c95d9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b9f84b91-a59a-4850-8928-658cee6c95d9" (UID: "b9f84b91-a59a-4850-8928-658cee6c95d9"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.500981 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9f84b91-a59a-4850-8928-658cee6c95d9-kube-api-access-l5lnv" (OuterVolumeSpecName: "kube-api-access-l5lnv") pod "b9f84b91-a59a-4850-8928-658cee6c95d9" (UID: "b9f84b91-a59a-4850-8928-658cee6c95d9"). InnerVolumeSpecName "kube-api-access-l5lnv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.552997 4843 scope.go:117] "RemoveContainer" containerID="89bd04bc26003a405893212660bbbe31f8bf553f3e5fbf7afdb1be3b4b0813aa" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.590030 4843 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9f84b91-a59a-4850-8928-658cee6c95d9-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.590060 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9f84b91-a59a-4850-8928-658cee6c95d9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.590069 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9f84b91-a59a-4850-8928-658cee6c95d9-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.590079 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5lnv\" (UniqueName: \"kubernetes.io/projected/b9f84b91-a59a-4850-8928-658cee6c95d9-kube-api-access-l5lnv\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.590090 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9f84b91-a59a-4850-8928-658cee6c95d9-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.630784 4843 scope.go:117] "RemoveContainer" containerID="1805d281818e0d92552e777f452a5dba15d3386963556aa5dc9824a3a2fd1dba" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.677683 4843 scope.go:117] "RemoveContainer" containerID="a2e31fb0ca2c937825765d3a994057c0bb25143c5ab0a5250096f5d6d5939d84" Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.700714 4843 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/horizon-f8f6d7945-z6kt6"] Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.716012 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-f8f6d7945-z6kt6"] Mar 18 12:32:32 crc kubenswrapper[4843]: I0318 12:32:32.809543 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5bf5d4bdcb-8xfkn"] Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.123003 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f65306a-1421-4fa8-a629-dcaa746da646" path="/var/lib/kubelet/pods/4f65306a-1421-4fa8-a629-dcaa746da646/volumes" Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.123403 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75459f7a-55b8-42da-9072-357f8e2f0065" path="/var/lib/kubelet/pods/75459f7a-55b8-42da-9072-357f8e2f0065/volumes" Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.179447 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-vcl96"] Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.203075 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9g2xs"] Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.227008 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.288820 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bf5d4bdcb-8xfkn" event={"ID":"0199f761-6d2f-4921-8060-6960a0141f0a","Type":"ContainerStarted","Data":"491a3807b317d427093cbd28900b60d5d15dace93b02addeebe3b7962e720852"} Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.292196 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9g2xs" event={"ID":"d7dbbfca-f6b9-4421-8662-64ce08dade2d","Type":"ContainerStarted","Data":"fe8b69ad8851c28553b2c145704514c3a2abefd5bcde637176617463e3bde5ae"} Mar 
18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.302870 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-sq4hg" event={"ID":"9b82175f-cf5a-4d25-81c2-2c70df039edd","Type":"ContainerStarted","Data":"6f282cc8f8ddfdc08e0adeca5d7a07024ca87f9c22fcd319b45a763036b3bf59"} Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.304787 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.334782 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8648504-98a7-406e-9838-c8dc2d64ffe9","Type":"ContainerStarted","Data":"7c835082341f7ff229787df7ee9fb56ae8248570b01f14e98bd2a1ebbbfb1329"} Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.336695 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-644695cbfc-54s69" event={"ID":"b9f84b91-a59a-4850-8928-658cee6c95d9","Type":"ContainerDied","Data":"03598ddca039fa0263a84cf73e3875a7137dcf47a1232152045ac3c4fac7ff14"} Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.336890 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-644695cbfc-54s69" Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.339644 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-sq4hg" podStartSLOduration=2.640293208 podStartE2EDuration="31.339626347s" podCreationTimestamp="2026-03-18 12:32:02 +0000 UTC" firstStartedPulling="2026-03-18 12:32:03.560210452 +0000 UTC m=+1357.276035976" lastFinishedPulling="2026-03-18 12:32:32.259543591 +0000 UTC m=+1385.975369115" observedRunningTime="2026-03-18 12:32:33.329182641 +0000 UTC m=+1387.045008165" watchObservedRunningTime="2026-03-18 12:32:33.339626347 +0000 UTC m=+1387.055451871" Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.364209 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8d86ff97c-mjh75" event={"ID":"e4b581ed-0818-4c7e-be7e-4c2121d784e5","Type":"ContainerDied","Data":"f30c56c2548ad48deedccb9419ff2f4f8c5b12ed96dde270c71c03c467b6fd9e"} Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.364285 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8d86ff97c-mjh75" Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.365915 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pz9j4" event={"ID":"2a1c39df-268b-4d85-a616-32c282f9a19b","Type":"ContainerStarted","Data":"fa1fc78dced51574579618c617598c724dfdbf2b8c155f325367a1b88c042ca9"} Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.368043 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" event={"ID":"e264861b-2898-4550-845c-7842781c5650","Type":"ContainerStarted","Data":"0efaa6a5e7b26cd8fc903b28750bfc562cb5b903e2da536d7c500a242dd16c5d"} Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.390261 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-pz9j4" podStartSLOduration=2.92593594 podStartE2EDuration="31.390233675s" podCreationTimestamp="2026-03-18 12:32:02 +0000 UTC" firstStartedPulling="2026-03-18 12:32:03.789830562 +0000 UTC m=+1357.505656086" lastFinishedPulling="2026-03-18 12:32:32.254128297 +0000 UTC m=+1385.969953821" observedRunningTime="2026-03-18 12:32:33.38407119 +0000 UTC m=+1387.099896714" watchObservedRunningTime="2026-03-18 12:32:33.390233675 +0000 UTC m=+1387.106059199" Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.507719 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8d86ff97c-mjh75"] Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.532349 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8d86ff97c-mjh75"] Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.574727 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-644695cbfc-54s69"] Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 12:32:33.596233 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-644695cbfc-54s69"] Mar 18 12:32:33 crc kubenswrapper[4843]: I0318 
12:32:33.915293 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55f88f498d-gppxz"] Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.016579 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:32:34 crc kubenswrapper[4843]: W0318 12:32:34.054404 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ed72136_75da_4a94_a22a_c2511b6bb45a.slice/crio-1c50a1f6a680a7d49e079147dd8158d24252e7555ca1e7a8c2cb46214b52a5ba WatchSource:0}: Error finding container 1c50a1f6a680a7d49e079147dd8158d24252e7555ca1e7a8c2cb46214b52a5ba: Status 404 returned error can't find the container with id 1c50a1f6a680a7d49e079147dd8158d24252e7555ca1e7a8c2cb46214b52a5ba Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.386642 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dc96fcc9b-lt524" event={"ID":"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434","Type":"ContainerStarted","Data":"3556ca049e4c4ff3eb611a388edd9d3f28839b1285421ed541909b3b6403983e"} Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.391247 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"faa05482-2d1f-43b3-93a2-94178afbc8a7","Type":"ContainerStarted","Data":"e5120395adcb25710ae79cf4e7aca3389748547e1ac64b3de7f3cc8c340582b9"} Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.391317 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"faa05482-2d1f-43b3-93a2-94178afbc8a7","Type":"ContainerStarted","Data":"f5741fd3765fa23a1bfe77a07da1e68a853d48f3e3dd17ff299d2f253843414c"} Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.399539 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9g2xs" 
event={"ID":"d7dbbfca-f6b9-4421-8662-64ce08dade2d","Type":"ContainerStarted","Data":"82935de4b27ce7ee9cad5df7a2045d570aeb4185bfe355cadfae7b2fc128ace5"} Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.407632 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bf5d4bdcb-8xfkn" event={"ID":"0199f761-6d2f-4921-8060-6960a0141f0a","Type":"ContainerStarted","Data":"f9f1295ac45323cf81504cdcaea9dc671f56c2f8da8dc86ff0e3a7f9beda0bc2"} Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.411392 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55f88f498d-gppxz" event={"ID":"653a3476-1fa6-4afc-a036-df13d5a0c6e6","Type":"ContainerStarted","Data":"7029e4a052a4a9f682b33d22bad1caba1c43db169686b07d67daca065fe36456"} Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.413722 4843 generic.go:334] "Generic (PLEG): container finished" podID="e264861b-2898-4550-845c-7842781c5650" containerID="039d224f98cc43e31f2cc72d1f8c1a7a07a7c2dbe2db58fb0f694c0ead940b81" exitCode=0 Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.413772 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" event={"ID":"e264861b-2898-4550-845c-7842781c5650","Type":"ContainerDied","Data":"039d224f98cc43e31f2cc72d1f8c1a7a07a7c2dbe2db58fb0f694c0ead940b81"} Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.419581 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ed72136-75da-4a94-a22a-c2511b6bb45a","Type":"ContainerStarted","Data":"1c50a1f6a680a7d49e079147dd8158d24252e7555ca1e7a8c2cb46214b52a5ba"} Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.425865 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-9g2xs" podStartSLOduration=18.425845977 podStartE2EDuration="18.425845977s" podCreationTimestamp="2026-03-18 12:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:34.417807659 +0000 UTC m=+1388.133633203" watchObservedRunningTime="2026-03-18 12:32:34.425845977 +0000 UTC m=+1388.141671491" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.516041 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-74ccddfbbf-d44d7"] Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.526274 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.529336 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.532993 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.538026 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-74ccddfbbf-d44d7"] Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.671059 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-combined-ca-bundle\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.671239 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-internal-tls-certs\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.671394 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-ovndb-tls-certs\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.671423 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-config\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.671454 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-public-tls-certs\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.671625 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gfds\" (UniqueName: \"kubernetes.io/projected/67c6737f-51b5-4262-9bd4-fe8afcb262ad-kube-api-access-2gfds\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.671843 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-httpd-config\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.773483 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-public-tls-certs\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.773546 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gfds\" (UniqueName: \"kubernetes.io/projected/67c6737f-51b5-4262-9bd4-fe8afcb262ad-kube-api-access-2gfds\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.773567 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-httpd-config\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.773620 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-combined-ca-bundle\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.773657 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-internal-tls-certs\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.773730 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-ovndb-tls-certs\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.773752 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-config\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.781310 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-internal-tls-certs\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.781340 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-ovndb-tls-certs\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.781893 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-public-tls-certs\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.781942 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-combined-ca-bundle\") pod \"neutron-74ccddfbbf-d44d7\" (UID: 
\"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.782188 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-httpd-config\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.783522 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-config\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.794703 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gfds\" (UniqueName: \"kubernetes.io/projected/67c6737f-51b5-4262-9bd4-fe8afcb262ad-kube-api-access-2gfds\") pod \"neutron-74ccddfbbf-d44d7\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:34 crc kubenswrapper[4843]: I0318 12:32:34.857787 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:35 crc kubenswrapper[4843]: I0318 12:32:35.002150 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9f84b91-a59a-4850-8928-658cee6c95d9" path="/var/lib/kubelet/pods/b9f84b91-a59a-4850-8928-658cee6c95d9/volumes" Mar 18 12:32:35 crc kubenswrapper[4843]: I0318 12:32:35.002602 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4b581ed-0818-4c7e-be7e-4c2121d784e5" path="/var/lib/kubelet/pods/e4b581ed-0818-4c7e-be7e-4c2121d784e5/volumes" Mar 18 12:32:35 crc kubenswrapper[4843]: I0318 12:32:35.451308 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" event={"ID":"e264861b-2898-4550-845c-7842781c5650","Type":"ContainerStarted","Data":"6b17d6c109ec668eb8e2db8e77c95eebb1e03766f258852c4c38846c44d17102"} Mar 18 12:32:35 crc kubenswrapper[4843]: I0318 12:32:35.453618 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:35 crc kubenswrapper[4843]: I0318 12:32:35.459945 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ed72136-75da-4a94-a22a-c2511b6bb45a","Type":"ContainerStarted","Data":"629d7f5287d74b525ef7341e141dde0610d2ccd86069f9f863b31b01d4ac220d"} Mar 18 12:32:35 crc kubenswrapper[4843]: I0318 12:32:35.487177 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dc96fcc9b-lt524" event={"ID":"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434","Type":"ContainerStarted","Data":"cc85cd1c0e532d46fd65865f18713d5bac3f4038811619aedb9a024c01f92940"} Mar 18 12:32:35 crc kubenswrapper[4843]: I0318 12:32:35.505043 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bf5d4bdcb-8xfkn" event={"ID":"0199f761-6d2f-4921-8060-6960a0141f0a","Type":"ContainerStarted","Data":"69d9a6d30605d38f99566ab5660d4a169aeb207cba8c578e7eca31b82fb27d98"} Mar 
18 12:32:35 crc kubenswrapper[4843]: I0318 12:32:35.518329 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55f88f498d-gppxz" event={"ID":"653a3476-1fa6-4afc-a036-df13d5a0c6e6","Type":"ContainerStarted","Data":"a671e7800a3ff26c9e5e699687d50c557282c4272376ce2c7e1b0f481a7ebb3b"} Mar 18 12:32:35 crc kubenswrapper[4843]: I0318 12:32:35.520974 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55f88f498d-gppxz" event={"ID":"653a3476-1fa6-4afc-a036-df13d5a0c6e6","Type":"ContainerStarted","Data":"dce2758a1c7d1c49b477ae393897a09e8d1a92c6cef613b6518c2f44ceef130b"} Mar 18 12:32:35 crc kubenswrapper[4843]: I0318 12:32:35.522100 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-55f88f498d-gppxz" Mar 18 12:32:35 crc kubenswrapper[4843]: I0318 12:32:35.528971 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7dc96fcc9b-lt524" podStartSLOduration=24.192179427 podStartE2EDuration="25.528952618s" podCreationTimestamp="2026-03-18 12:32:10 +0000 UTC" firstStartedPulling="2026-03-18 12:32:32.223126126 +0000 UTC m=+1385.938951640" lastFinishedPulling="2026-03-18 12:32:33.559899297 +0000 UTC m=+1387.275724831" observedRunningTime="2026-03-18 12:32:35.5124804 +0000 UTC m=+1389.228305924" watchObservedRunningTime="2026-03-18 12:32:35.528952618 +0000 UTC m=+1389.244778142" Mar 18 12:32:35 crc kubenswrapper[4843]: I0318 12:32:35.529159 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" podStartSLOduration=4.529151774 podStartE2EDuration="4.529151774s" podCreationTimestamp="2026-03-18 12:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:35.486299006 +0000 UTC m=+1389.202124520" watchObservedRunningTime="2026-03-18 12:32:35.529151774 +0000 UTC m=+1389.244977318" Mar 18 12:32:35 
crc kubenswrapper[4843]: I0318 12:32:35.590339 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5bf5d4bdcb-8xfkn" podStartSLOduration=24.767618702 podStartE2EDuration="25.590315852s" podCreationTimestamp="2026-03-18 12:32:10 +0000 UTC" firstStartedPulling="2026-03-18 12:32:32.823379946 +0000 UTC m=+1386.539205460" lastFinishedPulling="2026-03-18 12:32:33.646077086 +0000 UTC m=+1387.361902610" observedRunningTime="2026-03-18 12:32:35.53993661 +0000 UTC m=+1389.255762134" watchObservedRunningTime="2026-03-18 12:32:35.590315852 +0000 UTC m=+1389.306141366" Mar 18 12:32:35 crc kubenswrapper[4843]: I0318 12:32:35.620178 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-74ccddfbbf-d44d7"] Mar 18 12:32:35 crc kubenswrapper[4843]: I0318 12:32:35.639856 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-55f88f498d-gppxz" podStartSLOduration=4.63983676 podStartE2EDuration="4.63983676s" podCreationTimestamp="2026-03-18 12:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:35.583727155 +0000 UTC m=+1389.299552679" watchObservedRunningTime="2026-03-18 12:32:35.63983676 +0000 UTC m=+1389.355662274" Mar 18 12:32:36 crc kubenswrapper[4843]: I0318 12:32:36.584783 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"faa05482-2d1f-43b3-93a2-94178afbc8a7","Type":"ContainerStarted","Data":"e952b22fa5cad4c50fae86de9f9e0bb81b48b450d35116902db80c66fe04e6c6"} Mar 18 12:32:36 crc kubenswrapper[4843]: I0318 12:32:36.611864 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74ccddfbbf-d44d7" event={"ID":"67c6737f-51b5-4262-9bd4-fe8afcb262ad","Type":"ContainerStarted","Data":"54367971f1ac999dc3e4856a8fb90d7293cc68bcd2ef1102827ff4a6c7481412"} Mar 18 12:32:36 crc kubenswrapper[4843]: 
I0318 12:32:36.611912 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74ccddfbbf-d44d7" event={"ID":"67c6737f-51b5-4262-9bd4-fe8afcb262ad","Type":"ContainerStarted","Data":"ebfc5b6a6da817c7161344f5173ae8dffd127485c51c681c5f7f5beac79e8c00"} Mar 18 12:32:36 crc kubenswrapper[4843]: I0318 12:32:36.611922 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74ccddfbbf-d44d7" event={"ID":"67c6737f-51b5-4262-9bd4-fe8afcb262ad","Type":"ContainerStarted","Data":"0f31fd1601db2327ed33dc45f5b1c44536a42a0fc034769057cad8cef207664d"} Mar 18 12:32:36 crc kubenswrapper[4843]: I0318 12:32:36.612800 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:32:36 crc kubenswrapper[4843]: I0318 12:32:36.624607 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=20.624586427 podStartE2EDuration="20.624586427s" podCreationTimestamp="2026-03-18 12:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:36.613383828 +0000 UTC m=+1390.329209352" watchObservedRunningTime="2026-03-18 12:32:36.624586427 +0000 UTC m=+1390.340411951" Mar 18 12:32:36 crc kubenswrapper[4843]: I0318 12:32:36.656072 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8648504-98a7-406e-9838-c8dc2d64ffe9","Type":"ContainerStarted","Data":"b6c155ad15706ebdbe27889daf0cd88a255bb6094c925dde218ff066e1894018"} Mar 18 12:32:36 crc kubenswrapper[4843]: I0318 12:32:36.663094 4843 generic.go:334] "Generic (PLEG): container finished" podID="2a1c39df-268b-4d85-a616-32c282f9a19b" containerID="fa1fc78dced51574579618c617598c724dfdbf2b8c155f325367a1b88c042ca9" exitCode=0 Mar 18 12:32:36 crc kubenswrapper[4843]: I0318 12:32:36.663185 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-sync-pz9j4" event={"ID":"2a1c39df-268b-4d85-a616-32c282f9a19b","Type":"ContainerDied","Data":"fa1fc78dced51574579618c617598c724dfdbf2b8c155f325367a1b88c042ca9"} Mar 18 12:32:36 crc kubenswrapper[4843]: I0318 12:32:36.667681 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-74ccddfbbf-d44d7" podStartSLOduration=2.667664051 podStartE2EDuration="2.667664051s" podCreationTimestamp="2026-03-18 12:32:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:36.665071607 +0000 UTC m=+1390.380897131" watchObservedRunningTime="2026-03-18 12:32:36.667664051 +0000 UTC m=+1390.383489575" Mar 18 12:32:36 crc kubenswrapper[4843]: I0318 12:32:36.670834 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ed72136-75da-4a94-a22a-c2511b6bb45a","Type":"ContainerStarted","Data":"80974ee1c1587c80aaa6ca7d23a6ee4bb548ba7b77992ab0e0be1c8c208d2f69"} Mar 18 12:32:36 crc kubenswrapper[4843]: I0318 12:32:36.756763 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=20.756746823 podStartE2EDuration="20.756746823s" podCreationTimestamp="2026-03-18 12:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:36.75208358 +0000 UTC m=+1390.467909094" watchObservedRunningTime="2026-03-18 12:32:36.756746823 +0000 UTC m=+1390.472572347" Mar 18 12:32:37 crc kubenswrapper[4843]: I0318 12:32:37.697152 4843 generic.go:334] "Generic (PLEG): container finished" podID="9b82175f-cf5a-4d25-81c2-2c70df039edd" containerID="6f282cc8f8ddfdc08e0adeca5d7a07024ca87f9c22fcd319b45a763036b3bf59" exitCode=0 Mar 18 12:32:37 crc kubenswrapper[4843]: I0318 12:32:37.697433 4843 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/barbican-db-sync-sq4hg" event={"ID":"9b82175f-cf5a-4d25-81c2-2c70df039edd","Type":"ContainerDied","Data":"6f282cc8f8ddfdc08e0adeca5d7a07024ca87f9c22fcd319b45a763036b3bf59"} Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.156320 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-pz9j4" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.250642 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a1c39df-268b-4d85-a616-32c282f9a19b-config-data\") pod \"2a1c39df-268b-4d85-a616-32c282f9a19b\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.250701 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1c39df-268b-4d85-a616-32c282f9a19b-scripts\") pod \"2a1c39df-268b-4d85-a616-32c282f9a19b\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.250745 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a1c39df-268b-4d85-a616-32c282f9a19b-logs\") pod \"2a1c39df-268b-4d85-a616-32c282f9a19b\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.250789 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pp2l\" (UniqueName: \"kubernetes.io/projected/2a1c39df-268b-4d85-a616-32c282f9a19b-kube-api-access-6pp2l\") pod \"2a1c39df-268b-4d85-a616-32c282f9a19b\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.250852 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2a1c39df-268b-4d85-a616-32c282f9a19b-combined-ca-bundle\") pod \"2a1c39df-268b-4d85-a616-32c282f9a19b\" (UID: \"2a1c39df-268b-4d85-a616-32c282f9a19b\") " Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.252034 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a1c39df-268b-4d85-a616-32c282f9a19b-logs" (OuterVolumeSpecName: "logs") pod "2a1c39df-268b-4d85-a616-32c282f9a19b" (UID: "2a1c39df-268b-4d85-a616-32c282f9a19b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.268590 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1c39df-268b-4d85-a616-32c282f9a19b-scripts" (OuterVolumeSpecName: "scripts") pod "2a1c39df-268b-4d85-a616-32c282f9a19b" (UID: "2a1c39df-268b-4d85-a616-32c282f9a19b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.281851 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1c39df-268b-4d85-a616-32c282f9a19b-kube-api-access-6pp2l" (OuterVolumeSpecName: "kube-api-access-6pp2l") pod "2a1c39df-268b-4d85-a616-32c282f9a19b" (UID: "2a1c39df-268b-4d85-a616-32c282f9a19b"). InnerVolumeSpecName "kube-api-access-6pp2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.286417 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1c39df-268b-4d85-a616-32c282f9a19b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a1c39df-268b-4d85-a616-32c282f9a19b" (UID: "2a1c39df-268b-4d85-a616-32c282f9a19b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.293954 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1c39df-268b-4d85-a616-32c282f9a19b-config-data" (OuterVolumeSpecName: "config-data") pod "2a1c39df-268b-4d85-a616-32c282f9a19b" (UID: "2a1c39df-268b-4d85-a616-32c282f9a19b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.352518 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1c39df-268b-4d85-a616-32c282f9a19b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.352552 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a1c39df-268b-4d85-a616-32c282f9a19b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.352561 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1c39df-268b-4d85-a616-32c282f9a19b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.352572 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a1c39df-268b-4d85-a616-32c282f9a19b-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.352581 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pp2l\" (UniqueName: \"kubernetes.io/projected/2a1c39df-268b-4d85-a616-32c282f9a19b-kube-api-access-6pp2l\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.712485 4843 generic.go:334] "Generic (PLEG): container finished" podID="d7dbbfca-f6b9-4421-8662-64ce08dade2d" 
containerID="82935de4b27ce7ee9cad5df7a2045d570aeb4185bfe355cadfae7b2fc128ace5" exitCode=0 Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.712791 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9g2xs" event={"ID":"d7dbbfca-f6b9-4421-8662-64ce08dade2d","Type":"ContainerDied","Data":"82935de4b27ce7ee9cad5df7a2045d570aeb4185bfe355cadfae7b2fc128ace5"} Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.715701 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-pz9j4" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.721698 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pz9j4" event={"ID":"2a1c39df-268b-4d85-a616-32c282f9a19b","Type":"ContainerDied","Data":"1d6cc21293afda571fac6e7d1069d90b08969cdb4bbbfab720d26c4c01d8eb61"} Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.721776 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d6cc21293afda571fac6e7d1069d90b08969cdb4bbbfab720d26c4c01d8eb61" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.891698 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-859b44d9b8-z27mw"] Mar 18 12:32:38 crc kubenswrapper[4843]: E0318 12:32:38.892137 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1c39df-268b-4d85-a616-32c282f9a19b" containerName="placement-db-sync" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.892151 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1c39df-268b-4d85-a616-32c282f9a19b" containerName="placement-db-sync" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.892391 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1c39df-268b-4d85-a616-32c282f9a19b" containerName="placement-db-sync" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.895075 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.897674 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.898110 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.898302 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.900497 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-65hnn" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.900530 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.906789 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-859b44d9b8-z27mw"] Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.968320 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-config-data\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.968384 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-internal-tls-certs\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.968406 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfn5c\" (UniqueName: \"kubernetes.io/projected/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-kube-api-access-gfn5c\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.968437 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-logs\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.968456 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-scripts\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.968471 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-public-tls-certs\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:38 crc kubenswrapper[4843]: I0318 12:32:38.968491 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-combined-ca-bundle\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.070136 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-config-data\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.070263 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-internal-tls-certs\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.070296 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfn5c\" (UniqueName: \"kubernetes.io/projected/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-kube-api-access-gfn5c\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.070355 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-logs\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.070391 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-scripts\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.070420 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-public-tls-certs\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.070454 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-combined-ca-bundle\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.072418 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-logs\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.234433 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfn5c\" (UniqueName: \"kubernetes.io/projected/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-kube-api-access-gfn5c\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.267463 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-public-tls-certs\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.270529 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-config-data\") pod 
\"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.276076 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-scripts\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.279162 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-internal-tls-certs\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.279558 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-combined-ca-bundle\") pod \"placement-859b44d9b8-z27mw\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.351703 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-sq4hg" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.596759 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.701047 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b82175f-cf5a-4d25-81c2-2c70df039edd-combined-ca-bundle\") pod \"9b82175f-cf5a-4d25-81c2-2c70df039edd\" (UID: \"9b82175f-cf5a-4d25-81c2-2c70df039edd\") " Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.701112 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9b82175f-cf5a-4d25-81c2-2c70df039edd-db-sync-config-data\") pod \"9b82175f-cf5a-4d25-81c2-2c70df039edd\" (UID: \"9b82175f-cf5a-4d25-81c2-2c70df039edd\") " Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.701156 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6c6l\" (UniqueName: \"kubernetes.io/projected/9b82175f-cf5a-4d25-81c2-2c70df039edd-kube-api-access-b6c6l\") pod \"9b82175f-cf5a-4d25-81c2-2c70df039edd\" (UID: \"9b82175f-cf5a-4d25-81c2-2c70df039edd\") " Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.708935 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b82175f-cf5a-4d25-81c2-2c70df039edd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9b82175f-cf5a-4d25-81c2-2c70df039edd" (UID: "9b82175f-cf5a-4d25-81c2-2c70df039edd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.709558 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b82175f-cf5a-4d25-81c2-2c70df039edd-kube-api-access-b6c6l" (OuterVolumeSpecName: "kube-api-access-b6c6l") pod "9b82175f-cf5a-4d25-81c2-2c70df039edd" (UID: "9b82175f-cf5a-4d25-81c2-2c70df039edd"). 
InnerVolumeSpecName "kube-api-access-b6c6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.737898 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-sq4hg" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.738383 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-sq4hg" event={"ID":"9b82175f-cf5a-4d25-81c2-2c70df039edd","Type":"ContainerDied","Data":"8dc50bff7dd634e7bb09e35b2e94496f37f7251df705183e08e3612c5f909bad"} Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.738412 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dc50bff7dd634e7bb09e35b2e94496f37f7251df705183e08e3612c5f909bad" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.747894 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b82175f-cf5a-4d25-81c2-2c70df039edd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b82175f-cf5a-4d25-81c2-2c70df039edd" (UID: "9b82175f-cf5a-4d25-81c2-2c70df039edd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.803420 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b82175f-cf5a-4d25-81c2-2c70df039edd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.803685 4843 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9b82175f-cf5a-4d25-81c2-2c70df039edd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:39 crc kubenswrapper[4843]: I0318 12:32:39.803704 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6c6l\" (UniqueName: \"kubernetes.io/projected/9b82175f-cf5a-4d25-81c2-2c70df039edd-kube-api-access-b6c6l\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.372788 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6db4d7d84b-xpgq7"] Mar 18 12:32:40 crc kubenswrapper[4843]: E0318 12:32:40.373231 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b82175f-cf5a-4d25-81c2-2c70df039edd" containerName="barbican-db-sync" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.373245 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b82175f-cf5a-4d25-81c2-2c70df039edd" containerName="barbican-db-sync" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.373425 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b82175f-cf5a-4d25-81c2-2c70df039edd" containerName="barbican-db-sync" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.375648 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.387774 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.387867 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.388096 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-46w9n" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.417417 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6db4d7d84b-xpgq7"] Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.449069 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7bffcb8fb5-8ktgs"] Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.450616 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.455629 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.473586 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7bffcb8fb5-8ktgs"] Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.474687 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n52xd\" (UniqueName: \"kubernetes.io/projected/5976626b-cd22-4f02-bd2b-d5d452e898c2-kube-api-access-n52xd\") pod \"barbican-keystone-listener-6db4d7d84b-xpgq7\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.474829 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5976626b-cd22-4f02-bd2b-d5d452e898c2-combined-ca-bundle\") pod \"barbican-keystone-listener-6db4d7d84b-xpgq7\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.474848 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5976626b-cd22-4f02-bd2b-d5d452e898c2-config-data-custom\") pod \"barbican-keystone-listener-6db4d7d84b-xpgq7\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.474879 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5976626b-cd22-4f02-bd2b-d5d452e898c2-config-data\") pod \"barbican-keystone-listener-6db4d7d84b-xpgq7\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.474916 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5976626b-cd22-4f02-bd2b-d5d452e898c2-logs\") pod \"barbican-keystone-listener-6db4d7d84b-xpgq7\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.507964 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-859b44d9b8-z27mw"] Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.577663 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5976626b-cd22-4f02-bd2b-d5d452e898c2-config-data\") pod \"barbican-keystone-listener-6db4d7d84b-xpgq7\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.577770 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/daaeab38-2bfe-40e6-8137-4aea071dfc05-config-data-custom\") pod \"barbican-worker-7bffcb8fb5-8ktgs\" (UID: \"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.577814 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daaeab38-2bfe-40e6-8137-4aea071dfc05-combined-ca-bundle\") pod \"barbican-worker-7bffcb8fb5-8ktgs\" (UID: 
\"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.577840 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/daaeab38-2bfe-40e6-8137-4aea071dfc05-logs\") pod \"barbican-worker-7bffcb8fb5-8ktgs\" (UID: \"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.577871 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5976626b-cd22-4f02-bd2b-d5d452e898c2-logs\") pod \"barbican-keystone-listener-6db4d7d84b-xpgq7\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.577906 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n52xd\" (UniqueName: \"kubernetes.io/projected/5976626b-cd22-4f02-bd2b-d5d452e898c2-kube-api-access-n52xd\") pod \"barbican-keystone-listener-6db4d7d84b-xpgq7\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.577935 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjjfg\" (UniqueName: \"kubernetes.io/projected/daaeab38-2bfe-40e6-8137-4aea071dfc05-kube-api-access-wjjfg\") pod \"barbican-worker-7bffcb8fb5-8ktgs\" (UID: \"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.578215 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/daaeab38-2bfe-40e6-8137-4aea071dfc05-config-data\") pod \"barbican-worker-7bffcb8fb5-8ktgs\" (UID: \"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.578267 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5976626b-cd22-4f02-bd2b-d5d452e898c2-combined-ca-bundle\") pod \"barbican-keystone-listener-6db4d7d84b-xpgq7\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.578302 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5976626b-cd22-4f02-bd2b-d5d452e898c2-config-data-custom\") pod \"barbican-keystone-listener-6db4d7d84b-xpgq7\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.583154 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5976626b-cd22-4f02-bd2b-d5d452e898c2-logs\") pod \"barbican-keystone-listener-6db4d7d84b-xpgq7\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.606502 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5976626b-cd22-4f02-bd2b-d5d452e898c2-combined-ca-bundle\") pod \"barbican-keystone-listener-6db4d7d84b-xpgq7\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.618625 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5ccc5c4795-vcl96"] Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.619000 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" podUID="e264861b-2898-4550-845c-7842781c5650" containerName="dnsmasq-dns" containerID="cri-o://6b17d6c109ec668eb8e2db8e77c95eebb1e03766f258852c4c38846c44d17102" gracePeriod=10 Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.630563 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5976626b-cd22-4f02-bd2b-d5d452e898c2-config-data-custom\") pod \"barbican-keystone-listener-6db4d7d84b-xpgq7\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.640285 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5976626b-cd22-4f02-bd2b-d5d452e898c2-config-data\") pod \"barbican-keystone-listener-6db4d7d84b-xpgq7\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.643956 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n52xd\" (UniqueName: \"kubernetes.io/projected/5976626b-cd22-4f02-bd2b-d5d452e898c2-kube-api-access-n52xd\") pod \"barbican-keystone-listener-6db4d7d84b-xpgq7\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.647619 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.719935 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/daaeab38-2bfe-40e6-8137-4aea071dfc05-config-data-custom\") pod \"barbican-worker-7bffcb8fb5-8ktgs\" (UID: \"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.720013 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daaeab38-2bfe-40e6-8137-4aea071dfc05-combined-ca-bundle\") pod \"barbican-worker-7bffcb8fb5-8ktgs\" (UID: \"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.720044 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/daaeab38-2bfe-40e6-8137-4aea071dfc05-logs\") pod \"barbican-worker-7bffcb8fb5-8ktgs\" (UID: \"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.720107 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjjfg\" (UniqueName: \"kubernetes.io/projected/daaeab38-2bfe-40e6-8137-4aea071dfc05-kube-api-access-wjjfg\") pod \"barbican-worker-7bffcb8fb5-8ktgs\" (UID: \"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.720340 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daaeab38-2bfe-40e6-8137-4aea071dfc05-config-data\") pod \"barbican-worker-7bffcb8fb5-8ktgs\" (UID: \"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.722834 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/daaeab38-2bfe-40e6-8137-4aea071dfc05-logs\") pod \"barbican-worker-7bffcb8fb5-8ktgs\" (UID: \"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.727264 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/daaeab38-2bfe-40e6-8137-4aea071dfc05-config-data-custom\") pod \"barbican-worker-7bffcb8fb5-8ktgs\" (UID: \"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.727604 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daaeab38-2bfe-40e6-8137-4aea071dfc05-combined-ca-bundle\") pod \"barbican-worker-7bffcb8fb5-8ktgs\" (UID: \"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.734923 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-zztvz"] Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.738245 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daaeab38-2bfe-40e6-8137-4aea071dfc05-config-data\") pod \"barbican-worker-7bffcb8fb5-8ktgs\" (UID: \"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.737831 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-zztvz" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.770667 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjjfg\" (UniqueName: \"kubernetes.io/projected/daaeab38-2bfe-40e6-8137-4aea071dfc05-kube-api-access-wjjfg\") pod \"barbican-worker-7bffcb8fb5-8ktgs\" (UID: \"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.786926 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-859b44d9b8-z27mw" event={"ID":"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba","Type":"ContainerStarted","Data":"42c8ebedd47a5d8e0e9546811537815ee466f776e630f1a5ee0b0ae21a74becc"} Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.792310 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-zztvz"] Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.807275 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9g2xs" event={"ID":"d7dbbfca-f6b9-4421-8662-64ce08dade2d","Type":"ContainerDied","Data":"fe8b69ad8851c28553b2c145704514c3a2abefd5bcde637176617463e3bde5ae"} Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.807326 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe8b69ad8851c28553b2c145704514c3a2abefd5bcde637176617463e3bde5ae" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.808750 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.808968 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.841541 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9g2xs" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.875823 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6fb7d6bc54-x5f24"] Mar 18 12:32:40 crc kubenswrapper[4843]: E0318 12:32:40.876474 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7dbbfca-f6b9-4421-8662-64ce08dade2d" containerName="keystone-bootstrap" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.876516 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7dbbfca-f6b9-4421-8662-64ce08dade2d" containerName="keystone-bootstrap" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.876759 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7dbbfca-f6b9-4421-8662-64ce08dade2d" containerName="keystone-bootstrap" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.878721 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6fb7d6bc54-x5f24" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.885393 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6fb7d6bc54-x5f24"] Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.886163 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.940773 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-dns-svc\") pod \"dnsmasq-dns-688c87cc99-zztvz\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " pod="openstack/dnsmasq-dns-688c87cc99-zztvz" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.940821 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-zztvz\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " pod="openstack/dnsmasq-dns-688c87cc99-zztvz" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.940848 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75rj2\" (UniqueName: \"kubernetes.io/projected/a9f366e3-403e-4dad-9f88-b734ab67badd-kube-api-access-75rj2\") pod \"dnsmasq-dns-688c87cc99-zztvz\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " pod="openstack/dnsmasq-dns-688c87cc99-zztvz" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.940873 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-config\") pod \"dnsmasq-dns-688c87cc99-zztvz\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " pod="openstack/dnsmasq-dns-688c87cc99-zztvz" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.940908 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-zztvz\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " pod="openstack/dnsmasq-dns-688c87cc99-zztvz" Mar 18 12:32:40 crc kubenswrapper[4843]: I0318 12:32:40.941005 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-zztvz\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " pod="openstack/dnsmasq-dns-688c87cc99-zztvz" Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.042181 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-scripts\") pod \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.042269 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-combined-ca-bundle\") pod \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.042303 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhcnz\" (UniqueName: \"kubernetes.io/projected/d7dbbfca-f6b9-4421-8662-64ce08dade2d-kube-api-access-fhcnz\") pod \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.042381 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-fernet-keys\") pod \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.042442 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-config-data\") pod \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.042471 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-credential-keys\") pod \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\" (UID: \"d7dbbfca-f6b9-4421-8662-64ce08dade2d\") " Mar 18 12:32:41 crc 
kubenswrapper[4843]: I0318 12:32:41.042701 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mbsx\" (UniqueName: \"kubernetes.io/projected/7321460c-9acb-47b8-b14c-d3f17cd937ec-kube-api-access-6mbsx\") pod \"barbican-api-6fb7d6bc54-x5f24\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " pod="openstack/barbican-api-6fb7d6bc54-x5f24"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.042734 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7321460c-9acb-47b8-b14c-d3f17cd937ec-config-data\") pod \"barbican-api-6fb7d6bc54-x5f24\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " pod="openstack/barbican-api-6fb7d6bc54-x5f24"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.042774 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-zztvz\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " pod="openstack/dnsmasq-dns-688c87cc99-zztvz"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.042806 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7321460c-9acb-47b8-b14c-d3f17cd937ec-config-data-custom\") pod \"barbican-api-6fb7d6bc54-x5f24\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " pod="openstack/barbican-api-6fb7d6bc54-x5f24"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.042892 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-dns-svc\") pod \"dnsmasq-dns-688c87cc99-zztvz\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " pod="openstack/dnsmasq-dns-688c87cc99-zztvz"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.042925 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7321460c-9acb-47b8-b14c-d3f17cd937ec-combined-ca-bundle\") pod \"barbican-api-6fb7d6bc54-x5f24\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " pod="openstack/barbican-api-6fb7d6bc54-x5f24"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.042964 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-zztvz\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " pod="openstack/dnsmasq-dns-688c87cc99-zztvz"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.043007 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7321460c-9acb-47b8-b14c-d3f17cd937ec-logs\") pod \"barbican-api-6fb7d6bc54-x5f24\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " pod="openstack/barbican-api-6fb7d6bc54-x5f24"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.043038 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75rj2\" (UniqueName: \"kubernetes.io/projected/a9f366e3-403e-4dad-9f88-b734ab67badd-kube-api-access-75rj2\") pod \"dnsmasq-dns-688c87cc99-zztvz\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " pod="openstack/dnsmasq-dns-688c87cc99-zztvz"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.043082 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-config\") pod \"dnsmasq-dns-688c87cc99-zztvz\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " pod="openstack/dnsmasq-dns-688c87cc99-zztvz"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.043136 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-zztvz\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " pod="openstack/dnsmasq-dns-688c87cc99-zztvz"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.044621 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-zztvz\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " pod="openstack/dnsmasq-dns-688c87cc99-zztvz"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.179785 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5bf5d4bdcb-8xfkn"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.181245 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-zztvz\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " pod="openstack/dnsmasq-dns-688c87cc99-zztvz"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.181993 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5bf5d4bdcb-8xfkn"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.187557 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-dns-svc\") pod \"dnsmasq-dns-688c87cc99-zztvz\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " pod="openstack/dnsmasq-dns-688c87cc99-zztvz"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.194443 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-scripts" (OuterVolumeSpecName: "scripts") pod "d7dbbfca-f6b9-4421-8662-64ce08dade2d" (UID: "d7dbbfca-f6b9-4421-8662-64ce08dade2d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.194546 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7dc96fcc9b-lt524"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.201426 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7dc96fcc9b-lt524"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.203127 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7dbbfca-f6b9-4421-8662-64ce08dade2d-kube-api-access-fhcnz" (OuterVolumeSpecName: "kube-api-access-fhcnz") pod "d7dbbfca-f6b9-4421-8662-64ce08dade2d" (UID: "d7dbbfca-f6b9-4421-8662-64ce08dade2d"). InnerVolumeSpecName "kube-api-access-fhcnz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.203352 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d7dbbfca-f6b9-4421-8662-64ce08dade2d" (UID: "d7dbbfca-f6b9-4421-8662-64ce08dade2d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.213053 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-config\") pod \"dnsmasq-dns-688c87cc99-zztvz\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " pod="openstack/dnsmasq-dns-688c87cc99-zztvz"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.214737 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mbsx\" (UniqueName: \"kubernetes.io/projected/7321460c-9acb-47b8-b14c-d3f17cd937ec-kube-api-access-6mbsx\") pod \"barbican-api-6fb7d6bc54-x5f24\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " pod="openstack/barbican-api-6fb7d6bc54-x5f24"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.214783 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7321460c-9acb-47b8-b14c-d3f17cd937ec-config-data\") pod \"barbican-api-6fb7d6bc54-x5f24\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " pod="openstack/barbican-api-6fb7d6bc54-x5f24"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.214837 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7321460c-9acb-47b8-b14c-d3f17cd937ec-config-data-custom\") pod \"barbican-api-6fb7d6bc54-x5f24\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " pod="openstack/barbican-api-6fb7d6bc54-x5f24"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.214939 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7321460c-9acb-47b8-b14c-d3f17cd937ec-combined-ca-bundle\") pod \"barbican-api-6fb7d6bc54-x5f24\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " pod="openstack/barbican-api-6fb7d6bc54-x5f24"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.214989 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7321460c-9acb-47b8-b14c-d3f17cd937ec-logs\") pod \"barbican-api-6fb7d6bc54-x5f24\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " pod="openstack/barbican-api-6fb7d6bc54-x5f24"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.215694 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d7dbbfca-f6b9-4421-8662-64ce08dade2d" (UID: "d7dbbfca-f6b9-4421-8662-64ce08dade2d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.216016 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-zztvz\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " pod="openstack/dnsmasq-dns-688c87cc99-zztvz"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.218718 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhcnz\" (UniqueName: \"kubernetes.io/projected/d7dbbfca-f6b9-4421-8662-64ce08dade2d-kube-api-access-fhcnz\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.218751 4843 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.218765 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.221918 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7321460c-9acb-47b8-b14c-d3f17cd937ec-logs\") pod \"barbican-api-6fb7d6bc54-x5f24\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " pod="openstack/barbican-api-6fb7d6bc54-x5f24"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.228039 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7321460c-9acb-47b8-b14c-d3f17cd937ec-config-data\") pod \"barbican-api-6fb7d6bc54-x5f24\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " pod="openstack/barbican-api-6fb7d6bc54-x5f24"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.245068 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7321460c-9acb-47b8-b14c-d3f17cd937ec-combined-ca-bundle\") pod \"barbican-api-6fb7d6bc54-x5f24\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " pod="openstack/barbican-api-6fb7d6bc54-x5f24"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.247606 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75rj2\" (UniqueName: \"kubernetes.io/projected/a9f366e3-403e-4dad-9f88-b734ab67badd-kube-api-access-75rj2\") pod \"dnsmasq-dns-688c87cc99-zztvz\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " pod="openstack/dnsmasq-dns-688c87cc99-zztvz"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.249136 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7321460c-9acb-47b8-b14c-d3f17cd937ec-config-data-custom\") pod \"barbican-api-6fb7d6bc54-x5f24\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " pod="openstack/barbican-api-6fb7d6bc54-x5f24"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.254222 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mbsx\" (UniqueName: \"kubernetes.io/projected/7321460c-9acb-47b8-b14c-d3f17cd937ec-kube-api-access-6mbsx\") pod \"barbican-api-6fb7d6bc54-x5f24\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " pod="openstack/barbican-api-6fb7d6bc54-x5f24"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.259476 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7dbbfca-f6b9-4421-8662-64ce08dade2d" (UID: "d7dbbfca-f6b9-4421-8662-64ce08dade2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.316234 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-config-data" (OuterVolumeSpecName: "config-data") pod "d7dbbfca-f6b9-4421-8662-64ce08dade2d" (UID: "d7dbbfca-f6b9-4421-8662-64ce08dade2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.320458 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.320492 4843 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.320503 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7dbbfca-f6b9-4421-8662-64ce08dade2d-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.447790 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-zztvz"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.536208 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6fb7d6bc54-x5f24"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.541374 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.645326 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-ovsdbserver-nb\") pod \"e264861b-2898-4550-845c-7842781c5650\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") "
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.645574 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-dns-swift-storage-0\") pod \"e264861b-2898-4550-845c-7842781c5650\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") "
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.645678 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-ovsdbserver-sb\") pod \"e264861b-2898-4550-845c-7842781c5650\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") "
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.645722 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4mzv\" (UniqueName: \"kubernetes.io/projected/e264861b-2898-4550-845c-7842781c5650-kube-api-access-h4mzv\") pod \"e264861b-2898-4550-845c-7842781c5650\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") "
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.645779 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-config\") pod \"e264861b-2898-4550-845c-7842781c5650\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") "
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.645798 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-dns-svc\") pod \"e264861b-2898-4550-845c-7842781c5650\" (UID: \"e264861b-2898-4550-845c-7842781c5650\") "
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.652371 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6db4d7d84b-xpgq7"]
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.658835 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e264861b-2898-4550-845c-7842781c5650-kube-api-access-h4mzv" (OuterVolumeSpecName: "kube-api-access-h4mzv") pod "e264861b-2898-4550-845c-7842781c5650" (UID: "e264861b-2898-4550-845c-7842781c5650"). InnerVolumeSpecName "kube-api-access-h4mzv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.747571 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4mzv\" (UniqueName: \"kubernetes.io/projected/e264861b-2898-4550-845c-7842781c5650-kube-api-access-h4mzv\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.828464 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-859b44d9b8-z27mw" event={"ID":"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba","Type":"ContainerStarted","Data":"c540da30f2fc07d5ac506a921f7f994571430bd06a0850a32c546d419ecc2c3b"}
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.835944 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-config" (OuterVolumeSpecName: "config") pod "e264861b-2898-4550-845c-7842781c5650" (UID: "e264861b-2898-4550-845c-7842781c5650"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.842921 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7bffcb8fb5-8ktgs"]
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.990133 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-config\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.990137 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" event={"ID":"5976626b-cd22-4f02-bd2b-d5d452e898c2","Type":"ContainerStarted","Data":"242e1e24848de1bc5684c3aa0a072226fccf06f24820ee287c2fa1e0735721c8"}
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.994831 4843 generic.go:334] "Generic (PLEG): container finished" podID="e264861b-2898-4550-845c-7842781c5650" containerID="6b17d6c109ec668eb8e2db8e77c95eebb1e03766f258852c4c38846c44d17102" exitCode=0
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.996022 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.996582 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" event={"ID":"e264861b-2898-4550-845c-7842781c5650","Type":"ContainerDied","Data":"6b17d6c109ec668eb8e2db8e77c95eebb1e03766f258852c4c38846c44d17102"}
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.996607 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-vcl96" event={"ID":"e264861b-2898-4550-845c-7842781c5650","Type":"ContainerDied","Data":"0efaa6a5e7b26cd8fc903b28750bfc562cb5b903e2da536d7c500a242dd16c5d"}
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.996624 4843 scope.go:117] "RemoveContainer" containerID="6b17d6c109ec668eb8e2db8e77c95eebb1e03766f258852c4c38846c44d17102"
Mar 18 12:32:41 crc kubenswrapper[4843]: I0318 12:32:41.997226 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9g2xs"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.007942 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e264861b-2898-4550-845c-7842781c5650" (UID: "e264861b-2898-4550-845c-7842781c5650"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.043183 4843 scope.go:117] "RemoveContainer" containerID="039d224f98cc43e31f2cc72d1f8c1a7a07a7c2dbe2db58fb0f694c0ead940b81"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.048960 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e264861b-2898-4550-845c-7842781c5650" (UID: "e264861b-2898-4550-845c-7842781c5650"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.064872 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e264861b-2898-4550-845c-7842781c5650" (UID: "e264861b-2898-4550-845c-7842781c5650"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.083119 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e264861b-2898-4550-845c-7842781c5650" (UID: "e264861b-2898-4550-845c-7842781c5650"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.084814 4843 scope.go:117] "RemoveContainer" containerID="6b17d6c109ec668eb8e2db8e77c95eebb1e03766f258852c4c38846c44d17102"
Mar 18 12:32:42 crc kubenswrapper[4843]: E0318 12:32:42.086033 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b17d6c109ec668eb8e2db8e77c95eebb1e03766f258852c4c38846c44d17102\": container with ID starting with 6b17d6c109ec668eb8e2db8e77c95eebb1e03766f258852c4c38846c44d17102 not found: ID does not exist" containerID="6b17d6c109ec668eb8e2db8e77c95eebb1e03766f258852c4c38846c44d17102"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.086062 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b17d6c109ec668eb8e2db8e77c95eebb1e03766f258852c4c38846c44d17102"} err="failed to get container status \"6b17d6c109ec668eb8e2db8e77c95eebb1e03766f258852c4c38846c44d17102\": rpc error: code = NotFound desc = could not find container \"6b17d6c109ec668eb8e2db8e77c95eebb1e03766f258852c4c38846c44d17102\": container with ID starting with 6b17d6c109ec668eb8e2db8e77c95eebb1e03766f258852c4c38846c44d17102 not found: ID does not exist"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.086087 4843 scope.go:117] "RemoveContainer" containerID="039d224f98cc43e31f2cc72d1f8c1a7a07a7c2dbe2db58fb0f694c0ead940b81"
Mar 18 12:32:42 crc kubenswrapper[4843]: E0318 12:32:42.087603 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"039d224f98cc43e31f2cc72d1f8c1a7a07a7c2dbe2db58fb0f694c0ead940b81\": container with ID starting with 039d224f98cc43e31f2cc72d1f8c1a7a07a7c2dbe2db58fb0f694c0ead940b81 not found: ID does not exist" containerID="039d224f98cc43e31f2cc72d1f8c1a7a07a7c2dbe2db58fb0f694c0ead940b81"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.087635 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"039d224f98cc43e31f2cc72d1f8c1a7a07a7c2dbe2db58fb0f694c0ead940b81"} err="failed to get container status \"039d224f98cc43e31f2cc72d1f8c1a7a07a7c2dbe2db58fb0f694c0ead940b81\": rpc error: code = NotFound desc = could not find container \"039d224f98cc43e31f2cc72d1f8c1a7a07a7c2dbe2db58fb0f694c0ead940b81\": container with ID starting with 039d224f98cc43e31f2cc72d1f8c1a7a07a7c2dbe2db58fb0f694c0ead940b81 not found: ID does not exist"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.092382 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.092407 4843 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.092417 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.092432 4843 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e264861b-2898-4550-845c-7842781c5650-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.193839 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-f4bfb58c4-dvx2n"]
Mar 18 12:32:42 crc kubenswrapper[4843]: E0318 12:32:42.194493 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e264861b-2898-4550-845c-7842781c5650" containerName="init"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.194510 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="e264861b-2898-4550-845c-7842781c5650" containerName="init"
Mar 18 12:32:42 crc kubenswrapper[4843]: E0318 12:32:42.194525 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e264861b-2898-4550-845c-7842781c5650" containerName="dnsmasq-dns"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.194531 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="e264861b-2898-4550-845c-7842781c5650" containerName="dnsmasq-dns"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.194779 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="e264861b-2898-4550-845c-7842781c5650" containerName="dnsmasq-dns"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.196331 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.201201 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.201422 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.201596 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lq56r"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.201688 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.201739 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.201868 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.207968 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f4bfb58c4-dvx2n"]
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.295511 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-public-tls-certs\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.295572 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-config-data\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.295621 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-credential-keys\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.295685 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-internal-tls-certs\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.295780 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-scripts\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.295962 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-combined-ca-bundle\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.296144 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kjjn\" (UniqueName: \"kubernetes.io/projected/c8cb3725-0d98-427b-9c3f-4ae277b032c4-kube-api-access-6kjjn\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.296243 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-fernet-keys\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.357231 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-zztvz"]
Mar 18 12:32:42 crc kubenswrapper[4843]: W0318 12:32:42.378224 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9f366e3_403e_4dad_9f88_b734ab67badd.slice/crio-8bae63b9a0354b038434043c5dd5875a19755fe17c76fa11808439bed89dcad1 WatchSource:0}: Error finding container 8bae63b9a0354b038434043c5dd5875a19755fe17c76fa11808439bed89dcad1: Status 404 returned error can't find the container with id 8bae63b9a0354b038434043c5dd5875a19755fe17c76fa11808439bed89dcad1
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.393256 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-vcl96"]
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.396809 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-credential-keys\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.396854 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-internal-tls-certs\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.396938 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-scripts\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.396963 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-combined-ca-bundle\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.397000 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kjjn\" (UniqueName: \"kubernetes.io/projected/c8cb3725-0d98-427b-9c3f-4ae277b032c4-kube-api-access-6kjjn\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.397039 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-fernet-keys\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.397069 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-public-tls-certs\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.397092 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-config-data\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.406433 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-config-data\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.407197 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-credential-keys\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.412235 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-combined-ca-bundle\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.414002 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-scripts\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.414816 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-public-tls-certs\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.414964 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-internal-tls-certs\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.419026 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8cb3725-0d98-427b-9c3f-4ae277b032c4-fernet-keys\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.421925 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-vcl96"]
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.423626 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kjjn\" (UniqueName: \"kubernetes.io/projected/c8cb3725-0d98-427b-9c3f-4ae277b032c4-kube-api-access-6kjjn\") pod \"keystone-f4bfb58c4-dvx2n\" (UID: \"c8cb3725-0d98-427b-9c3f-4ae277b032c4\") " pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.467144 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6fb7d6bc54-x5f24"]
Mar 18 12:32:42 crc kubenswrapper[4843]: I0318 12:32:42.516103 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.123296 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e264861b-2898-4550-845c-7842781c5650" path="/var/lib/kubelet/pods/e264861b-2898-4550-845c-7842781c5650/volumes"
Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.124454 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fb7d6bc54-x5f24" event={"ID":"7321460c-9acb-47b8-b14c-d3f17cd937ec","Type":"ContainerStarted","Data":"cec48c31b364e0ce8d41d58b2bdaf09f90d03dbef86461b69479992bcbecda31"}
Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.129174 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" event={"ID":"daaeab38-2bfe-40e6-8137-4aea071dfc05","Type":"ContainerStarted","Data":"b85d8f133514b59e520e1c736f7563e325a46f369bb068f926dd7c23d487b013"}
Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.139744 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-zztvz" event={"ID":"a9f366e3-403e-4dad-9f88-b734ab67badd","Type":"ContainerStarted","Data":"8bae63b9a0354b038434043c5dd5875a19755fe17c76fa11808439bed89dcad1"}
Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.157510 4843 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openstack/placement-859b44d9b8-z27mw" event={"ID":"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba","Type":"ContainerStarted","Data":"fc88b2e91a473280f47da439283b67e0f0df2a316bb157d1f4957d66aee924e3"} Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.158438 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.158490 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.363571 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-94c7cb559-snwrc"] Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.365465 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-94c7cb559-snwrc" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.369318 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599f937c-69d8-4281-b545-e97d4678bc9b-combined-ca-bundle\") pod \"barbican-worker-94c7cb559-snwrc\" (UID: \"599f937c-69d8-4281-b545-e97d4678bc9b\") " pod="openstack/barbican-worker-94c7cb559-snwrc" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.369439 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/599f937c-69d8-4281-b545-e97d4678bc9b-config-data-custom\") pod \"barbican-worker-94c7cb559-snwrc\" (UID: \"599f937c-69d8-4281-b545-e97d4678bc9b\") " pod="openstack/barbican-worker-94c7cb559-snwrc" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.369472 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/599f937c-69d8-4281-b545-e97d4678bc9b-logs\") pod \"barbican-worker-94c7cb559-snwrc\" (UID: \"599f937c-69d8-4281-b545-e97d4678bc9b\") " pod="openstack/barbican-worker-94c7cb559-snwrc" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.369531 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/599f937c-69d8-4281-b545-e97d4678bc9b-config-data\") pod \"barbican-worker-94c7cb559-snwrc\" (UID: \"599f937c-69d8-4281-b545-e97d4678bc9b\") " pod="openstack/barbican-worker-94c7cb559-snwrc" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.369611 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzfds\" (UniqueName: \"kubernetes.io/projected/599f937c-69d8-4281-b545-e97d4678bc9b-kube-api-access-lzfds\") pod \"barbican-worker-94c7cb559-snwrc\" (UID: \"599f937c-69d8-4281-b545-e97d4678bc9b\") " pod="openstack/barbican-worker-94c7cb559-snwrc" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.393641 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-859b44d9b8-z27mw" podStartSLOduration=5.393614554 podStartE2EDuration="5.393614554s" podCreationTimestamp="2026-03-18 12:32:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:43.280781887 +0000 UTC m=+1396.996607411" watchObservedRunningTime="2026-03-18 12:32:43.393614554 +0000 UTC m=+1397.109440078" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.423191 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-656c8c855b-nmnnt"] Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.424943 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.450834 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-94c7cb559-snwrc"] Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.478327 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-656c8c855b-nmnnt"] Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.479601 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/599f937c-69d8-4281-b545-e97d4678bc9b-config-data-custom\") pod \"barbican-worker-94c7cb559-snwrc\" (UID: \"599f937c-69d8-4281-b545-e97d4678bc9b\") " pod="openstack/barbican-worker-94c7cb559-snwrc" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.479679 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/599f937c-69d8-4281-b545-e97d4678bc9b-logs\") pod \"barbican-worker-94c7cb559-snwrc\" (UID: \"599f937c-69d8-4281-b545-e97d4678bc9b\") " pod="openstack/barbican-worker-94c7cb559-snwrc" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.479709 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/599f937c-69d8-4281-b545-e97d4678bc9b-config-data\") pod \"barbican-worker-94c7cb559-snwrc\" (UID: \"599f937c-69d8-4281-b545-e97d4678bc9b\") " pod="openstack/barbican-worker-94c7cb559-snwrc" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.481056 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzfds\" (UniqueName: \"kubernetes.io/projected/599f937c-69d8-4281-b545-e97d4678bc9b-kube-api-access-lzfds\") pod \"barbican-worker-94c7cb559-snwrc\" (UID: \"599f937c-69d8-4281-b545-e97d4678bc9b\") " 
pod="openstack/barbican-worker-94c7cb559-snwrc" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.481244 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599f937c-69d8-4281-b545-e97d4678bc9b-combined-ca-bundle\") pod \"barbican-worker-94c7cb559-snwrc\" (UID: \"599f937c-69d8-4281-b545-e97d4678bc9b\") " pod="openstack/barbican-worker-94c7cb559-snwrc" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.492168 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/599f937c-69d8-4281-b545-e97d4678bc9b-logs\") pod \"barbican-worker-94c7cb559-snwrc\" (UID: \"599f937c-69d8-4281-b545-e97d4678bc9b\") " pod="openstack/barbican-worker-94c7cb559-snwrc" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.518582 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/599f937c-69d8-4281-b545-e97d4678bc9b-config-data\") pod \"barbican-worker-94c7cb559-snwrc\" (UID: \"599f937c-69d8-4281-b545-e97d4678bc9b\") " pod="openstack/barbican-worker-94c7cb559-snwrc" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.518970 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/599f937c-69d8-4281-b545-e97d4678bc9b-config-data-custom\") pod \"barbican-worker-94c7cb559-snwrc\" (UID: \"599f937c-69d8-4281-b545-e97d4678bc9b\") " pod="openstack/barbican-worker-94c7cb559-snwrc" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.521106 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599f937c-69d8-4281-b545-e97d4678bc9b-combined-ca-bundle\") pod \"barbican-worker-94c7cb559-snwrc\" (UID: \"599f937c-69d8-4281-b545-e97d4678bc9b\") " pod="openstack/barbican-worker-94c7cb559-snwrc" Mar 18 12:32:43 crc 
kubenswrapper[4843]: I0318 12:32:43.534075 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6cb8d6c886-ttksm"] Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.535683 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6cb8d6c886-ttksm" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.586860 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d1350ab-77e3-446d-a13d-152395262970-logs\") pod \"barbican-keystone-listener-656c8c855b-nmnnt\" (UID: \"1d1350ab-77e3-446d-a13d-152395262970\") " pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.587168 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8wrr\" (UniqueName: \"kubernetes.io/projected/1d1350ab-77e3-446d-a13d-152395262970-kube-api-access-c8wrr\") pod \"barbican-keystone-listener-656c8c855b-nmnnt\" (UID: \"1d1350ab-77e3-446d-a13d-152395262970\") " pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.587353 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d1350ab-77e3-446d-a13d-152395262970-config-data\") pod \"barbican-keystone-listener-656c8c855b-nmnnt\" (UID: \"1d1350ab-77e3-446d-a13d-152395262970\") " pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.587433 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d1350ab-77e3-446d-a13d-152395262970-config-data-custom\") pod \"barbican-keystone-listener-656c8c855b-nmnnt\" (UID: 
\"1d1350ab-77e3-446d-a13d-152395262970\") " pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.587513 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1350ab-77e3-446d-a13d-152395262970-combined-ca-bundle\") pod \"barbican-keystone-listener-656c8c855b-nmnnt\" (UID: \"1d1350ab-77e3-446d-a13d-152395262970\") " pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.617267 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzfds\" (UniqueName: \"kubernetes.io/projected/599f937c-69d8-4281-b545-e97d4678bc9b-kube-api-access-lzfds\") pod \"barbican-worker-94c7cb559-snwrc\" (UID: \"599f937c-69d8-4281-b545-e97d4678bc9b\") " pod="openstack/barbican-worker-94c7cb559-snwrc" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.690103 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8wrr\" (UniqueName: \"kubernetes.io/projected/1d1350ab-77e3-446d-a13d-152395262970-kube-api-access-c8wrr\") pod \"barbican-keystone-listener-656c8c855b-nmnnt\" (UID: \"1d1350ab-77e3-446d-a13d-152395262970\") " pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.690191 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-config-data\") pod \"barbican-api-6cb8d6c886-ttksm\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") " pod="openstack/barbican-api-6cb8d6c886-ttksm" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.690235 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cppl\" (UniqueName: 
\"kubernetes.io/projected/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-kube-api-access-4cppl\") pod \"barbican-api-6cb8d6c886-ttksm\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") " pod="openstack/barbican-api-6cb8d6c886-ttksm" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.690286 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d1350ab-77e3-446d-a13d-152395262970-config-data\") pod \"barbican-keystone-listener-656c8c855b-nmnnt\" (UID: \"1d1350ab-77e3-446d-a13d-152395262970\") " pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.690320 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d1350ab-77e3-446d-a13d-152395262970-config-data-custom\") pod \"barbican-keystone-listener-656c8c855b-nmnnt\" (UID: \"1d1350ab-77e3-446d-a13d-152395262970\") " pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.690362 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-combined-ca-bundle\") pod \"barbican-api-6cb8d6c886-ttksm\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") " pod="openstack/barbican-api-6cb8d6c886-ttksm" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.690393 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1350ab-77e3-446d-a13d-152395262970-combined-ca-bundle\") pod \"barbican-keystone-listener-656c8c855b-nmnnt\" (UID: \"1d1350ab-77e3-446d-a13d-152395262970\") " pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.690420 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-config-data-custom\") pod \"barbican-api-6cb8d6c886-ttksm\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") " pod="openstack/barbican-api-6cb8d6c886-ttksm" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.690518 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-logs\") pod \"barbican-api-6cb8d6c886-ttksm\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") " pod="openstack/barbican-api-6cb8d6c886-ttksm" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.690544 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d1350ab-77e3-446d-a13d-152395262970-logs\") pod \"barbican-keystone-listener-656c8c855b-nmnnt\" (UID: \"1d1350ab-77e3-446d-a13d-152395262970\") " pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.706384 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d1350ab-77e3-446d-a13d-152395262970-logs\") pod \"barbican-keystone-listener-656c8c855b-nmnnt\" (UID: \"1d1350ab-77e3-446d-a13d-152395262970\") " pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.713250 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-94c7cb559-snwrc" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.830918 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d1350ab-77e3-446d-a13d-152395262970-config-data-custom\") pod \"barbican-keystone-listener-656c8c855b-nmnnt\" (UID: \"1d1350ab-77e3-446d-a13d-152395262970\") " pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.831642 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-config-data\") pod \"barbican-api-6cb8d6c886-ttksm\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") " pod="openstack/barbican-api-6cb8d6c886-ttksm" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.897631 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cppl\" (UniqueName: \"kubernetes.io/projected/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-kube-api-access-4cppl\") pod \"barbican-api-6cb8d6c886-ttksm\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") " pod="openstack/barbican-api-6cb8d6c886-ttksm" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.852825 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1350ab-77e3-446d-a13d-152395262970-combined-ca-bundle\") pod \"barbican-keystone-listener-656c8c855b-nmnnt\" (UID: \"1d1350ab-77e3-446d-a13d-152395262970\") " pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.857764 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8wrr\" (UniqueName: \"kubernetes.io/projected/1d1350ab-77e3-446d-a13d-152395262970-kube-api-access-c8wrr\") pod 
\"barbican-keystone-listener-656c8c855b-nmnnt\" (UID: \"1d1350ab-77e3-446d-a13d-152395262970\") " pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.865759 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-64d556c464-829sb"] Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.898430 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-combined-ca-bundle\") pod \"barbican-api-6cb8d6c886-ttksm\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") " pod="openstack/barbican-api-6cb8d6c886-ttksm" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.898525 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-config-data-custom\") pod \"barbican-api-6cb8d6c886-ttksm\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") " pod="openstack/barbican-api-6cb8d6c886-ttksm" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.898728 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-logs\") pod \"barbican-api-6cb8d6c886-ttksm\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") " pod="openstack/barbican-api-6cb8d6c886-ttksm" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.899136 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-logs\") pod \"barbican-api-6cb8d6c886-ttksm\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") " pod="openstack/barbican-api-6cb8d6c886-ttksm" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.854807 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1d1350ab-77e3-446d-a13d-152395262970-config-data\") pod \"barbican-keystone-listener-656c8c855b-nmnnt\" (UID: \"1d1350ab-77e3-446d-a13d-152395262970\") " pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.900340 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-64d556c464-829sb" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.887834 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-config-data\") pod \"barbican-api-6cb8d6c886-ttksm\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") " pod="openstack/barbican-api-6cb8d6c886-ttksm" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.906580 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-config-data-custom\") pod \"barbican-api-6cb8d6c886-ttksm\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") " pod="openstack/barbican-api-6cb8d6c886-ttksm" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.909519 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-combined-ca-bundle\") pod \"barbican-api-6cb8d6c886-ttksm\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") " pod="openstack/barbican-api-6cb8d6c886-ttksm" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.914337 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-64d556c464-829sb"] Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.942595 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cppl\" (UniqueName: 
\"kubernetes.io/projected/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-kube-api-access-4cppl\") pod \"barbican-api-6cb8d6c886-ttksm\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") " pod="openstack/barbican-api-6cb8d6c886-ttksm" Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.952055 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6cb8d6c886-ttksm"] Mar 18 12:32:43 crc kubenswrapper[4843]: I0318 12:32:43.972093 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f4bfb58c4-dvx2n"] Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.001036 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb677d0-a346-4663-b989-13b846766c47-combined-ca-bundle\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb" Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.001083 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb677d0-a346-4663-b989-13b846766c47-config-data\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb" Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.001125 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b74vq\" (UniqueName: \"kubernetes.io/projected/0bb677d0-a346-4663-b989-13b846766c47-kube-api-access-b74vq\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb" Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.001177 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0bb677d0-a346-4663-b989-13b846766c47-scripts\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb" Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.001214 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bb677d0-a346-4663-b989-13b846766c47-internal-tls-certs\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb" Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.001298 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bb677d0-a346-4663-b989-13b846766c47-logs\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb" Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.001321 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bb677d0-a346-4663-b989-13b846766c47-public-tls-certs\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb" Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.018829 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6cb8d6c886-ttksm" Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.066121 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt"
Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.102630 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bb677d0-a346-4663-b989-13b846766c47-internal-tls-certs\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb"
Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.102726 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bb677d0-a346-4663-b989-13b846766c47-logs\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb"
Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.102759 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bb677d0-a346-4663-b989-13b846766c47-public-tls-certs\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb"
Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.102836 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb677d0-a346-4663-b989-13b846766c47-combined-ca-bundle\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb"
Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.102856 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb677d0-a346-4663-b989-13b846766c47-config-data\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb"
Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.102887 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b74vq\" (UniqueName: \"kubernetes.io/projected/0bb677d0-a346-4663-b989-13b846766c47-kube-api-access-b74vq\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb"
Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.102975 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb677d0-a346-4663-b989-13b846766c47-scripts\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb"
Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.105693 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bb677d0-a346-4663-b989-13b846766c47-logs\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb"
Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.108237 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bb677d0-a346-4663-b989-13b846766c47-public-tls-certs\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb"
Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.108457 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bb677d0-a346-4663-b989-13b846766c47-internal-tls-certs\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb"
Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.109041 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb677d0-a346-4663-b989-13b846766c47-config-data\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb"
Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.110167 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb677d0-a346-4663-b989-13b846766c47-scripts\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb"
Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.112551 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb677d0-a346-4663-b989-13b846766c47-combined-ca-bundle\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb"
Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.124270 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b74vq\" (UniqueName: \"kubernetes.io/projected/0bb677d0-a346-4663-b989-13b846766c47-kube-api-access-b74vq\") pod \"placement-64d556c464-829sb\" (UID: \"0bb677d0-a346-4663-b989-13b846766c47\") " pod="openstack/placement-64d556c464-829sb"
Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.193781 4843 generic.go:334] "Generic (PLEG): container finished" podID="a9f366e3-403e-4dad-9f88-b734ab67badd" containerID="ab81560b275f4cda29ebbfac518765cf4a2f21b192499ae9564782848dc440db" exitCode=0
Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.193851 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-zztvz" event={"ID":"a9f366e3-403e-4dad-9f88-b734ab67badd","Type":"ContainerDied","Data":"ab81560b275f4cda29ebbfac518765cf4a2f21b192499ae9564782848dc440db"}
Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.208996 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fb7d6bc54-x5f24" event={"ID":"7321460c-9acb-47b8-b14c-d3f17cd937ec","Type":"ContainerStarted","Data":"fa81f9360ad799dfa6ed8d506fc0a45a50aec5c8f0871de611e40c7de41855b6"}
Mar 18 12:32:44 crc kubenswrapper[4843]: I0318 12:32:44.233702 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-64d556c464-829sb"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.461559 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6fb7d6bc54-x5f24"]
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.545061 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-55c9766bb-kdvz7"]
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.555676 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.558073 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.558294 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.566052 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55c9766bb-kdvz7"]
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.645022 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgxln\" (UniqueName: \"kubernetes.io/projected/c05a32ac-ec1d-497c-81be-4160787c43b3-kube-api-access-qgxln\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.645075 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c05a32ac-ec1d-497c-81be-4160787c43b3-public-tls-certs\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.645102 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c05a32ac-ec1d-497c-81be-4160787c43b3-combined-ca-bundle\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.645158 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c05a32ac-ec1d-497c-81be-4160787c43b3-logs\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.645211 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c05a32ac-ec1d-497c-81be-4160787c43b3-internal-tls-certs\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.645265 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c05a32ac-ec1d-497c-81be-4160787c43b3-config-data\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.645290 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c05a32ac-ec1d-497c-81be-4160787c43b3-config-data-custom\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.746608 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c05a32ac-ec1d-497c-81be-4160787c43b3-logs\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.746714 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c05a32ac-ec1d-497c-81be-4160787c43b3-internal-tls-certs\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.746783 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c05a32ac-ec1d-497c-81be-4160787c43b3-config-data\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.746806 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c05a32ac-ec1d-497c-81be-4160787c43b3-config-data-custom\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.746852 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgxln\" (UniqueName: \"kubernetes.io/projected/c05a32ac-ec1d-497c-81be-4160787c43b3-kube-api-access-qgxln\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.746868 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c05a32ac-ec1d-497c-81be-4160787c43b3-public-tls-certs\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.746888 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c05a32ac-ec1d-497c-81be-4160787c43b3-combined-ca-bundle\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.750099 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c05a32ac-ec1d-497c-81be-4160787c43b3-logs\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.756703 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c05a32ac-ec1d-497c-81be-4160787c43b3-combined-ca-bundle\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.761602 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c05a32ac-ec1d-497c-81be-4160787c43b3-config-data-custom\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.762142 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c05a32ac-ec1d-497c-81be-4160787c43b3-public-tls-certs\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.763177 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c05a32ac-ec1d-497c-81be-4160787c43b3-internal-tls-certs\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.764129 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c05a32ac-ec1d-497c-81be-4160787c43b3-config-data\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.774248 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgxln\" (UniqueName: \"kubernetes.io/projected/c05a32ac-ec1d-497c-81be-4160787c43b3-kube-api-access-qgxln\") pod \"barbican-api-55c9766bb-kdvz7\" (UID: \"c05a32ac-ec1d-497c-81be-4160787c43b3\") " pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:45 crc kubenswrapper[4843]: I0318 12:32:45.893024 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:46 crc kubenswrapper[4843]: I0318 12:32:46.648608 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 18 12:32:46 crc kubenswrapper[4843]: I0318 12:32:46.649355 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 18 12:32:46 crc kubenswrapper[4843]: I0318 12:32:46.649385 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 18 12:32:46 crc kubenswrapper[4843]: I0318 12:32:46.649397 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 18 12:32:46 crc kubenswrapper[4843]: I0318 12:32:46.649490 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 18 12:32:46 crc kubenswrapper[4843]: I0318 12:32:46.649533 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 18 12:32:46 crc kubenswrapper[4843]: I0318 12:32:46.649549 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 18 12:32:46 crc kubenswrapper[4843]: I0318 12:32:46.649558 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 18 12:32:46 crc kubenswrapper[4843]: I0318 12:32:46.713059 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 18 12:32:46 crc kubenswrapper[4843]: I0318 12:32:46.753549 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 18 12:32:46 crc kubenswrapper[4843]: I0318 12:32:46.758703 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 18 12:32:46 crc kubenswrapper[4843]: I0318 12:32:46.775006 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 18 12:32:50 crc kubenswrapper[4843]: I0318 12:32:50.633895 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 18 12:32:50 crc kubenswrapper[4843]: I0318 12:32:50.634445 4843 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 18 12:32:50 crc kubenswrapper[4843]: I0318 12:32:50.635535 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 18 12:32:50 crc kubenswrapper[4843]: I0318 12:32:50.654473 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 18 12:32:50 crc kubenswrapper[4843]: I0318 12:32:50.654575 4843 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 18 12:32:51 crc kubenswrapper[4843]: I0318 12:32:51.044571 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7dc96fcc9b-lt524" podUID="8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused"
Mar 18 12:32:51 crc kubenswrapper[4843]: I0318 12:32:51.151957 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5bf5d4bdcb-8xfkn" podUID="0199f761-6d2f-4921-8060-6960a0141f0a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.155:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.155:8443: connect: connection refused"
Mar 18 12:32:51 crc kubenswrapper[4843]: I0318 12:32:51.192253 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 18 12:32:52 crc kubenswrapper[4843]: W0318 12:32:52.115428 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8cb3725_0d98_427b_9c3f_4ae277b032c4.slice/crio-32a0ddb3a049496cbd1ea09ba264021fe8eb0dac4e1c632bea8e0c89012b81e6 WatchSource:0}: Error finding container 32a0ddb3a049496cbd1ea09ba264021fe8eb0dac4e1c632bea8e0c89012b81e6: Status 404 returned error can't find the container with id 32a0ddb3a049496cbd1ea09ba264021fe8eb0dac4e1c632bea8e0c89012b81e6
Mar 18 12:32:52 crc kubenswrapper[4843]: I0318 12:32:52.320230 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f4bfb58c4-dvx2n" event={"ID":"c8cb3725-0d98-427b-9c3f-4ae277b032c4","Type":"ContainerStarted","Data":"32a0ddb3a049496cbd1ea09ba264021fe8eb0dac4e1c632bea8e0c89012b81e6"}
Mar 18 12:32:52 crc kubenswrapper[4843]: I0318 12:32:52.693238 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55c9766bb-kdvz7"]
Mar 18 12:32:53 crc kubenswrapper[4843]: I0318 12:32:53.108695 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-656c8c855b-nmnnt"]
Mar 18 12:32:53 crc kubenswrapper[4843]: I0318 12:32:53.149718 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-94c7cb559-snwrc"]
Mar 18 12:32:53 crc kubenswrapper[4843]: I0318 12:32:53.451932 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6fb7d6bc54-x5f24" podUID="7321460c-9acb-47b8-b14c-d3f17cd937ec" containerName="barbican-api-log" containerID="cri-o://fa81f9360ad799dfa6ed8d506fc0a45a50aec5c8f0871de611e40c7de41855b6" gracePeriod=30
Mar 18 12:32:53 crc kubenswrapper[4843]: I0318 12:32:53.452187 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fb7d6bc54-x5f24" event={"ID":"7321460c-9acb-47b8-b14c-d3f17cd937ec","Type":"ContainerStarted","Data":"de5293daf4d4080c399e13311d5f89c19ae291442390ec8def56b3ccb3ab78c6"}
Mar 18 12:32:53 crc kubenswrapper[4843]: I0318 12:32:53.452219 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6fb7d6bc54-x5f24"
Mar 18 12:32:53 crc kubenswrapper[4843]: I0318 12:32:53.452526 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6fb7d6bc54-x5f24" podUID="7321460c-9acb-47b8-b14c-d3f17cd937ec" containerName="barbican-api" containerID="cri-o://de5293daf4d4080c399e13311d5f89c19ae291442390ec8def56b3ccb3ab78c6" gracePeriod=30
Mar 18 12:32:53 crc kubenswrapper[4843]: I0318 12:32:53.452739 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6fb7d6bc54-x5f24"
Mar 18 12:32:53 crc kubenswrapper[4843]: I0318 12:32:53.460683 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" event={"ID":"1d1350ab-77e3-446d-a13d-152395262970","Type":"ContainerStarted","Data":"82ddfad0cfa987509232f71963a8014a01554e8bef560e0efea1d87e53514aae"}
Mar 18 12:32:53 crc kubenswrapper[4843]: I0318 12:32:53.462435 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fb7d6bc54-x5f24" podUID="7321460c-9acb-47b8-b14c-d3f17cd937ec" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: connect: connection refused"
Mar 18 12:32:53 crc kubenswrapper[4843]: I0318 12:32:53.463247 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55c9766bb-kdvz7" event={"ID":"c05a32ac-ec1d-497c-81be-4160787c43b3","Type":"ContainerStarted","Data":"975b3c53a170d3d8edbd3dd516162b89efc2c21a0ae622f3253c54e3f9673307"}
Mar 18 12:32:53 crc kubenswrapper[4843]: I0318 12:32:53.473891 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-94c7cb559-snwrc" event={"ID":"599f937c-69d8-4281-b545-e97d4678bc9b","Type":"ContainerStarted","Data":"f726515627a2af07ec6581e420ad641606ceb40d71bc74081685e3b58ed06a3c"}
Mar 18 12:32:53 crc kubenswrapper[4843]: I0318 12:32:53.482325 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6cb8d6c886-ttksm"]
Mar 18 12:32:53 crc kubenswrapper[4843]: W0318 12:32:53.491960 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9e8a33c_889e_45e0_9cc2_60e054a3b5a9.slice/crio-6d11b728b4a758d7ca11cd72c858ae0390a5b398da1c06b06b31abb9e0ac304a WatchSource:0}: Error finding container 6d11b728b4a758d7ca11cd72c858ae0390a5b398da1c06b06b31abb9e0ac304a: Status 404 returned error can't find the container with id 6d11b728b4a758d7ca11cd72c858ae0390a5b398da1c06b06b31abb9e0ac304a
Mar 18 12:32:53 crc kubenswrapper[4843]: I0318 12:32:53.492978 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-64d556c464-829sb"]
Mar 18 12:32:53 crc kubenswrapper[4843]: I0318 12:32:53.503542 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6fb7d6bc54-x5f24" podStartSLOduration=13.50352289 podStartE2EDuration="13.50352289s" podCreationTimestamp="2026-03-18 12:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:53.479177048 +0000 UTC m=+1407.195002572" watchObservedRunningTime="2026-03-18 12:32:53.50352289 +0000 UTC m=+1407.219348414"
Mar 18 12:32:54 crc kubenswrapper[4843]: I0318 12:32:54.494593 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-zztvz" event={"ID":"a9f366e3-403e-4dad-9f88-b734ab67badd","Type":"ContainerStarted","Data":"705ee57ffa28a7ed735b651f092ba09ff40a3417da546f92d010b4691b2f02b6"}
Mar 18 12:32:54 crc kubenswrapper[4843]: I0318 12:32:54.498030 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688c87cc99-zztvz"
Mar 18 12:32:54 crc kubenswrapper[4843]: I0318 12:32:54.499433 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cb8d6c886-ttksm" event={"ID":"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9","Type":"ContainerStarted","Data":"6d11b728b4a758d7ca11cd72c858ae0390a5b398da1c06b06b31abb9e0ac304a"}
Mar 18 12:32:54 crc kubenswrapper[4843]: I0318 12:32:54.501663 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64d556c464-829sb" event={"ID":"0bb677d0-a346-4663-b989-13b846766c47","Type":"ContainerStarted","Data":"442ea705590bdda212d97a5eab4c743a57bfb1e853d3278e0a56d5ee022fe3ac"}
Mar 18 12:32:54 crc kubenswrapper[4843]: I0318 12:32:54.503761 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f4bfb58c4-dvx2n" event={"ID":"c8cb3725-0d98-427b-9c3f-4ae277b032c4","Type":"ContainerStarted","Data":"259715efdbac23012430c2dfe23f2ab60660af90b4171cc495f7269a5e24520d"}
Mar 18 12:32:54 crc kubenswrapper[4843]: I0318 12:32:54.505899 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-f4bfb58c4-dvx2n"
Mar 18 12:32:54 crc kubenswrapper[4843]: I0318 12:32:54.522910 4843 generic.go:334] "Generic (PLEG): container finished" podID="7321460c-9acb-47b8-b14c-d3f17cd937ec" containerID="fa81f9360ad799dfa6ed8d506fc0a45a50aec5c8f0871de611e40c7de41855b6" exitCode=143
Mar 18 12:32:54 crc kubenswrapper[4843]: I0318 12:32:54.523008 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fb7d6bc54-x5f24" event={"ID":"7321460c-9acb-47b8-b14c-d3f17cd937ec","Type":"ContainerDied","Data":"fa81f9360ad799dfa6ed8d506fc0a45a50aec5c8f0871de611e40c7de41855b6"}
Mar 18 12:32:54 crc kubenswrapper[4843]: I0318 12:32:54.523944 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688c87cc99-zztvz" podStartSLOduration=14.5239246 podStartE2EDuration="14.5239246s" podCreationTimestamp="2026-03-18 12:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:54.521997105 +0000 UTC m=+1408.237822639" watchObservedRunningTime="2026-03-18 12:32:54.5239246 +0000 UTC m=+1408.239750124"
Mar 18 12:32:54 crc kubenswrapper[4843]: I0318 12:32:54.527458 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" event={"ID":"daaeab38-2bfe-40e6-8137-4aea071dfc05","Type":"ContainerStarted","Data":"7e1727bdb6fe97f83e160ed178a5ba4282afff355d99140dedc7e049934103c1"}
Mar 18 12:32:54 crc kubenswrapper[4843]: I0318 12:32:54.552305 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-f4bfb58c4-dvx2n" podStartSLOduration=12.552286966 podStartE2EDuration="12.552286966s" podCreationTimestamp="2026-03-18 12:32:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:54.543228279 +0000 UTC m=+1408.259053803" watchObservedRunningTime="2026-03-18 12:32:54.552286966 +0000 UTC m=+1408.268112490"
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.558131 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cb8d6c886-ttksm" event={"ID":"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9","Type":"ContainerStarted","Data":"3e7f2eadd5891cac4e77075d55862553c17c1a859b82a7e8b2101caa1dbd6ea3"}
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.560353 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cb8d6c886-ttksm" event={"ID":"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9","Type":"ContainerStarted","Data":"53e8bd7305de49d562a11fb9cce9e3a9f795cd00a2a0f7840de46d4a98ef69b7"}
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.561133 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6cb8d6c886-ttksm"
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.561227 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6cb8d6c886-ttksm"
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.562988 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8648504-98a7-406e-9838-c8dc2d64ffe9","Type":"ContainerStarted","Data":"6a017f03a1efe8a6c123adf81e9c6c3551c21cab076b97b0f0a289aba1ab8853"}
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.564774 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" event={"ID":"5976626b-cd22-4f02-bd2b-d5d452e898c2","Type":"ContainerStarted","Data":"222234c782a5b0877b51e9b082cafe8a0f6cf79dec24a742dde67cb467f31aba"}
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.565880 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9tcmj" event={"ID":"16d32f17-ac20-4f6d-8e00-db5fdafdc210","Type":"ContainerStarted","Data":"2abf3ef0293d4755cd999d2ba56202726ddd7ea924d350e6620b91a8018db2e6"}
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.567812 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" event={"ID":"1d1350ab-77e3-446d-a13d-152395262970","Type":"ContainerStarted","Data":"9a37f70a7433b61f357c74e4de428b360ca97b083a508c2cb4c142f37aab4c31"}
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.569381 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" event={"ID":"daaeab38-2bfe-40e6-8137-4aea071dfc05","Type":"ContainerStarted","Data":"9e773a60ec972bca9e44187615005697e20169783057368665a5a1c2a26911b3"}
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.571774 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64d556c464-829sb" event={"ID":"0bb677d0-a346-4663-b989-13b846766c47","Type":"ContainerStarted","Data":"1756c0ca1d9620d384a7d55c3f188af2a5266e9972e39486e183a5bc27afee2e"}
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.572159 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64d556c464-829sb" event={"ID":"0bb677d0-a346-4663-b989-13b846766c47","Type":"ContainerStarted","Data":"d2b79b65429b68228c419ef00890ffd4485c5b09758e7499dea539fa93d9f9db"}
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.572940 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-64d556c464-829sb"
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.573030 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-64d556c464-829sb"
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.574955 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55c9766bb-kdvz7" event={"ID":"c05a32ac-ec1d-497c-81be-4160787c43b3","Type":"ContainerStarted","Data":"381e13378b2cf4d2976055bd0a46547486b56c0449f950b474a651d2eb52ffec"}
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.575789 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55c9766bb-kdvz7" event={"ID":"c05a32ac-ec1d-497c-81be-4160787c43b3","Type":"ContainerStarted","Data":"2ae7bbcd7fad03f96babfbd01abe211c8774f9d894ba42a739b8b4cbb7ef3922"}
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.575876 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.575955 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.582014 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-94c7cb559-snwrc" event={"ID":"599f937c-69d8-4281-b545-e97d4678bc9b","Type":"ContainerStarted","Data":"d3329f710192195e7787f38cb8f7d2038258f738110adb9b6b3f22e214f58f25"}
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.582040 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-94c7cb559-snwrc" event={"ID":"599f937c-69d8-4281-b545-e97d4678bc9b","Type":"ContainerStarted","Data":"ebf1e9697b37446664579d050a232dd075dbd0e34c6b2f857eaaa08601bbe23e"}
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.604716 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6cb8d6c886-ttksm" podStartSLOduration=12.604698515 podStartE2EDuration="12.604698515s" podCreationTimestamp="2026-03-18 12:32:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:55.595046811 +0000 UTC m=+1409.310872345" watchObservedRunningTime="2026-03-18 12:32:55.604698515 +0000 UTC m=+1409.320524039"
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.626932 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-9tcmj" podStartSLOduration=5.490293282 podStartE2EDuration="54.626902576s" podCreationTimestamp="2026-03-18 12:32:01 +0000 UTC" firstStartedPulling="2026-03-18 12:32:03.365241638 +0000 UTC m=+1357.081067162" lastFinishedPulling="2026-03-18 12:32:52.501850932 +0000 UTC m=+1406.217676456" observedRunningTime="2026-03-18 12:32:55.614462273 +0000 UTC m=+1409.330287797" watchObservedRunningTime="2026-03-18 12:32:55.626902576 +0000 UTC m=+1409.342728100"
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.640406 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" podStartSLOduration=5.393584284 podStartE2EDuration="15.64038431s" podCreationTimestamp="2026-03-18 12:32:40 +0000 UTC" firstStartedPulling="2026-03-18 12:32:42.04304664 +0000 UTC m=+1395.758872164" lastFinishedPulling="2026-03-18 12:32:52.289846656 +0000 UTC m=+1406.005672190" observedRunningTime="2026-03-18 12:32:55.639595677 +0000 UTC m=+1409.355421201" watchObservedRunningTime="2026-03-18 12:32:55.64038431 +0000 UTC m=+1409.356209834"
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.677520 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-64d556c464-829sb" podStartSLOduration=12.677499184 podStartE2EDuration="12.677499184s" podCreationTimestamp="2026-03-18 12:32:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:55.667601633 +0000 UTC m=+1409.383427157" watchObservedRunningTime="2026-03-18 12:32:55.677499184 +0000 UTC m=+1409.393324708"
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.695671 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-94c7cb559-snwrc" podStartSLOduration=12.69564166 podStartE2EDuration="12.69564166s" podCreationTimestamp="2026-03-18 12:32:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:55.693931771 +0000 UTC m=+1409.409757295" watchObservedRunningTime="2026-03-18 12:32:55.69564166 +0000 UTC m=+1409.411467184"
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.722933 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7bffcb8fb5-8ktgs"]
Mar 18 12:32:55 crc kubenswrapper[4843]: I0318 12:32:55.752143 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-55c9766bb-kdvz7" podStartSLOduration=10.752112985 podStartE2EDuration="10.752112985s" podCreationTimestamp="2026-03-18 12:32:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:55.724528751 +0000 UTC m=+1409.440354295" watchObservedRunningTime="2026-03-18 12:32:55.752112985 +0000 UTC m=+1409.467938509"
Mar 18 12:32:56 crc kubenswrapper[4843]: I0318 12:32:56.667213 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" event={"ID":"5976626b-cd22-4f02-bd2b-d5d452e898c2","Type":"ContainerStarted","Data":"55dc6bf23562963f96a096162fd5b756df02786fb60f524f6ce2b305575d8897"}
Mar 18 12:32:56 crc kubenswrapper[4843]: I0318 12:32:56.729708 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" podStartSLOduration=6.141990124 podStartE2EDuration="16.729674508s" podCreationTimestamp="2026-03-18 12:32:40 +0000 UTC" firstStartedPulling="2026-03-18 12:32:41.767851299 +0000 UTC m=+1395.483676823" lastFinishedPulling="2026-03-18 12:32:52.355535693 +0000 UTC m=+1406.071361207" observedRunningTime="2026-03-18 12:32:56.707383454 +0000 UTC m=+1410.423208978" watchObservedRunningTime="2026-03-18 12:32:56.729674508 +0000 UTC m=+1410.445500032"
Mar 18 12:32:57 crc kubenswrapper[4843]: I0318 12:32:57.654193 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-hxxp9" podUID="f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.94:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 18 12:32:57 crc kubenswrapper[4843]: I0318 12:32:57.677792 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" podUID="daaeab38-2bfe-40e6-8137-4aea071dfc05" containerName="barbican-worker-log" containerID="cri-o://7e1727bdb6fe97f83e160ed178a5ba4282afff355d99140dedc7e049934103c1" gracePeriod=30
Mar 18 12:32:57 crc kubenswrapper[4843]: I0318 12:32:57.678836 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" event={"ID":"1d1350ab-77e3-446d-a13d-152395262970","Type":"ContainerStarted","Data":"9ba2887ae1809233b1f28260aa61deec99426ebd99018e12cc1b32d9545da575"}
Mar 18 12:32:57 crc kubenswrapper[4843]: I0318 12:32:57.679954 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" podUID="daaeab38-2bfe-40e6-8137-4aea071dfc05" containerName="barbican-worker" containerID="cri-o://9e773a60ec972bca9e44187615005697e20169783057368665a5a1c2a26911b3" gracePeriod=30
Mar 18 12:32:57 crc kubenswrapper[4843]: I0318 12:32:57.699198 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-656c8c855b-nmnnt" podStartSLOduration=14.699175001 podStartE2EDuration="14.699175001s" podCreationTimestamp="2026-03-18 12:32:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:57.695035334 +0000 UTC m=+1411.410860858" watchObservedRunningTime="2026-03-18 12:32:57.699175001 +0000 UTC m=+1411.415000525"
Mar 18 12:32:57 crc kubenswrapper[4843]: I0318 12:32:57.726065 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6db4d7d84b-xpgq7"]
Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.512550 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.582543 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6fb7d6bc54-x5f24" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.647856 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daaeab38-2bfe-40e6-8137-4aea071dfc05-config-data\") pod \"daaeab38-2bfe-40e6-8137-4aea071dfc05\" (UID: \"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.647960 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/daaeab38-2bfe-40e6-8137-4aea071dfc05-logs\") pod \"daaeab38-2bfe-40e6-8137-4aea071dfc05\" (UID: \"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.647985 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/daaeab38-2bfe-40e6-8137-4aea071dfc05-config-data-custom\") pod \"daaeab38-2bfe-40e6-8137-4aea071dfc05\" (UID: \"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.648096 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daaeab38-2bfe-40e6-8137-4aea071dfc05-combined-ca-bundle\") pod \"daaeab38-2bfe-40e6-8137-4aea071dfc05\" (UID: \"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.648271 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjjfg\" (UniqueName: \"kubernetes.io/projected/daaeab38-2bfe-40e6-8137-4aea071dfc05-kube-api-access-wjjfg\") pod \"daaeab38-2bfe-40e6-8137-4aea071dfc05\" (UID: 
\"daaeab38-2bfe-40e6-8137-4aea071dfc05\") " Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.648431 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daaeab38-2bfe-40e6-8137-4aea071dfc05-logs" (OuterVolumeSpecName: "logs") pod "daaeab38-2bfe-40e6-8137-4aea071dfc05" (UID: "daaeab38-2bfe-40e6-8137-4aea071dfc05"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.649325 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/daaeab38-2bfe-40e6-8137-4aea071dfc05-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.669893 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daaeab38-2bfe-40e6-8137-4aea071dfc05-kube-api-access-wjjfg" (OuterVolumeSpecName: "kube-api-access-wjjfg") pod "daaeab38-2bfe-40e6-8137-4aea071dfc05" (UID: "daaeab38-2bfe-40e6-8137-4aea071dfc05"). InnerVolumeSpecName "kube-api-access-wjjfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.670248 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daaeab38-2bfe-40e6-8137-4aea071dfc05-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "daaeab38-2bfe-40e6-8137-4aea071dfc05" (UID: "daaeab38-2bfe-40e6-8137-4aea071dfc05"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.680789 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daaeab38-2bfe-40e6-8137-4aea071dfc05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "daaeab38-2bfe-40e6-8137-4aea071dfc05" (UID: "daaeab38-2bfe-40e6-8137-4aea071dfc05"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.692254 4843 generic.go:334] "Generic (PLEG): container finished" podID="daaeab38-2bfe-40e6-8137-4aea071dfc05" containerID="9e773a60ec972bca9e44187615005697e20169783057368665a5a1c2a26911b3" exitCode=0 Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.692317 4843 generic.go:334] "Generic (PLEG): container finished" podID="daaeab38-2bfe-40e6-8137-4aea071dfc05" containerID="7e1727bdb6fe97f83e160ed178a5ba4282afff355d99140dedc7e049934103c1" exitCode=143 Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.692334 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.692418 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" event={"ID":"daaeab38-2bfe-40e6-8137-4aea071dfc05","Type":"ContainerDied","Data":"9e773a60ec972bca9e44187615005697e20169783057368665a5a1c2a26911b3"} Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.692448 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" event={"ID":"daaeab38-2bfe-40e6-8137-4aea071dfc05","Type":"ContainerDied","Data":"7e1727bdb6fe97f83e160ed178a5ba4282afff355d99140dedc7e049934103c1"} Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.692459 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bffcb8fb5-8ktgs" event={"ID":"daaeab38-2bfe-40e6-8137-4aea071dfc05","Type":"ContainerDied","Data":"b85d8f133514b59e520e1c736f7563e325a46f369bb068f926dd7c23d487b013"} Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.692488 4843 scope.go:117] "RemoveContainer" containerID="9e773a60ec972bca9e44187615005697e20169783057368665a5a1c2a26911b3" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.693066 4843 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" podUID="5976626b-cd22-4f02-bd2b-d5d452e898c2" containerName="barbican-keystone-listener-log" containerID="cri-o://222234c782a5b0877b51e9b082cafe8a0f6cf79dec24a742dde67cb467f31aba" gracePeriod=30 Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.693137 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" podUID="5976626b-cd22-4f02-bd2b-d5d452e898c2" containerName="barbican-keystone-listener" containerID="cri-o://55dc6bf23562963f96a096162fd5b756df02786fb60f524f6ce2b305575d8897" gracePeriod=30 Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.712694 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daaeab38-2bfe-40e6-8137-4aea071dfc05-config-data" (OuterVolumeSpecName: "config-data") pod "daaeab38-2bfe-40e6-8137-4aea071dfc05" (UID: "daaeab38-2bfe-40e6-8137-4aea071dfc05"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.731492 4843 scope.go:117] "RemoveContainer" containerID="7e1727bdb6fe97f83e160ed178a5ba4282afff355d99140dedc7e049934103c1" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.751268 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daaeab38-2bfe-40e6-8137-4aea071dfc05-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.751299 4843 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/daaeab38-2bfe-40e6-8137-4aea071dfc05-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.751308 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daaeab38-2bfe-40e6-8137-4aea071dfc05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.751320 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjjfg\" (UniqueName: \"kubernetes.io/projected/daaeab38-2bfe-40e6-8137-4aea071dfc05-kube-api-access-wjjfg\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.756340 4843 scope.go:117] "RemoveContainer" containerID="9e773a60ec972bca9e44187615005697e20169783057368665a5a1c2a26911b3" Mar 18 12:32:58 crc kubenswrapper[4843]: E0318 12:32:58.762500 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e773a60ec972bca9e44187615005697e20169783057368665a5a1c2a26911b3\": container with ID starting with 9e773a60ec972bca9e44187615005697e20169783057368665a5a1c2a26911b3 not found: ID does not exist" containerID="9e773a60ec972bca9e44187615005697e20169783057368665a5a1c2a26911b3" Mar 18 12:32:58 crc 
kubenswrapper[4843]: I0318 12:32:58.762543 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e773a60ec972bca9e44187615005697e20169783057368665a5a1c2a26911b3"} err="failed to get container status \"9e773a60ec972bca9e44187615005697e20169783057368665a5a1c2a26911b3\": rpc error: code = NotFound desc = could not find container \"9e773a60ec972bca9e44187615005697e20169783057368665a5a1c2a26911b3\": container with ID starting with 9e773a60ec972bca9e44187615005697e20169783057368665a5a1c2a26911b3 not found: ID does not exist" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.762568 4843 scope.go:117] "RemoveContainer" containerID="7e1727bdb6fe97f83e160ed178a5ba4282afff355d99140dedc7e049934103c1" Mar 18 12:32:58 crc kubenswrapper[4843]: E0318 12:32:58.762935 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e1727bdb6fe97f83e160ed178a5ba4282afff355d99140dedc7e049934103c1\": container with ID starting with 7e1727bdb6fe97f83e160ed178a5ba4282afff355d99140dedc7e049934103c1 not found: ID does not exist" containerID="7e1727bdb6fe97f83e160ed178a5ba4282afff355d99140dedc7e049934103c1" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.762985 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e1727bdb6fe97f83e160ed178a5ba4282afff355d99140dedc7e049934103c1"} err="failed to get container status \"7e1727bdb6fe97f83e160ed178a5ba4282afff355d99140dedc7e049934103c1\": rpc error: code = NotFound desc = could not find container \"7e1727bdb6fe97f83e160ed178a5ba4282afff355d99140dedc7e049934103c1\": container with ID starting with 7e1727bdb6fe97f83e160ed178a5ba4282afff355d99140dedc7e049934103c1 not found: ID does not exist" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.763049 4843 scope.go:117] "RemoveContainer" containerID="9e773a60ec972bca9e44187615005697e20169783057368665a5a1c2a26911b3" Mar 18 
12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.763483 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e773a60ec972bca9e44187615005697e20169783057368665a5a1c2a26911b3"} err="failed to get container status \"9e773a60ec972bca9e44187615005697e20169783057368665a5a1c2a26911b3\": rpc error: code = NotFound desc = could not find container \"9e773a60ec972bca9e44187615005697e20169783057368665a5a1c2a26911b3\": container with ID starting with 9e773a60ec972bca9e44187615005697e20169783057368665a5a1c2a26911b3 not found: ID does not exist" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.763529 4843 scope.go:117] "RemoveContainer" containerID="7e1727bdb6fe97f83e160ed178a5ba4282afff355d99140dedc7e049934103c1" Mar 18 12:32:58 crc kubenswrapper[4843]: I0318 12:32:58.763810 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e1727bdb6fe97f83e160ed178a5ba4282afff355d99140dedc7e049934103c1"} err="failed to get container status \"7e1727bdb6fe97f83e160ed178a5ba4282afff355d99140dedc7e049934103c1\": rpc error: code = NotFound desc = could not find container \"7e1727bdb6fe97f83e160ed178a5ba4282afff355d99140dedc7e049934103c1\": container with ID starting with 7e1727bdb6fe97f83e160ed178a5ba4282afff355d99140dedc7e049934103c1 not found: ID does not exist" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.033073 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7bffcb8fb5-8ktgs"] Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.043163 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7bffcb8fb5-8ktgs"] Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.472471 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.565194 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5976626b-cd22-4f02-bd2b-d5d452e898c2-config-data-custom\") pod \"5976626b-cd22-4f02-bd2b-d5d452e898c2\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.565282 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n52xd\" (UniqueName: \"kubernetes.io/projected/5976626b-cd22-4f02-bd2b-d5d452e898c2-kube-api-access-n52xd\") pod \"5976626b-cd22-4f02-bd2b-d5d452e898c2\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.565324 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5976626b-cd22-4f02-bd2b-d5d452e898c2-logs\") pod \"5976626b-cd22-4f02-bd2b-d5d452e898c2\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.565409 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5976626b-cd22-4f02-bd2b-d5d452e898c2-config-data\") pod \"5976626b-cd22-4f02-bd2b-d5d452e898c2\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.565446 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5976626b-cd22-4f02-bd2b-d5d452e898c2-combined-ca-bundle\") pod \"5976626b-cd22-4f02-bd2b-d5d452e898c2\" (UID: \"5976626b-cd22-4f02-bd2b-d5d452e898c2\") " Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.566998 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5976626b-cd22-4f02-bd2b-d5d452e898c2-logs" (OuterVolumeSpecName: "logs") pod "5976626b-cd22-4f02-bd2b-d5d452e898c2" (UID: "5976626b-cd22-4f02-bd2b-d5d452e898c2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.570258 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5976626b-cd22-4f02-bd2b-d5d452e898c2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5976626b-cd22-4f02-bd2b-d5d452e898c2" (UID: "5976626b-cd22-4f02-bd2b-d5d452e898c2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.573816 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5976626b-cd22-4f02-bd2b-d5d452e898c2-kube-api-access-n52xd" (OuterVolumeSpecName: "kube-api-access-n52xd") pod "5976626b-cd22-4f02-bd2b-d5d452e898c2" (UID: "5976626b-cd22-4f02-bd2b-d5d452e898c2"). InnerVolumeSpecName "kube-api-access-n52xd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.605001 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5976626b-cd22-4f02-bd2b-d5d452e898c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5976626b-cd22-4f02-bd2b-d5d452e898c2" (UID: "5976626b-cd22-4f02-bd2b-d5d452e898c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.627243 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5976626b-cd22-4f02-bd2b-d5d452e898c2-config-data" (OuterVolumeSpecName: "config-data") pod "5976626b-cd22-4f02-bd2b-d5d452e898c2" (UID: "5976626b-cd22-4f02-bd2b-d5d452e898c2"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.671414 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n52xd\" (UniqueName: \"kubernetes.io/projected/5976626b-cd22-4f02-bd2b-d5d452e898c2-kube-api-access-n52xd\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.671460 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5976626b-cd22-4f02-bd2b-d5d452e898c2-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.671473 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5976626b-cd22-4f02-bd2b-d5d452e898c2-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.671483 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5976626b-cd22-4f02-bd2b-d5d452e898c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.671493 4843 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5976626b-cd22-4f02-bd2b-d5d452e898c2-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.726917 4843 generic.go:334] "Generic (PLEG): container finished" podID="5976626b-cd22-4f02-bd2b-d5d452e898c2" containerID="55dc6bf23562963f96a096162fd5b756df02786fb60f524f6ce2b305575d8897" exitCode=0 Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.726960 4843 generic.go:334] "Generic (PLEG): container finished" podID="5976626b-cd22-4f02-bd2b-d5d452e898c2" containerID="222234c782a5b0877b51e9b082cafe8a0f6cf79dec24a742dde67cb467f31aba" exitCode=143 Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.726986 
4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" event={"ID":"5976626b-cd22-4f02-bd2b-d5d452e898c2","Type":"ContainerDied","Data":"55dc6bf23562963f96a096162fd5b756df02786fb60f524f6ce2b305575d8897"} Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.727018 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" event={"ID":"5976626b-cd22-4f02-bd2b-d5d452e898c2","Type":"ContainerDied","Data":"222234c782a5b0877b51e9b082cafe8a0f6cf79dec24a742dde67cb467f31aba"} Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.727036 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" event={"ID":"5976626b-cd22-4f02-bd2b-d5d452e898c2","Type":"ContainerDied","Data":"242e1e24848de1bc5684c3aa0a072226fccf06f24820ee287c2fa1e0735721c8"} Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.727063 4843 scope.go:117] "RemoveContainer" containerID="55dc6bf23562963f96a096162fd5b756df02786fb60f524f6ce2b305575d8897" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.727255 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6db4d7d84b-xpgq7" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.782538 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6db4d7d84b-xpgq7"] Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.791663 4843 scope.go:117] "RemoveContainer" containerID="222234c782a5b0877b51e9b082cafe8a0f6cf79dec24a742dde67cb467f31aba" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.792946 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-6db4d7d84b-xpgq7"] Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.852802 4843 scope.go:117] "RemoveContainer" containerID="55dc6bf23562963f96a096162fd5b756df02786fb60f524f6ce2b305575d8897" Mar 18 12:32:59 crc kubenswrapper[4843]: E0318 12:32:59.853303 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55dc6bf23562963f96a096162fd5b756df02786fb60f524f6ce2b305575d8897\": container with ID starting with 55dc6bf23562963f96a096162fd5b756df02786fb60f524f6ce2b305575d8897 not found: ID does not exist" containerID="55dc6bf23562963f96a096162fd5b756df02786fb60f524f6ce2b305575d8897" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.853354 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55dc6bf23562963f96a096162fd5b756df02786fb60f524f6ce2b305575d8897"} err="failed to get container status \"55dc6bf23562963f96a096162fd5b756df02786fb60f524f6ce2b305575d8897\": rpc error: code = NotFound desc = could not find container \"55dc6bf23562963f96a096162fd5b756df02786fb60f524f6ce2b305575d8897\": container with ID starting with 55dc6bf23562963f96a096162fd5b756df02786fb60f524f6ce2b305575d8897 not found: ID does not exist" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.853384 4843 scope.go:117] "RemoveContainer" 
containerID="222234c782a5b0877b51e9b082cafe8a0f6cf79dec24a742dde67cb467f31aba" Mar 18 12:32:59 crc kubenswrapper[4843]: E0318 12:32:59.853986 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"222234c782a5b0877b51e9b082cafe8a0f6cf79dec24a742dde67cb467f31aba\": container with ID starting with 222234c782a5b0877b51e9b082cafe8a0f6cf79dec24a742dde67cb467f31aba not found: ID does not exist" containerID="222234c782a5b0877b51e9b082cafe8a0f6cf79dec24a742dde67cb467f31aba" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.854049 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"222234c782a5b0877b51e9b082cafe8a0f6cf79dec24a742dde67cb467f31aba"} err="failed to get container status \"222234c782a5b0877b51e9b082cafe8a0f6cf79dec24a742dde67cb467f31aba\": rpc error: code = NotFound desc = could not find container \"222234c782a5b0877b51e9b082cafe8a0f6cf79dec24a742dde67cb467f31aba\": container with ID starting with 222234c782a5b0877b51e9b082cafe8a0f6cf79dec24a742dde67cb467f31aba not found: ID does not exist" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.854080 4843 scope.go:117] "RemoveContainer" containerID="55dc6bf23562963f96a096162fd5b756df02786fb60f524f6ce2b305575d8897" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.854433 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55dc6bf23562963f96a096162fd5b756df02786fb60f524f6ce2b305575d8897"} err="failed to get container status \"55dc6bf23562963f96a096162fd5b756df02786fb60f524f6ce2b305575d8897\": rpc error: code = NotFound desc = could not find container \"55dc6bf23562963f96a096162fd5b756df02786fb60f524f6ce2b305575d8897\": container with ID starting with 55dc6bf23562963f96a096162fd5b756df02786fb60f524f6ce2b305575d8897 not found: ID does not exist" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.854465 4843 scope.go:117] 
"RemoveContainer" containerID="222234c782a5b0877b51e9b082cafe8a0f6cf79dec24a742dde67cb467f31aba" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.854856 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"222234c782a5b0877b51e9b082cafe8a0f6cf79dec24a742dde67cb467f31aba"} err="failed to get container status \"222234c782a5b0877b51e9b082cafe8a0f6cf79dec24a742dde67cb467f31aba\": rpc error: code = NotFound desc = could not find container \"222234c782a5b0877b51e9b082cafe8a0f6cf79dec24a742dde67cb467f31aba\": container with ID starting with 222234c782a5b0877b51e9b082cafe8a0f6cf79dec24a742dde67cb467f31aba not found: ID does not exist" Mar 18 12:32:59 crc kubenswrapper[4843]: I0318 12:32:59.951281 4843 scope.go:117] "RemoveContainer" containerID="5fe5d24533ad2ffb865be576a8383e9595bcec8f7b4c831fc413219d301d176d" Mar 18 12:33:00 crc kubenswrapper[4843]: I0318 12:33:00.510758 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6cb8d6c886-ttksm" Mar 18 12:33:00 crc kubenswrapper[4843]: I0318 12:33:00.743350 4843 generic.go:334] "Generic (PLEG): container finished" podID="16d32f17-ac20-4f6d-8e00-db5fdafdc210" containerID="2abf3ef0293d4755cd999d2ba56202726ddd7ea924d350e6620b91a8018db2e6" exitCode=0 Mar 18 12:33:00 crc kubenswrapper[4843]: I0318 12:33:00.743401 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9tcmj" event={"ID":"16d32f17-ac20-4f6d-8e00-db5fdafdc210","Type":"ContainerDied","Data":"2abf3ef0293d4755cd999d2ba56202726ddd7ea924d350e6620b91a8018db2e6"} Mar 18 12:33:01 crc kubenswrapper[4843]: I0318 12:33:01.001056 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5976626b-cd22-4f02-bd2b-d5d452e898c2" path="/var/lib/kubelet/pods/5976626b-cd22-4f02-bd2b-d5d452e898c2/volumes" Mar 18 12:33:01 crc kubenswrapper[4843]: I0318 12:33:01.002411 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="daaeab38-2bfe-40e6-8137-4aea071dfc05" path="/var/lib/kubelet/pods/daaeab38-2bfe-40e6-8137-4aea071dfc05/volumes"
Mar 18 12:33:01 crc kubenswrapper[4843]: I0318 12:33:01.046449 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7dc96fcc9b-lt524" podUID="8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused"
Mar 18 12:33:01 crc kubenswrapper[4843]: I0318 12:33:01.152064 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5bf5d4bdcb-8xfkn" podUID="0199f761-6d2f-4921-8060-6960a0141f0a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.155:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.155:8443: connect: connection refused"
Mar 18 12:33:01 crc kubenswrapper[4843]: I0318 12:33:01.449804 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688c87cc99-zztvz"
Mar 18 12:33:01 crc kubenswrapper[4843]: I0318 12:33:01.523971 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-hnqhb"]
Mar 18 12:33:01 crc kubenswrapper[4843]: I0318 12:33:01.525947 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" podUID="d70be5ea-b453-4cb8-8228-d46629c4ac42" containerName="dnsmasq-dns" containerID="cri-o://0520631e5f41d20b6d8791ff85c855d49835049d6868f59e3f4c669cc7543ccd" gracePeriod=10
Mar 18 12:33:01 crc kubenswrapper[4843]: I0318 12:33:01.757295 4843 generic.go:334] "Generic (PLEG): container finished" podID="d70be5ea-b453-4cb8-8228-d46629c4ac42" containerID="0520631e5f41d20b6d8791ff85c855d49835049d6868f59e3f4c669cc7543ccd" exitCode=0
Mar 18 12:33:01 crc kubenswrapper[4843]: I0318 12:33:01.757340 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" event={"ID":"d70be5ea-b453-4cb8-8228-d46629c4ac42","Type":"ContainerDied","Data":"0520631e5f41d20b6d8791ff85c855d49835049d6868f59e3f4c669cc7543ccd"}
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.214924 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-55f88f498d-gppxz"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.478421 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-74ccddfbbf-d44d7"]
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.478770 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-74ccddfbbf-d44d7" podUID="67c6737f-51b5-4262-9bd4-fe8afcb262ad" containerName="neutron-api" containerID="cri-o://ebfc5b6a6da817c7161344f5173ae8dffd127485c51c681c5f7f5beac79e8c00" gracePeriod=30
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.480332 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-74ccddfbbf-d44d7" podUID="67c6737f-51b5-4262-9bd4-fe8afcb262ad" containerName="neutron-httpd" containerID="cri-o://54367971f1ac999dc3e4856a8fb90d7293cc68bcd2ef1102827ff4a6c7481412" gracePeriod=30
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.491069 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-74ccddfbbf-d44d7" podUID="67c6737f-51b5-4262-9bd4-fe8afcb262ad" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.161:9696/\": read tcp 10.217.0.2:49082->10.217.0.161:9696: read: connection reset by peer"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.540723 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7555f69bf7-z6jmw"]
Mar 18 12:33:02 crc kubenswrapper[4843]: E0318 12:33:02.541234 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5976626b-cd22-4f02-bd2b-d5d452e898c2" containerName="barbican-keystone-listener"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.541246 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="5976626b-cd22-4f02-bd2b-d5d452e898c2" containerName="barbican-keystone-listener"
Mar 18 12:33:02 crc kubenswrapper[4843]: E0318 12:33:02.541267 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daaeab38-2bfe-40e6-8137-4aea071dfc05" containerName="barbican-worker-log"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.541273 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="daaeab38-2bfe-40e6-8137-4aea071dfc05" containerName="barbican-worker-log"
Mar 18 12:33:02 crc kubenswrapper[4843]: E0318 12:33:02.541287 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5976626b-cd22-4f02-bd2b-d5d452e898c2" containerName="barbican-keystone-listener-log"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.541293 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="5976626b-cd22-4f02-bd2b-d5d452e898c2" containerName="barbican-keystone-listener-log"
Mar 18 12:33:02 crc kubenswrapper[4843]: E0318 12:33:02.541308 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daaeab38-2bfe-40e6-8137-4aea071dfc05" containerName="barbican-worker"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.541324 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="daaeab38-2bfe-40e6-8137-4aea071dfc05" containerName="barbican-worker"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.541499 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="5976626b-cd22-4f02-bd2b-d5d452e898c2" containerName="barbican-keystone-listener-log"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.541511 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="daaeab38-2bfe-40e6-8137-4aea071dfc05" containerName="barbican-worker"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.541520 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="5976626b-cd22-4f02-bd2b-d5d452e898c2" containerName="barbican-keystone-listener"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.541532 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="daaeab38-2bfe-40e6-8137-4aea071dfc05" containerName="barbican-worker-log"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.542636 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.561291 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7555f69bf7-z6jmw"]
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.679798 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-config\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.680033 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqshl\" (UniqueName: \"kubernetes.io/projected/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-kube-api-access-sqshl\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.680125 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-public-tls-certs\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.680163 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-internal-tls-certs\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.680253 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-combined-ca-bundle\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.680279 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-httpd-config\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.680334 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-ovndb-tls-certs\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.771975 4843 generic.go:334] "Generic (PLEG): container finished" podID="67c6737f-51b5-4262-9bd4-fe8afcb262ad" containerID="54367971f1ac999dc3e4856a8fb90d7293cc68bcd2ef1102827ff4a6c7481412" exitCode=0
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.772025 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74ccddfbbf-d44d7" event={"ID":"67c6737f-51b5-4262-9bd4-fe8afcb262ad","Type":"ContainerDied","Data":"54367971f1ac999dc3e4856a8fb90d7293cc68bcd2ef1102827ff4a6c7481412"}
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.782061 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqshl\" (UniqueName: \"kubernetes.io/projected/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-kube-api-access-sqshl\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.782137 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-public-tls-certs\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.782160 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-internal-tls-certs\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.782205 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-combined-ca-bundle\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.782225 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-httpd-config\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.782278 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-ovndb-tls-certs\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.782339 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-config\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.797415 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-internal-tls-certs\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.799755 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-config\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.801356 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-ovndb-tls-certs\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.801988 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-combined-ca-bundle\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.806055 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-httpd-config\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.832191 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqshl\" (UniqueName: \"kubernetes.io/projected/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-kube-api-access-sqshl\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.832776 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe913d5-4e12-4fdc-9968-5caaa8aa1271-public-tls-certs\") pod \"neutron-7555f69bf7-z6jmw\" (UID: \"ebe913d5-4e12-4fdc-9968-5caaa8aa1271\") " pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.880326 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.938159 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:33:02 crc kubenswrapper[4843]: I0318 12:33:02.938243 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" podUID="d70be5ea-b453-4cb8-8228-d46629c4ac42" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.151:5353: connect: connection refused"
Mar 18 12:33:03 crc kubenswrapper[4843]: I0318 12:33:03.169105 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6cb8d6c886-ttksm"
Mar 18 12:33:03 crc kubenswrapper[4843]: I0318 12:33:03.391512 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55c9766bb-kdvz7"
Mar 18 12:33:03 crc kubenswrapper[4843]: I0318 12:33:03.469647 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6cb8d6c886-ttksm"]
Mar 18 12:33:03 crc kubenswrapper[4843]: I0318 12:33:03.781133 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6cb8d6c886-ttksm" podUID="d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" containerName="barbican-api-log" containerID="cri-o://53e8bd7305de49d562a11fb9cce9e3a9f795cd00a2a0f7840de46d4a98ef69b7" gracePeriod=30
Mar 18 12:33:03 crc kubenswrapper[4843]: I0318 12:33:03.781250 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6cb8d6c886-ttksm" podUID="d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" containerName="barbican-api" containerID="cri-o://3e7f2eadd5891cac4e77075d55862553c17c1a859b82a7e8b2101caa1dbd6ea3" gracePeriod=30
Mar 18 12:33:03 crc kubenswrapper[4843]: I0318 12:33:03.801384 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6cb8d6c886-ttksm" podUID="d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.170:9311/healthcheck\": EOF"
Mar 18 12:33:03 crc kubenswrapper[4843]: I0318 12:33:03.801644 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6cb8d6c886-ttksm" podUID="d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.170:9311/healthcheck\": EOF"
Mar 18 12:33:03 crc kubenswrapper[4843]: I0318 12:33:03.983522 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6fb7d6bc54-x5f24"
Mar 18 12:33:04 crc kubenswrapper[4843]: I0318 12:33:04.798519 4843 generic.go:334] "Generic (PLEG): container finished" podID="d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" containerID="53e8bd7305de49d562a11fb9cce9e3a9f795cd00a2a0f7840de46d4a98ef69b7" exitCode=143
Mar 18 12:33:04 crc kubenswrapper[4843]: I0318 12:33:04.798566 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cb8d6c886-ttksm" event={"ID":"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9","Type":"ContainerDied","Data":"53e8bd7305de49d562a11fb9cce9e3a9f795cd00a2a0f7840de46d4a98ef69b7"}
Mar 18 12:33:04 crc kubenswrapper[4843]: I0318 12:33:04.858860 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-74ccddfbbf-d44d7" podUID="67c6737f-51b5-4262-9bd4-fe8afcb262ad" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.161:9696/\": dial tcp 10.217.0.161:9696: connect: connection refused"
Mar 18 12:33:07 crc kubenswrapper[4843]: I0318 12:33:07.938041 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" podUID="d70be5ea-b453-4cb8-8228-d46629c4ac42" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.151:5353: connect: connection refused"
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.238184 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6cb8d6c886-ttksm" podUID="d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.170:9311/healthcheck\": read tcp 10.217.0.2:49808->10.217.0.170:9311: read: connection reset by peer"
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.238289 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6cb8d6c886-ttksm" podUID="d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.170:9311/healthcheck\": read tcp 10.217.0.2:49820->10.217.0.170:9311: read: connection reset by peer"
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.728263 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.782802 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb"
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.830266 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-combined-ca-bundle\") pod \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") "
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.830619 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8qv5\" (UniqueName: \"kubernetes.io/projected/d70be5ea-b453-4cb8-8228-d46629c4ac42-kube-api-access-g8qv5\") pod \"d70be5ea-b453-4cb8-8228-d46629c4ac42\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") "
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.830676 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d7rs\" (UniqueName: \"kubernetes.io/projected/16d32f17-ac20-4f6d-8e00-db5fdafdc210-kube-api-access-9d7rs\") pod \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") "
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.830736 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-config\") pod \"d70be5ea-b453-4cb8-8228-d46629c4ac42\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") "
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.830781 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16d32f17-ac20-4f6d-8e00-db5fdafdc210-etc-machine-id\") pod \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") "
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.830920 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-dns-svc\") pod \"d70be5ea-b453-4cb8-8228-d46629c4ac42\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") "
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.830959 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-scripts\") pod \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") "
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.830982 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-dns-swift-storage-0\") pod \"d70be5ea-b453-4cb8-8228-d46629c4ac42\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") "
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.831053 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-config-data\") pod \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") "
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.831082 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-ovsdbserver-sb\") pod \"d70be5ea-b453-4cb8-8228-d46629c4ac42\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") "
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.831114 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-ovsdbserver-nb\") pod \"d70be5ea-b453-4cb8-8228-d46629c4ac42\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") "
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.831138 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-db-sync-config-data\") pod \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\" (UID: \"16d32f17-ac20-4f6d-8e00-db5fdafdc210\") "
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.837761 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d70be5ea-b453-4cb8-8228-d46629c4ac42-kube-api-access-g8qv5" (OuterVolumeSpecName: "kube-api-access-g8qv5") pod "d70be5ea-b453-4cb8-8228-d46629c4ac42" (UID: "d70be5ea-b453-4cb8-8228-d46629c4ac42"). InnerVolumeSpecName "kube-api-access-g8qv5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.839594 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16d32f17-ac20-4f6d-8e00-db5fdafdc210-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "16d32f17-ac20-4f6d-8e00-db5fdafdc210" (UID: "16d32f17-ac20-4f6d-8e00-db5fdafdc210"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.841001 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "16d32f17-ac20-4f6d-8e00-db5fdafdc210" (UID: "16d32f17-ac20-4f6d-8e00-db5fdafdc210"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.842875 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-scripts" (OuterVolumeSpecName: "scripts") pod "16d32f17-ac20-4f6d-8e00-db5fdafdc210" (UID: "16d32f17-ac20-4f6d-8e00-db5fdafdc210"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.850773 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9tcmj" event={"ID":"16d32f17-ac20-4f6d-8e00-db5fdafdc210","Type":"ContainerDied","Data":"c103a0b7fa3bbccda7c1c164879728370b02eee075a0492c6d29041a097e382d"}
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.850815 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c103a0b7fa3bbccda7c1c164879728370b02eee075a0492c6d29041a097e382d"
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.850870 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9tcmj"
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.856256 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb" event={"ID":"d70be5ea-b453-4cb8-8228-d46629c4ac42","Type":"ContainerDied","Data":"d34be7ed83abfd94c7a5b1a83c5aef8662d1fad7bff657c17b085a7ed6e33ecd"}
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.856313 4843 scope.go:117] "RemoveContainer" containerID="0520631e5f41d20b6d8791ff85c855d49835049d6868f59e3f4c669cc7543ccd"
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.856301 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-hnqhb"
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.857893 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16d32f17-ac20-4f6d-8e00-db5fdafdc210-kube-api-access-9d7rs" (OuterVolumeSpecName: "kube-api-access-9d7rs") pod "16d32f17-ac20-4f6d-8e00-db5fdafdc210" (UID: "16d32f17-ac20-4f6d-8e00-db5fdafdc210"). InnerVolumeSpecName "kube-api-access-9d7rs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.864708 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6cb8d6c886-ttksm"
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.864901 4843 generic.go:334] "Generic (PLEG): container finished" podID="d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" containerID="3e7f2eadd5891cac4e77075d55862553c17c1a859b82a7e8b2101caa1dbd6ea3" exitCode=0
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.864952 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cb8d6c886-ttksm" event={"ID":"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9","Type":"ContainerDied","Data":"3e7f2eadd5891cac4e77075d55862553c17c1a859b82a7e8b2101caa1dbd6ea3"}
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.864974 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cb8d6c886-ttksm" event={"ID":"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9","Type":"ContainerDied","Data":"6d11b728b4a758d7ca11cd72c858ae0390a5b398da1c06b06b31abb9e0ac304a"}
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.867349 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerName="ceilometer-central-agent" containerID="cri-o://7c835082341f7ff229787df7ee9fb56ae8248570b01f14e98bd2a1ebbbfb1329" gracePeriod=30
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.867432 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.867512 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerName="proxy-httpd" containerID="cri-o://0af6944646ea5a3a8abda87fd4574176f10b58646950af77946507c7144622bd" gracePeriod=30
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.867567 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerName="sg-core" containerID="cri-o://6a017f03a1efe8a6c123adf81e9c6c3551c21cab076b97b0f0a289aba1ab8853" gracePeriod=30
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.867645 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerName="ceilometer-notification-agent" containerID="cri-o://b6c155ad15706ebdbe27889daf0cd88a255bb6094c925dde218ff066e1894018" gracePeriod=30
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.871618 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16d32f17-ac20-4f6d-8e00-db5fdafdc210" (UID: "16d32f17-ac20-4f6d-8e00-db5fdafdc210"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.908808 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.1472497600000002 podStartE2EDuration="1m6.908782041s" podCreationTimestamp="2026-03-18 12:32:02 +0000 UTC" firstStartedPulling="2026-03-18 12:32:03.798983592 +0000 UTC m=+1357.514809106" lastFinishedPulling="2026-03-18 12:33:08.560515863 +0000 UTC m=+1422.276341387" observedRunningTime="2026-03-18 12:33:08.904135889 +0000 UTC m=+1422.619961413" watchObservedRunningTime="2026-03-18 12:33:08.908782041 +0000 UTC m=+1422.624607565"
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.912676 4843 scope.go:117] "RemoveContainer" containerID="8e33d0394cbad0aaf1482b3bb8d8021b8f7bdd40b8156e856dc4587d95d1f08b"
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.918498 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d70be5ea-b453-4cb8-8228-d46629c4ac42" (UID: "d70be5ea-b453-4cb8-8228-d46629c4ac42"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.933258 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-config" (OuterVolumeSpecName: "config") pod "d70be5ea-b453-4cb8-8228-d46629c4ac42" (UID: "d70be5ea-b453-4cb8-8228-d46629c4ac42"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.933911 4843 scope.go:117] "RemoveContainer" containerID="3e7f2eadd5891cac4e77075d55862553c17c1a859b82a7e8b2101caa1dbd6ea3"
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.939314 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cppl\" (UniqueName: \"kubernetes.io/projected/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-kube-api-access-4cppl\") pod \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") "
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.939392 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-combined-ca-bundle\") pod \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") "
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.939485 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-config-data\") pod \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") "
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.939642 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-config\") pod \"d70be5ea-b453-4cb8-8228-d46629c4ac42\" (UID: \"d70be5ea-b453-4cb8-8228-d46629c4ac42\") "
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.939757 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-logs\") pod \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") "
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.939872 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-config-data-custom\") pod \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\" (UID: \"d9e8a33c-889e-45e0-9cc2-60e054a3b5a9\") "
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.940076 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d70be5ea-b453-4cb8-8228-d46629c4ac42" (UID: "d70be5ea-b453-4cb8-8228-d46629c4ac42"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:33:08 crc kubenswrapper[4843]: W0318 12:33:08.940692 4843 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d70be5ea-b453-4cb8-8228-d46629c4ac42/volumes/kubernetes.io~configmap/config
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.940778 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-config" (OuterVolumeSpecName: "config") pod "d70be5ea-b453-4cb8-8228-d46629c4ac42" (UID: "d70be5ea-b453-4cb8-8228-d46629c4ac42"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.941206 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-logs" (OuterVolumeSpecName: "logs") pod "d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" (UID: "d9e8a33c-889e-45e0-9cc2-60e054a3b5a9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.943520 4843 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16d32f17-ac20-4f6d-8e00-db5fdafdc210-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.943553 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-logs\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.943568 4843 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.943581 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.943592 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.943603 4843 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.943614 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.943626 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8qv5\" (UniqueName: \"kubernetes.io/projected/d70be5ea-b453-4cb8-8228-d46629c4ac42-kube-api-access-g8qv5\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.943640 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d7rs\" (UniqueName: \"kubernetes.io/projected/16d32f17-ac20-4f6d-8e00-db5fdafdc210-kube-api-access-9d7rs\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.943653 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-config\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.944868 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-kube-api-access-4cppl" (OuterVolumeSpecName: "kube-api-access-4cppl") pod "d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" (UID: "d9e8a33c-889e-45e0-9cc2-60e054a3b5a9"). InnerVolumeSpecName "kube-api-access-4cppl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.945339 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d70be5ea-b453-4cb8-8228-d46629c4ac42" (UID: "d70be5ea-b453-4cb8-8228-d46629c4ac42"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.947116 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" (UID: "d9e8a33c-889e-45e0-9cc2-60e054a3b5a9").
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.956443 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d70be5ea-b453-4cb8-8228-d46629c4ac42" (UID: "d70be5ea-b453-4cb8-8228-d46629c4ac42"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.958719 4843 scope.go:117] "RemoveContainer" containerID="53e8bd7305de49d562a11fb9cce9e3a9f795cd00a2a0f7840de46d4a98ef69b7" Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.961429 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-config-data" (OuterVolumeSpecName: "config-data") pod "16d32f17-ac20-4f6d-8e00-db5fdafdc210" (UID: "16d32f17-ac20-4f6d-8e00-db5fdafdc210"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.974184 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" (UID: "d9e8a33c-889e-45e0-9cc2-60e054a3b5a9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.992881 4843 scope.go:117] "RemoveContainer" containerID="3e7f2eadd5891cac4e77075d55862553c17c1a859b82a7e8b2101caa1dbd6ea3" Mar 18 12:33:08 crc kubenswrapper[4843]: E0318 12:33:08.993481 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e7f2eadd5891cac4e77075d55862553c17c1a859b82a7e8b2101caa1dbd6ea3\": container with ID starting with 3e7f2eadd5891cac4e77075d55862553c17c1a859b82a7e8b2101caa1dbd6ea3 not found: ID does not exist" containerID="3e7f2eadd5891cac4e77075d55862553c17c1a859b82a7e8b2101caa1dbd6ea3" Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.993558 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e7f2eadd5891cac4e77075d55862553c17c1a859b82a7e8b2101caa1dbd6ea3"} err="failed to get container status \"3e7f2eadd5891cac4e77075d55862553c17c1a859b82a7e8b2101caa1dbd6ea3\": rpc error: code = NotFound desc = could not find container \"3e7f2eadd5891cac4e77075d55862553c17c1a859b82a7e8b2101caa1dbd6ea3\": container with ID starting with 3e7f2eadd5891cac4e77075d55862553c17c1a859b82a7e8b2101caa1dbd6ea3 not found: ID does not exist" Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.993596 4843 scope.go:117] "RemoveContainer" containerID="53e8bd7305de49d562a11fb9cce9e3a9f795cd00a2a0f7840de46d4a98ef69b7" Mar 18 12:33:08 crc kubenswrapper[4843]: E0318 12:33:08.994024 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53e8bd7305de49d562a11fb9cce9e3a9f795cd00a2a0f7840de46d4a98ef69b7\": container with ID starting with 53e8bd7305de49d562a11fb9cce9e3a9f795cd00a2a0f7840de46d4a98ef69b7 not found: ID does not exist" containerID="53e8bd7305de49d562a11fb9cce9e3a9f795cd00a2a0f7840de46d4a98ef69b7" Mar 18 12:33:08 crc kubenswrapper[4843]: I0318 12:33:08.994094 
4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53e8bd7305de49d562a11fb9cce9e3a9f795cd00a2a0f7840de46d4a98ef69b7"} err="failed to get container status \"53e8bd7305de49d562a11fb9cce9e3a9f795cd00a2a0f7840de46d4a98ef69b7\": rpc error: code = NotFound desc = could not find container \"53e8bd7305de49d562a11fb9cce9e3a9f795cd00a2a0f7840de46d4a98ef69b7\": container with ID starting with 53e8bd7305de49d562a11fb9cce9e3a9f795cd00a2a0f7840de46d4a98ef69b7 not found: ID does not exist" Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.027589 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-config-data" (OuterVolumeSpecName: "config-data") pod "d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" (UID: "d9e8a33c-889e-45e0-9cc2-60e054a3b5a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.046283 4843 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.046318 4843 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.046329 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cppl\" (UniqueName: \"kubernetes.io/projected/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-kube-api-access-4cppl\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.046339 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.046349 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d32f17-ac20-4f6d-8e00-db5fdafdc210-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.046358 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d70be5ea-b453-4cb8-8228-d46629c4ac42-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.046366 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:09 crc kubenswrapper[4843]: W0318 12:33:09.066642 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebe913d5_4e12_4fdc_9968_5caaa8aa1271.slice/crio-3b52ed712c24d695329b814972282f3441c13466a3674a101f752bb4e393d883 WatchSource:0}: Error finding container 3b52ed712c24d695329b814972282f3441c13466a3674a101f752bb4e393d883: Status 404 returned error can't find the container with id 3b52ed712c24d695329b814972282f3441c13466a3674a101f752bb4e393d883 Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.068043 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7555f69bf7-z6jmw"] Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.192219 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-hnqhb"] Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.202175 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-hnqhb"] Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.891340 4843 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6cb8d6c886-ttksm" Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.892732 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7555f69bf7-z6jmw" event={"ID":"ebe913d5-4e12-4fdc-9968-5caaa8aa1271","Type":"ContainerStarted","Data":"faed8200aba4ae93c2688c9732f018db14a945671ec6f01c5e9d39868422d5d6"} Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.892771 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7555f69bf7-z6jmw" event={"ID":"ebe913d5-4e12-4fdc-9968-5caaa8aa1271","Type":"ContainerStarted","Data":"c0686f5eeffb492baf1015d25a15296c64649133fb292c068a84d4fa37515022"} Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.892781 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7555f69bf7-z6jmw" event={"ID":"ebe913d5-4e12-4fdc-9968-5caaa8aa1271","Type":"ContainerStarted","Data":"3b52ed712c24d695329b814972282f3441c13466a3674a101f752bb4e393d883"} Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.893478 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7555f69bf7-z6jmw" Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.902558 4843 generic.go:334] "Generic (PLEG): container finished" podID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerID="6a017f03a1efe8a6c123adf81e9c6c3551c21cab076b97b0f0a289aba1ab8853" exitCode=2 Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.902595 4843 generic.go:334] "Generic (PLEG): container finished" podID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerID="7c835082341f7ff229787df7ee9fb56ae8248570b01f14e98bd2a1ebbbfb1329" exitCode=0 Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.902698 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e8648504-98a7-406e-9838-c8dc2d64ffe9","Type":"ContainerStarted","Data":"0af6944646ea5a3a8abda87fd4574176f10b58646950af77946507c7144622bd"} Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.902729 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8648504-98a7-406e-9838-c8dc2d64ffe9","Type":"ContainerDied","Data":"6a017f03a1efe8a6c123adf81e9c6c3551c21cab076b97b0f0a289aba1ab8853"} Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.902743 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8648504-98a7-406e-9838-c8dc2d64ffe9","Type":"ContainerDied","Data":"7c835082341f7ff229787df7ee9fb56ae8248570b01f14e98bd2a1ebbbfb1329"} Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.936749 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7555f69bf7-z6jmw" podStartSLOduration=7.936718215 podStartE2EDuration="7.936718215s" podCreationTimestamp="2026-03-18 12:33:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:09.921640636 +0000 UTC m=+1423.637466170" watchObservedRunningTime="2026-03-18 12:33:09.936718215 +0000 UTC m=+1423.652543739" Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.954725 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6cb8d6c886-ttksm"] Mar 18 12:33:09 crc kubenswrapper[4843]: I0318 12:33:09.968499 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6cb8d6c886-ttksm"] Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.069575 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:33:10 crc kubenswrapper[4843]: E0318 12:33:10.070011 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d70be5ea-b453-4cb8-8228-d46629c4ac42" containerName="dnsmasq-dns" Mar 18 
12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.070035 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d70be5ea-b453-4cb8-8228-d46629c4ac42" containerName="dnsmasq-dns" Mar 18 12:33:10 crc kubenswrapper[4843]: E0318 12:33:10.070054 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" containerName="barbican-api" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.070060 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" containerName="barbican-api" Mar 18 12:33:10 crc kubenswrapper[4843]: E0318 12:33:10.070100 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d70be5ea-b453-4cb8-8228-d46629c4ac42" containerName="init" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.070109 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d70be5ea-b453-4cb8-8228-d46629c4ac42" containerName="init" Mar 18 12:33:10 crc kubenswrapper[4843]: E0318 12:33:10.070122 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d32f17-ac20-4f6d-8e00-db5fdafdc210" containerName="cinder-db-sync" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.070130 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d32f17-ac20-4f6d-8e00-db5fdafdc210" containerName="cinder-db-sync" Mar 18 12:33:10 crc kubenswrapper[4843]: E0318 12:33:10.070141 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" containerName="barbican-api-log" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.070147 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" containerName="barbican-api-log" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.070337 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" containerName="barbican-api" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 
12:33:10.070358 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="16d32f17-ac20-4f6d-8e00-db5fdafdc210" containerName="cinder-db-sync" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.070369 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d70be5ea-b453-4cb8-8228-d46629c4ac42" containerName="dnsmasq-dns" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.070393 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" containerName="barbican-api-log" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.071418 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.078289 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.078445 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-gkdfj" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.078574 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.078730 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.096372 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.157183 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-nf9gt"] Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.162683 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.232006 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-nf9gt"] Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.268009 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.268065 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwbwm\" (UniqueName: \"kubernetes.io/projected/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-kube-api-access-cwbwm\") pod \"dnsmasq-dns-6bb4fc677f-nf9gt\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.268096 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45d711fe-44a0-4167-8155-e31794ab10e5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.268257 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-nf9gt\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.268318 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.268382 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-config-data\") pod \"cinder-scheduler-0\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.268471 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-config\") pod \"dnsmasq-dns-6bb4fc677f-nf9gt\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.268526 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-scripts\") pod \"cinder-scheduler-0\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.268641 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-nf9gt\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.268692 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-nf9gt\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.268761 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-nf9gt\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.268784 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sq52\" (UniqueName: \"kubernetes.io/projected/45d711fe-44a0-4167-8155-e31794ab10e5-kube-api-access-8sq52\") pod \"cinder-scheduler-0\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.306617 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.308401 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.311006 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.337911 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.370165 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.370218 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwbwm\" (UniqueName: \"kubernetes.io/projected/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-kube-api-access-cwbwm\") pod \"dnsmasq-dns-6bb4fc677f-nf9gt\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.370241 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45d711fe-44a0-4167-8155-e31794ab10e5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.370329 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-nf9gt\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.370348 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.370375 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-config-data\") pod \"cinder-scheduler-0\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.370403 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-config\") pod \"dnsmasq-dns-6bb4fc677f-nf9gt\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.370423 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-scripts\") pod \"cinder-scheduler-0\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.370454 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-nf9gt\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.370503 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-nf9gt\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.370528 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-nf9gt\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.370569 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sq52\" (UniqueName: \"kubernetes.io/projected/45d711fe-44a0-4167-8155-e31794ab10e5-kube-api-access-8sq52\") pod \"cinder-scheduler-0\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.372441 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-nf9gt\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.372499 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45d711fe-44a0-4167-8155-e31794ab10e5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.373531 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-dns-swift-storage-0\") pod 
\"dnsmasq-dns-6bb4fc677f-nf9gt\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.373600 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-config\") pod \"dnsmasq-dns-6bb4fc677f-nf9gt\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.374250 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-nf9gt\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.374562 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.374582 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-nf9gt\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.375730 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-scripts\") pod \"cinder-scheduler-0\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 
12:33:10.381532 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-config-data\") pod \"cinder-scheduler-0\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.387144 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.398503 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwbwm\" (UniqueName: \"kubernetes.io/projected/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-kube-api-access-cwbwm\") pod \"dnsmasq-dns-6bb4fc677f-nf9gt\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.422240 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sq52\" (UniqueName: \"kubernetes.io/projected/45d711fe-44a0-4167-8155-e31794ab10e5-kube-api-access-8sq52\") pod \"cinder-scheduler-0\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.472213 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.472575 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-logs\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.472655 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-scripts\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.472791 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-config-data\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.472876 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.472970 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hp2l\" (UniqueName: \"kubernetes.io/projected/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-kube-api-access-8hp2l\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.473050 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.482709 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.574295 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-logs\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.574367 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-scripts\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.574449 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-config-data\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.574478 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.574526 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hp2l\" (UniqueName: \"kubernetes.io/projected/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-kube-api-access-8hp2l\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " 
pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.574551 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-config-data-custom\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.574594 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.574763 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.577045 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-logs\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.583554 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-config-data\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.585131 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-config-data-custom\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.585278 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-scripts\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.586318 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.600201 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hp2l\" (UniqueName: \"kubernetes.io/projected/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-kube-api-access-8hp2l\") pod \"cinder-api-0\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.627215 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.699453 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.823259 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-nf9gt"] Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.921282 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" event={"ID":"38dcade5-06ff-4d7c-aa4a-d334adf77bd8","Type":"ContainerStarted","Data":"8137e04b19a1301dd6f6c76206915a03538024e6062dd9868e4078744edbade4"} Mar 18 12:33:10 crc kubenswrapper[4843]: I0318 12:33:10.997152 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d70be5ea-b453-4cb8-8228-d46629c4ac42" path="/var/lib/kubelet/pods/d70be5ea-b453-4cb8-8228-d46629c4ac42/volumes" Mar 18 12:33:11 crc kubenswrapper[4843]: I0318 12:33:10.997922 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9e8a33c-889e-45e0-9cc2-60e054a3b5a9" path="/var/lib/kubelet/pods/d9e8a33c-889e-45e0-9cc2-60e054a3b5a9/volumes" Mar 18 12:33:11 crc kubenswrapper[4843]: I0318 12:33:11.074417 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:33:11 crc kubenswrapper[4843]: I0318 12:33:11.153268 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:33:11 crc kubenswrapper[4843]: I0318 12:33:11.319605 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:33:11 crc kubenswrapper[4843]: I0318 12:33:11.388978 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:33:11 crc kubenswrapper[4843]: I0318 12:33:11.930393 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47","Type":"ContainerStarted","Data":"de9026a2f81993fc58f21619c556191021ea691143f2172b0a3f97915b6749ea"} 
Mar 18 12:33:11 crc kubenswrapper[4843]: I0318 12:33:11.930763 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47","Type":"ContainerStarted","Data":"80b48ff80e40e256f7af6e28faf0714cac79a61a01bdc4a7eeb0117008d87066"} Mar 18 12:33:11 crc kubenswrapper[4843]: I0318 12:33:11.932023 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"45d711fe-44a0-4167-8155-e31794ab10e5","Type":"ContainerStarted","Data":"625a3945d7baf48522ba2fc18c27ab240baa1eacd1330a614301e10381f6270f"} Mar 18 12:33:11 crc kubenswrapper[4843]: I0318 12:33:11.934688 4843 generic.go:334] "Generic (PLEG): container finished" podID="38dcade5-06ff-4d7c-aa4a-d334adf77bd8" containerID="24299ea65c850ad9d34b9a26e667c5861301df289650f33e929ab27e20f4b900" exitCode=0 Mar 18 12:33:11 crc kubenswrapper[4843]: I0318 12:33:11.934797 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" event={"ID":"38dcade5-06ff-4d7c-aa4a-d334adf77bd8","Type":"ContainerDied","Data":"24299ea65c850ad9d34b9a26e667c5861301df289650f33e929ab27e20f4b900"} Mar 18 12:33:12 crc kubenswrapper[4843]: I0318 12:33:12.184978 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:33:12 crc kubenswrapper[4843]: I0318 12:33:12.951326 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47","Type":"ContainerStarted","Data":"ef96f43f290fc8f62c3bccdd8a93c063d1e1fc1f0f99c8dc3f99d5e36f7baa48"} Mar 18 12:33:12 crc kubenswrapper[4843]: I0318 12:33:12.951699 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 12:33:12 crc kubenswrapper[4843]: I0318 12:33:12.951609 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d3cf4fa0-05e0-4fc8-99f1-631e8b772d47" 
containerName="cinder-api" containerID="cri-o://ef96f43f290fc8f62c3bccdd8a93c063d1e1fc1f0f99c8dc3f99d5e36f7baa48" gracePeriod=30 Mar 18 12:33:12 crc kubenswrapper[4843]: I0318 12:33:12.951377 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d3cf4fa0-05e0-4fc8-99f1-631e8b772d47" containerName="cinder-api-log" containerID="cri-o://de9026a2f81993fc58f21619c556191021ea691143f2172b0a3f97915b6749ea" gracePeriod=30 Mar 18 12:33:12 crc kubenswrapper[4843]: I0318 12:33:12.958678 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" event={"ID":"38dcade5-06ff-4d7c-aa4a-d334adf77bd8","Type":"ContainerStarted","Data":"18db89212e8107d0e2020f9c0ad7557bee08088b25e3ff967885ecb9400e022a"} Mar 18 12:33:12 crc kubenswrapper[4843]: I0318 12:33:12.958893 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:13 crc kubenswrapper[4843]: I0318 12:33:13.007529 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.007506497 podStartE2EDuration="3.007506497s" podCreationTimestamp="2026-03-18 12:33:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:12.997611936 +0000 UTC m=+1426.713437470" watchObservedRunningTime="2026-03-18 12:33:13.007506497 +0000 UTC m=+1426.723332021" Mar 18 12:33:13 crc kubenswrapper[4843]: I0318 12:33:13.038527 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" podStartSLOduration=3.038500068 podStartE2EDuration="3.038500068s" podCreationTimestamp="2026-03-18 12:33:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:13.032109087 +0000 UTC 
m=+1426.747934611" watchObservedRunningTime="2026-03-18 12:33:13.038500068 +0000 UTC m=+1426.754325592" Mar 18 12:33:13 crc kubenswrapper[4843]: I0318 12:33:13.941056 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 12:33:13 crc kubenswrapper[4843]: I0318 12:33:13.980522 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-combined-ca-bundle\") pod \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " Mar 18 12:33:13 crc kubenswrapper[4843]: I0318 12:33:13.980569 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-scripts\") pod \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " Mar 18 12:33:13 crc kubenswrapper[4843]: I0318 12:33:13.980588 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hp2l\" (UniqueName: \"kubernetes.io/projected/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-kube-api-access-8hp2l\") pod \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " Mar 18 12:33:13 crc kubenswrapper[4843]: I0318 12:33:13.980609 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-etc-machine-id\") pod \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " Mar 18 12:33:13 crc kubenswrapper[4843]: I0318 12:33:13.980678 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-config-data\") pod \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\" (UID: 
\"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " Mar 18 12:33:13 crc kubenswrapper[4843]: I0318 12:33:13.980722 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-config-data-custom\") pod \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " Mar 18 12:33:13 crc kubenswrapper[4843]: I0318 12:33:13.980750 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-logs\") pod \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\" (UID: \"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47\") " Mar 18 12:33:13 crc kubenswrapper[4843]: I0318 12:33:13.981319 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d3cf4fa0-05e0-4fc8-99f1-631e8b772d47" (UID: "d3cf4fa0-05e0-4fc8-99f1-631e8b772d47"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:33:13 crc kubenswrapper[4843]: I0318 12:33:13.981872 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-logs" (OuterVolumeSpecName: "logs") pod "d3cf4fa0-05e0-4fc8-99f1-631e8b772d47" (UID: "d3cf4fa0-05e0-4fc8-99f1-631e8b772d47"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.003834 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-scripts" (OuterVolumeSpecName: "scripts") pod "d3cf4fa0-05e0-4fc8-99f1-631e8b772d47" (UID: "d3cf4fa0-05e0-4fc8-99f1-631e8b772d47"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.003869 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-kube-api-access-8hp2l" (OuterVolumeSpecName: "kube-api-access-8hp2l") pod "d3cf4fa0-05e0-4fc8-99f1-631e8b772d47" (UID: "d3cf4fa0-05e0-4fc8-99f1-631e8b772d47"). InnerVolumeSpecName "kube-api-access-8hp2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.003934 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d3cf4fa0-05e0-4fc8-99f1-631e8b772d47" (UID: "d3cf4fa0-05e0-4fc8-99f1-631e8b772d47"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.020861 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3cf4fa0-05e0-4fc8-99f1-631e8b772d47" (UID: "d3cf4fa0-05e0-4fc8-99f1-631e8b772d47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.025251 4843 generic.go:334] "Generic (PLEG): container finished" podID="d3cf4fa0-05e0-4fc8-99f1-631e8b772d47" containerID="ef96f43f290fc8f62c3bccdd8a93c063d1e1fc1f0f99c8dc3f99d5e36f7baa48" exitCode=0 Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.025352 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.025278 4843 generic.go:334] "Generic (PLEG): container finished" podID="d3cf4fa0-05e0-4fc8-99f1-631e8b772d47" containerID="de9026a2f81993fc58f21619c556191021ea691143f2172b0a3f97915b6749ea" exitCode=143 Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.026196 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47","Type":"ContainerDied","Data":"ef96f43f290fc8f62c3bccdd8a93c063d1e1fc1f0f99c8dc3f99d5e36f7baa48"} Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.026226 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47","Type":"ContainerDied","Data":"de9026a2f81993fc58f21619c556191021ea691143f2172b0a3f97915b6749ea"} Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.026235 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d3cf4fa0-05e0-4fc8-99f1-631e8b772d47","Type":"ContainerDied","Data":"80b48ff80e40e256f7af6e28faf0714cac79a61a01bdc4a7eeb0117008d87066"} Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.026250 4843 scope.go:117] "RemoveContainer" containerID="ef96f43f290fc8f62c3bccdd8a93c063d1e1fc1f0f99c8dc3f99d5e36f7baa48" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.040408 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"45d711fe-44a0-4167-8155-e31794ab10e5","Type":"ContainerStarted","Data":"6edfbde974f14038b9c73570c2850cb9073586496f8ea48614074132ddb0afd7"} Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.040488 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"45d711fe-44a0-4167-8155-e31794ab10e5","Type":"ContainerStarted","Data":"aebb2469b89b8930f6cfa9e90cb71b157196270c2b73c90bec57a7aeb42fdaf8"} Mar 18 
12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.060794 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-config-data" (OuterVolumeSpecName: "config-data") pod "d3cf4fa0-05e0-4fc8-99f1-631e8b772d47" (UID: "d3cf4fa0-05e0-4fc8-99f1-631e8b772d47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.068874 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.061039049 podStartE2EDuration="4.068851021s" podCreationTimestamp="2026-03-18 12:33:10 +0000 UTC" firstStartedPulling="2026-03-18 12:33:11.34044672 +0000 UTC m=+1425.056272234" lastFinishedPulling="2026-03-18 12:33:12.348258682 +0000 UTC m=+1426.064084206" observedRunningTime="2026-03-18 12:33:14.054106002 +0000 UTC m=+1427.769931526" watchObservedRunningTime="2026-03-18 12:33:14.068851021 +0000 UTC m=+1427.784676545" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.083124 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.083153 4843 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.083164 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.083172 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.083181 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.083196 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hp2l\" (UniqueName: \"kubernetes.io/projected/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-kube-api-access-8hp2l\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.083205 4843 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.142092 4843 scope.go:117] "RemoveContainer" containerID="de9026a2f81993fc58f21619c556191021ea691143f2172b0a3f97915b6749ea" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.166470 4843 scope.go:117] "RemoveContainer" containerID="ef96f43f290fc8f62c3bccdd8a93c063d1e1fc1f0f99c8dc3f99d5e36f7baa48" Mar 18 12:33:14 crc kubenswrapper[4843]: E0318 12:33:14.168146 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef96f43f290fc8f62c3bccdd8a93c063d1e1fc1f0f99c8dc3f99d5e36f7baa48\": container with ID starting with ef96f43f290fc8f62c3bccdd8a93c063d1e1fc1f0f99c8dc3f99d5e36f7baa48 not found: ID does not exist" containerID="ef96f43f290fc8f62c3bccdd8a93c063d1e1fc1f0f99c8dc3f99d5e36f7baa48" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.168189 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef96f43f290fc8f62c3bccdd8a93c063d1e1fc1f0f99c8dc3f99d5e36f7baa48"} err="failed to get container 
status \"ef96f43f290fc8f62c3bccdd8a93c063d1e1fc1f0f99c8dc3f99d5e36f7baa48\": rpc error: code = NotFound desc = could not find container \"ef96f43f290fc8f62c3bccdd8a93c063d1e1fc1f0f99c8dc3f99d5e36f7baa48\": container with ID starting with ef96f43f290fc8f62c3bccdd8a93c063d1e1fc1f0f99c8dc3f99d5e36f7baa48 not found: ID does not exist" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.168221 4843 scope.go:117] "RemoveContainer" containerID="de9026a2f81993fc58f21619c556191021ea691143f2172b0a3f97915b6749ea" Mar 18 12:33:14 crc kubenswrapper[4843]: E0318 12:33:14.168867 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de9026a2f81993fc58f21619c556191021ea691143f2172b0a3f97915b6749ea\": container with ID starting with de9026a2f81993fc58f21619c556191021ea691143f2172b0a3f97915b6749ea not found: ID does not exist" containerID="de9026a2f81993fc58f21619c556191021ea691143f2172b0a3f97915b6749ea" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.168923 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de9026a2f81993fc58f21619c556191021ea691143f2172b0a3f97915b6749ea"} err="failed to get container status \"de9026a2f81993fc58f21619c556191021ea691143f2172b0a3f97915b6749ea\": rpc error: code = NotFound desc = could not find container \"de9026a2f81993fc58f21619c556191021ea691143f2172b0a3f97915b6749ea\": container with ID starting with de9026a2f81993fc58f21619c556191021ea691143f2172b0a3f97915b6749ea not found: ID does not exist" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.168938 4843 scope.go:117] "RemoveContainer" containerID="ef96f43f290fc8f62c3bccdd8a93c063d1e1fc1f0f99c8dc3f99d5e36f7baa48" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.170895 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef96f43f290fc8f62c3bccdd8a93c063d1e1fc1f0f99c8dc3f99d5e36f7baa48"} err="failed to get 
container status \"ef96f43f290fc8f62c3bccdd8a93c063d1e1fc1f0f99c8dc3f99d5e36f7baa48\": rpc error: code = NotFound desc = could not find container \"ef96f43f290fc8f62c3bccdd8a93c063d1e1fc1f0f99c8dc3f99d5e36f7baa48\": container with ID starting with ef96f43f290fc8f62c3bccdd8a93c063d1e1fc1f0f99c8dc3f99d5e36f7baa48 not found: ID does not exist" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.170928 4843 scope.go:117] "RemoveContainer" containerID="de9026a2f81993fc58f21619c556191021ea691143f2172b0a3f97915b6749ea" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.172062 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de9026a2f81993fc58f21619c556191021ea691143f2172b0a3f97915b6749ea"} err="failed to get container status \"de9026a2f81993fc58f21619c556191021ea691143f2172b0a3f97915b6749ea\": rpc error: code = NotFound desc = could not find container \"de9026a2f81993fc58f21619c556191021ea691143f2172b0a3f97915b6749ea\": container with ID starting with de9026a2f81993fc58f21619c556191021ea691143f2172b0a3f97915b6749ea not found: ID does not exist" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.229261 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.357626 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.375329 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.382825 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:33:14 crc kubenswrapper[4843]: E0318 12:33:14.383176 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3cf4fa0-05e0-4fc8-99f1-631e8b772d47" containerName="cinder-api-log" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 
12:33:14.383192 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3cf4fa0-05e0-4fc8-99f1-631e8b772d47" containerName="cinder-api-log" Mar 18 12:33:14 crc kubenswrapper[4843]: E0318 12:33:14.383213 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3cf4fa0-05e0-4fc8-99f1-631e8b772d47" containerName="cinder-api" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.383219 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3cf4fa0-05e0-4fc8-99f1-631e8b772d47" containerName="cinder-api" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.386550 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3cf4fa0-05e0-4fc8-99f1-631e8b772d47" containerName="cinder-api" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.386587 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3cf4fa0-05e0-4fc8-99f1-631e8b772d47" containerName="cinder-api-log" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.387602 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.389551 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.389579 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.389699 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.397249 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-config-data\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.397280 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.397310 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tcmk\" (UniqueName: \"kubernetes.io/projected/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-kube-api-access-6tcmk\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.397330 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-config-data-custom\") pod \"cinder-api-0\" 
(UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.397368 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.397385 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-logs\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.397477 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-scripts\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.397516 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.397534 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.410965 4843 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.462114 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.498610 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.498656 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.498704 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-config-data\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.498721 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.498747 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tcmk\" (UniqueName: \"kubernetes.io/projected/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-kube-api-access-6tcmk\") pod \"cinder-api-0\" (UID: 
\"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.498765 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-config-data-custom\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.498803 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.498818 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-logs\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.498947 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-scripts\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.501232 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-logs\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.501292 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.506679 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-config-data\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.507249 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-scripts\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.508852 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.509143 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.510184 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.514268 4843 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-config-data-custom\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.524203 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tcmk\" (UniqueName: \"kubernetes.io/projected/91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a-kube-api-access-6tcmk\") pod \"cinder-api-0\" (UID: \"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a\") " pod="openstack/cinder-api-0" Mar 18 12:33:14 crc kubenswrapper[4843]: I0318 12:33:14.713980 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 12:33:15 crc kubenswrapper[4843]: I0318 12:33:15.009926 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3cf4fa0-05e0-4fc8-99f1-631e8b772d47" path="/var/lib/kubelet/pods/d3cf4fa0-05e0-4fc8-99f1-631e8b772d47/volumes" Mar 18 12:33:15 crc kubenswrapper[4843]: I0318 12:33:15.043289 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-f4bfb58c4-dvx2n" Mar 18 12:33:15 crc kubenswrapper[4843]: I0318 12:33:15.070678 4843 generic.go:334] "Generic (PLEG): container finished" podID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerID="b6c155ad15706ebdbe27889daf0cd88a255bb6094c925dde218ff066e1894018" exitCode=0 Mar 18 12:33:15 crc kubenswrapper[4843]: I0318 12:33:15.071939 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8648504-98a7-406e-9838-c8dc2d64ffe9","Type":"ContainerDied","Data":"b6c155ad15706ebdbe27889daf0cd88a255bb6094c925dde218ff066e1894018"} Mar 18 12:33:15 crc kubenswrapper[4843]: I0318 12:33:15.302391 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:33:15 crc kubenswrapper[4843]: I0318 
12:33:15.706844 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 12:33:15 crc kubenswrapper[4843]: I0318 12:33:15.714118 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-64d556c464-829sb" Mar 18 12:33:15 crc kubenswrapper[4843]: I0318 12:33:15.735333 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-64d556c464-829sb" Mar 18 12:33:15 crc kubenswrapper[4843]: I0318 12:33:15.815716 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-859b44d9b8-z27mw"] Mar 18 12:33:15 crc kubenswrapper[4843]: I0318 12:33:15.816019 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-859b44d9b8-z27mw" podUID="ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba" containerName="placement-log" containerID="cri-o://c540da30f2fc07d5ac506a921f7f994571430bd06a0850a32c546d419ecc2c3b" gracePeriod=30 Mar 18 12:33:15 crc kubenswrapper[4843]: I0318 12:33:15.816044 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-859b44d9b8-z27mw" podUID="ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba" containerName="placement-api" containerID="cri-o://fc88b2e91a473280f47da439283b67e0f0df2a316bb157d1f4957d66aee924e3" gracePeriod=30 Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.109960 4843 generic.go:334] "Generic (PLEG): container finished" podID="67c6737f-51b5-4262-9bd4-fe8afcb262ad" containerID="ebfc5b6a6da817c7161344f5173ae8dffd127485c51c681c5f7f5beac79e8c00" exitCode=0 Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.110036 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74ccddfbbf-d44d7" event={"ID":"67c6737f-51b5-4262-9bd4-fe8afcb262ad","Type":"ContainerDied","Data":"ebfc5b6a6da817c7161344f5173ae8dffd127485c51c681c5f7f5beac79e8c00"} Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.110354 4843 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74ccddfbbf-d44d7" event={"ID":"67c6737f-51b5-4262-9bd4-fe8afcb262ad","Type":"ContainerDied","Data":"0f31fd1601db2327ed33dc45f5b1c44536a42a0fc034769057cad8cef207664d"} Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.110377 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f31fd1601db2327ed33dc45f5b1c44536a42a0fc034769057cad8cef207664d" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.112187 4843 generic.go:334] "Generic (PLEG): container finished" podID="ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba" containerID="c540da30f2fc07d5ac506a921f7f994571430bd06a0850a32c546d419ecc2c3b" exitCode=143 Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.112252 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-859b44d9b8-z27mw" event={"ID":"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba","Type":"ContainerDied","Data":"c540da30f2fc07d5ac506a921f7f994571430bd06a0850a32c546d419ecc2c3b"} Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.119213 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a","Type":"ContainerStarted","Data":"9f1cd0b1d3e0e59a877a66d41e49972bba5626b70969a855d69192afaa4dfbfc"} Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.150669 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.182816 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 18 12:33:16 crc kubenswrapper[4843]: E0318 12:33:16.183320 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67c6737f-51b5-4262-9bd4-fe8afcb262ad" containerName="neutron-httpd" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.183338 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="67c6737f-51b5-4262-9bd4-fe8afcb262ad" containerName="neutron-httpd" Mar 18 12:33:16 crc kubenswrapper[4843]: E0318 12:33:16.183369 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67c6737f-51b5-4262-9bd4-fe8afcb262ad" containerName="neutron-api" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.183375 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="67c6737f-51b5-4262-9bd4-fe8afcb262ad" containerName="neutron-api" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.183606 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="67c6737f-51b5-4262-9bd4-fe8afcb262ad" containerName="neutron-httpd" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.183624 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="67c6737f-51b5-4262-9bd4-fe8afcb262ad" containerName="neutron-api" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.184291 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.242368 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-config\") pod \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.242424 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-combined-ca-bundle\") pod \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.242491 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-internal-tls-certs\") pod \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.242595 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-httpd-config\") pod \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.248831 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gfds\" (UniqueName: \"kubernetes.io/projected/67c6737f-51b5-4262-9bd4-fe8afcb262ad-kube-api-access-2gfds\") pod \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.248903 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-ovndb-tls-certs\") pod \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.248956 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-public-tls-certs\") pod \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\" (UID: \"67c6737f-51b5-4262-9bd4-fe8afcb262ad\") " Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.249351 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b95404-5989-4c88-969b-4e9afaaebe8e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b7b95404-5989-4c88-969b-4e9afaaebe8e\") " pod="openstack/openstackclient" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.249417 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b7b95404-5989-4c88-969b-4e9afaaebe8e-openstack-config\") pod \"openstackclient\" (UID: \"b7b95404-5989-4c88-969b-4e9afaaebe8e\") " pod="openstack/openstackclient" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.249455 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b7b95404-5989-4c88-969b-4e9afaaebe8e-openstack-config-secret\") pod \"openstackclient\" (UID: \"b7b95404-5989-4c88-969b-4e9afaaebe8e\") " pod="openstack/openstackclient" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.249486 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkbp4\" (UniqueName: 
\"kubernetes.io/projected/b7b95404-5989-4c88-969b-4e9afaaebe8e-kube-api-access-xkbp4\") pod \"openstackclient\" (UID: \"b7b95404-5989-4c88-969b-4e9afaaebe8e\") " pod="openstack/openstackclient" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.333206 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.334186 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.334381 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-lc2rq" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.344449 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.347026 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67c6737f-51b5-4262-9bd4-fe8afcb262ad-kube-api-access-2gfds" (OuterVolumeSpecName: "kube-api-access-2gfds") pod "67c6737f-51b5-4262-9bd4-fe8afcb262ad" (UID: "67c6737f-51b5-4262-9bd4-fe8afcb262ad"). InnerVolumeSpecName "kube-api-access-2gfds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.347065 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "67c6737f-51b5-4262-9bd4-fe8afcb262ad" (UID: "67c6737f-51b5-4262-9bd4-fe8afcb262ad"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.351279 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b95404-5989-4c88-969b-4e9afaaebe8e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b7b95404-5989-4c88-969b-4e9afaaebe8e\") " pod="openstack/openstackclient" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.351334 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b7b95404-5989-4c88-969b-4e9afaaebe8e-openstack-config\") pod \"openstackclient\" (UID: \"b7b95404-5989-4c88-969b-4e9afaaebe8e\") " pod="openstack/openstackclient" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.351362 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b7b95404-5989-4c88-969b-4e9afaaebe8e-openstack-config-secret\") pod \"openstackclient\" (UID: \"b7b95404-5989-4c88-969b-4e9afaaebe8e\") " pod="openstack/openstackclient" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.351378 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkbp4\" (UniqueName: \"kubernetes.io/projected/b7b95404-5989-4c88-969b-4e9afaaebe8e-kube-api-access-xkbp4\") pod \"openstackclient\" (UID: \"b7b95404-5989-4c88-969b-4e9afaaebe8e\") " pod="openstack/openstackclient" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.351464 4843 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.351476 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gfds\" (UniqueName: 
\"kubernetes.io/projected/67c6737f-51b5-4262-9bd4-fe8afcb262ad-kube-api-access-2gfds\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.357886 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b7b95404-5989-4c88-969b-4e9afaaebe8e-openstack-config\") pod \"openstackclient\" (UID: \"b7b95404-5989-4c88-969b-4e9afaaebe8e\") " pod="openstack/openstackclient" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.382438 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b95404-5989-4c88-969b-4e9afaaebe8e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b7b95404-5989-4c88-969b-4e9afaaebe8e\") " pod="openstack/openstackclient" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.400693 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b7b95404-5989-4c88-969b-4e9afaaebe8e-openstack-config-secret\") pod \"openstackclient\" (UID: \"b7b95404-5989-4c88-969b-4e9afaaebe8e\") " pod="openstack/openstackclient" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.403514 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkbp4\" (UniqueName: \"kubernetes.io/projected/b7b95404-5989-4c88-969b-4e9afaaebe8e-kube-api-access-xkbp4\") pod \"openstackclient\" (UID: \"b7b95404-5989-4c88-969b-4e9afaaebe8e\") " pod="openstack/openstackclient" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.603848 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "67c6737f-51b5-4262-9bd4-fe8afcb262ad" (UID: "67c6737f-51b5-4262-9bd4-fe8afcb262ad"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.626848 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "67c6737f-51b5-4262-9bd4-fe8afcb262ad" (UID: "67c6737f-51b5-4262-9bd4-fe8afcb262ad"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.634517 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67c6737f-51b5-4262-9bd4-fe8afcb262ad" (UID: "67c6737f-51b5-4262-9bd4-fe8afcb262ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.659901 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.659948 4843 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.659957 4843 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.670295 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.671250 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-config" (OuterVolumeSpecName: "config") pod "67c6737f-51b5-4262-9bd4-fe8afcb262ad" (UID: "67c6737f-51b5-4262-9bd4-fe8afcb262ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.733979 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "67c6737f-51b5-4262-9bd4-fe8afcb262ad" (UID: "67c6737f-51b5-4262-9bd4-fe8afcb262ad"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.767158 4843 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.767195 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/67c6737f-51b5-4262-9bd4-fe8afcb262ad-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:16 crc kubenswrapper[4843]: I0318 12:33:16.963320 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:33:17 crc kubenswrapper[4843]: I0318 12:33:17.148083 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-74ccddfbbf-d44d7" Mar 18 12:33:17 crc kubenswrapper[4843]: I0318 12:33:17.148078 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a","Type":"ContainerStarted","Data":"33baa61689185aa63dc2ff577a63368830bbed19e7db8c0537f3b759751fb320"} Mar 18 12:33:17 crc kubenswrapper[4843]: I0318 12:33:17.195082 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-74ccddfbbf-d44d7"] Mar 18 12:33:17 crc kubenswrapper[4843]: I0318 12:33:17.255168 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-74ccddfbbf-d44d7"] Mar 18 12:33:17 crc kubenswrapper[4843]: I0318 12:33:17.390203 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 12:33:17 crc kubenswrapper[4843]: I0318 12:33:17.913473 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5bf5d4bdcb-8xfkn" Mar 18 12:33:17 crc kubenswrapper[4843]: I0318 12:33:17.987154 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7dc96fcc9b-lt524"] Mar 18 12:33:17 crc kubenswrapper[4843]: I0318 12:33:17.988197 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7dc96fcc9b-lt524" podUID="8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" containerName="horizon-log" containerID="cri-o://3556ca049e4c4ff3eb611a388edd9d3f28839b1285421ed541909b3b6403983e" gracePeriod=30 Mar 18 12:33:17 crc kubenswrapper[4843]: I0318 12:33:17.988357 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7dc96fcc9b-lt524" podUID="8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" containerName="horizon" containerID="cri-o://cc85cd1c0e532d46fd65865f18713d5bac3f4038811619aedb9a024c01f92940" gracePeriod=30 Mar 18 12:33:18 crc kubenswrapper[4843]: I0318 12:33:18.167269 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstackclient" event={"ID":"b7b95404-5989-4c88-969b-4e9afaaebe8e","Type":"ContainerStarted","Data":"b5387e42fe92ebe174893cd8f0ec4b414cc745560976d971aa161508a4bf6d75"} Mar 18 12:33:18 crc kubenswrapper[4843]: I0318 12:33:18.169902 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a","Type":"ContainerStarted","Data":"0b9316f4cf80df202fb78c7d9eeeaed7064582b1ad8c99894d0ef034d740bfa3"} Mar 18 12:33:18 crc kubenswrapper[4843]: I0318 12:33:18.171293 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 12:33:18 crc kubenswrapper[4843]: I0318 12:33:18.996052 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67c6737f-51b5-4262-9bd4-fe8afcb262ad" path="/var/lib/kubelet/pods/67c6737f-51b5-4262-9bd4-fe8afcb262ad/volumes" Mar 18 12:33:19 crc kubenswrapper[4843]: I0318 12:33:19.763579 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:33:19 crc kubenswrapper[4843]: I0318 12:33:19.794046 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.794021714 podStartE2EDuration="5.794021714s" podCreationTimestamp="2026-03-18 12:33:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:18.231821576 +0000 UTC m=+1431.947647100" watchObservedRunningTime="2026-03-18 12:33:19.794021714 +0000 UTC m=+1433.509847228" Mar 18 12:33:19 crc kubenswrapper[4843]: I0318 12:33:19.918525 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-internal-tls-certs\") pod \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " Mar 18 12:33:19 crc kubenswrapper[4843]: I0318 12:33:19.918903 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-scripts\") pod \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " Mar 18 12:33:19 crc kubenswrapper[4843]: I0318 12:33:19.919065 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-config-data\") pod \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " Mar 18 12:33:19 crc kubenswrapper[4843]: I0318 12:33:19.919191 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-combined-ca-bundle\") pod \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\" (UID: 
\"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " Mar 18 12:33:19 crc kubenswrapper[4843]: I0318 12:33:19.919335 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfn5c\" (UniqueName: \"kubernetes.io/projected/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-kube-api-access-gfn5c\") pod \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " Mar 18 12:33:19 crc kubenswrapper[4843]: I0318 12:33:19.919443 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-logs\") pod \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " Mar 18 12:33:19 crc kubenswrapper[4843]: I0318 12:33:19.919511 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-public-tls-certs\") pod \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\" (UID: \"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba\") " Mar 18 12:33:19 crc kubenswrapper[4843]: I0318 12:33:19.920868 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-logs" (OuterVolumeSpecName: "logs") pod "ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba" (UID: "ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:19 crc kubenswrapper[4843]: I0318 12:33:19.925488 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-scripts" (OuterVolumeSpecName: "scripts") pod "ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba" (UID: "ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:19 crc kubenswrapper[4843]: I0318 12:33:19.926537 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-kube-api-access-gfn5c" (OuterVolumeSpecName: "kube-api-access-gfn5c") pod "ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba" (UID: "ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba"). InnerVolumeSpecName "kube-api-access-gfn5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:19 crc kubenswrapper[4843]: I0318 12:33:19.970864 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-config-data" (OuterVolumeSpecName: "config-data") pod "ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba" (UID: "ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:19 crc kubenswrapper[4843]: I0318 12:33:19.986728 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba" (UID: "ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.022485 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.022517 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.022527 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.022548 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfn5c\" (UniqueName: \"kubernetes.io/projected/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-kube-api-access-gfn5c\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.022557 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.034924 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.034989 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.038366 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba" (UID: "ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.047533 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba" (UID: "ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.124749 4843 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.124785 4843 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.193381 4843 generic.go:334] "Generic (PLEG): container finished" podID="ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba" containerID="fc88b2e91a473280f47da439283b67e0f0df2a316bb157d1f4957d66aee924e3" exitCode=0 Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.193888 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-859b44d9b8-z27mw" Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.194320 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-859b44d9b8-z27mw" event={"ID":"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba","Type":"ContainerDied","Data":"fc88b2e91a473280f47da439283b67e0f0df2a316bb157d1f4957d66aee924e3"} Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.194349 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-859b44d9b8-z27mw" event={"ID":"ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba","Type":"ContainerDied","Data":"42c8ebedd47a5d8e0e9546811537815ee466f776e630f1a5ee0b0ae21a74becc"} Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.194365 4843 scope.go:117] "RemoveContainer" containerID="fc88b2e91a473280f47da439283b67e0f0df2a316bb157d1f4957d66aee924e3" Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.284465 4843 scope.go:117] "RemoveContainer" containerID="c540da30f2fc07d5ac506a921f7f994571430bd06a0850a32c546d419ecc2c3b" Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.286972 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-859b44d9b8-z27mw"] Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.294734 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-859b44d9b8-z27mw"] Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.338135 4843 scope.go:117] "RemoveContainer" containerID="fc88b2e91a473280f47da439283b67e0f0df2a316bb157d1f4957d66aee924e3" Mar 18 12:33:20 crc kubenswrapper[4843]: E0318 12:33:20.339549 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc88b2e91a473280f47da439283b67e0f0df2a316bb157d1f4957d66aee924e3\": container with ID starting with fc88b2e91a473280f47da439283b67e0f0df2a316bb157d1f4957d66aee924e3 not found: ID does not exist" 
containerID="fc88b2e91a473280f47da439283b67e0f0df2a316bb157d1f4957d66aee924e3" Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.339593 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc88b2e91a473280f47da439283b67e0f0df2a316bb157d1f4957d66aee924e3"} err="failed to get container status \"fc88b2e91a473280f47da439283b67e0f0df2a316bb157d1f4957d66aee924e3\": rpc error: code = NotFound desc = could not find container \"fc88b2e91a473280f47da439283b67e0f0df2a316bb157d1f4957d66aee924e3\": container with ID starting with fc88b2e91a473280f47da439283b67e0f0df2a316bb157d1f4957d66aee924e3 not found: ID does not exist" Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.339625 4843 scope.go:117] "RemoveContainer" containerID="c540da30f2fc07d5ac506a921f7f994571430bd06a0850a32c546d419ecc2c3b" Mar 18 12:33:20 crc kubenswrapper[4843]: E0318 12:33:20.340139 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c540da30f2fc07d5ac506a921f7f994571430bd06a0850a32c546d419ecc2c3b\": container with ID starting with c540da30f2fc07d5ac506a921f7f994571430bd06a0850a32c546d419ecc2c3b not found: ID does not exist" containerID="c540da30f2fc07d5ac506a921f7f994571430bd06a0850a32c546d419ecc2c3b" Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.340170 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c540da30f2fc07d5ac506a921f7f994571430bd06a0850a32c546d419ecc2c3b"} err="failed to get container status \"c540da30f2fc07d5ac506a921f7f994571430bd06a0850a32c546d419ecc2c3b\": rpc error: code = NotFound desc = could not find container \"c540da30f2fc07d5ac506a921f7f994571430bd06a0850a32c546d419ecc2c3b\": container with ID starting with c540da30f2fc07d5ac506a921f7f994571430bd06a0850a32c546d419ecc2c3b not found: ID does not exist" Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.485898 4843 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.548682 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-zztvz"] Mar 18 12:33:20 crc kubenswrapper[4843]: I0318 12:33:20.549109 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688c87cc99-zztvz" podUID="a9f366e3-403e-4dad-9f88-b734ab67badd" containerName="dnsmasq-dns" containerID="cri-o://705ee57ffa28a7ed735b651f092ba09ff40a3417da546f92d010b4691b2f02b6" gracePeriod=10 Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.095559 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba" path="/var/lib/kubelet/pods/ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba/volumes" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.096355 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.169204 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.191680 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-zztvz" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.221189 4843 generic.go:334] "Generic (PLEG): container finished" podID="a9f366e3-403e-4dad-9f88-b734ab67badd" containerID="705ee57ffa28a7ed735b651f092ba09ff40a3417da546f92d010b4691b2f02b6" exitCode=0 Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.221262 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-zztvz" event={"ID":"a9f366e3-403e-4dad-9f88-b734ab67badd","Type":"ContainerDied","Data":"705ee57ffa28a7ed735b651f092ba09ff40a3417da546f92d010b4691b2f02b6"} Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.221291 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-zztvz" event={"ID":"a9f366e3-403e-4dad-9f88-b734ab67badd","Type":"ContainerDied","Data":"8bae63b9a0354b038434043c5dd5875a19755fe17c76fa11808439bed89dcad1"} Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.221309 4843 scope.go:117] "RemoveContainer" containerID="705ee57ffa28a7ed735b651f092ba09ff40a3417da546f92d010b4691b2f02b6" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.221419 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-zztvz" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.227120 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="45d711fe-44a0-4167-8155-e31794ab10e5" containerName="cinder-scheduler" containerID="cri-o://aebb2469b89b8930f6cfa9e90cb71b157196270c2b73c90bec57a7aeb42fdaf8" gracePeriod=30 Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.229496 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="45d711fe-44a0-4167-8155-e31794ab10e5" containerName="probe" containerID="cri-o://6edfbde974f14038b9c73570c2850cb9073586496f8ea48614074132ddb0afd7" gracePeriod=30 Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.261072 4843 scope.go:117] "RemoveContainer" containerID="ab81560b275f4cda29ebbfac518765cf4a2f21b192499ae9564782848dc440db" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.279176 4843 scope.go:117] "RemoveContainer" containerID="705ee57ffa28a7ed735b651f092ba09ff40a3417da546f92d010b4691b2f02b6" Mar 18 12:33:21 crc kubenswrapper[4843]: E0318 12:33:21.279691 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"705ee57ffa28a7ed735b651f092ba09ff40a3417da546f92d010b4691b2f02b6\": container with ID starting with 705ee57ffa28a7ed735b651f092ba09ff40a3417da546f92d010b4691b2f02b6 not found: ID does not exist" containerID="705ee57ffa28a7ed735b651f092ba09ff40a3417da546f92d010b4691b2f02b6" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.279731 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"705ee57ffa28a7ed735b651f092ba09ff40a3417da546f92d010b4691b2f02b6"} err="failed to get container status \"705ee57ffa28a7ed735b651f092ba09ff40a3417da546f92d010b4691b2f02b6\": rpc error: code = NotFound desc = could not find container 
\"705ee57ffa28a7ed735b651f092ba09ff40a3417da546f92d010b4691b2f02b6\": container with ID starting with 705ee57ffa28a7ed735b651f092ba09ff40a3417da546f92d010b4691b2f02b6 not found: ID does not exist" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.279799 4843 scope.go:117] "RemoveContainer" containerID="ab81560b275f4cda29ebbfac518765cf4a2f21b192499ae9564782848dc440db" Mar 18 12:33:21 crc kubenswrapper[4843]: E0318 12:33:21.283817 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab81560b275f4cda29ebbfac518765cf4a2f21b192499ae9564782848dc440db\": container with ID starting with ab81560b275f4cda29ebbfac518765cf4a2f21b192499ae9564782848dc440db not found: ID does not exist" containerID="ab81560b275f4cda29ebbfac518765cf4a2f21b192499ae9564782848dc440db" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.283857 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab81560b275f4cda29ebbfac518765cf4a2f21b192499ae9564782848dc440db"} err="failed to get container status \"ab81560b275f4cda29ebbfac518765cf4a2f21b192499ae9564782848dc440db\": rpc error: code = NotFound desc = could not find container \"ab81560b275f4cda29ebbfac518765cf4a2f21b192499ae9564782848dc440db\": container with ID starting with ab81560b275f4cda29ebbfac518765cf4a2f21b192499ae9564782848dc440db not found: ID does not exist" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.304762 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-config\") pod \"a9f366e3-403e-4dad-9f88-b734ab67badd\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.305006 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-dns-swift-storage-0\") pod \"a9f366e3-403e-4dad-9f88-b734ab67badd\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.305257 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-ovsdbserver-nb\") pod \"a9f366e3-403e-4dad-9f88-b734ab67badd\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.305333 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-dns-svc\") pod \"a9f366e3-403e-4dad-9f88-b734ab67badd\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.305424 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-ovsdbserver-sb\") pod \"a9f366e3-403e-4dad-9f88-b734ab67badd\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.305600 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75rj2\" (UniqueName: \"kubernetes.io/projected/a9f366e3-403e-4dad-9f88-b734ab67badd-kube-api-access-75rj2\") pod \"a9f366e3-403e-4dad-9f88-b734ab67badd\" (UID: \"a9f366e3-403e-4dad-9f88-b734ab67badd\") " Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.313380 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9f366e3-403e-4dad-9f88-b734ab67badd-kube-api-access-75rj2" (OuterVolumeSpecName: "kube-api-access-75rj2") pod "a9f366e3-403e-4dad-9f88-b734ab67badd" (UID: "a9f366e3-403e-4dad-9f88-b734ab67badd"). 
InnerVolumeSpecName "kube-api-access-75rj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.371491 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a9f366e3-403e-4dad-9f88-b734ab67badd" (UID: "a9f366e3-403e-4dad-9f88-b734ab67badd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.373499 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a9f366e3-403e-4dad-9f88-b734ab67badd" (UID: "a9f366e3-403e-4dad-9f88-b734ab67badd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.394477 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a9f366e3-403e-4dad-9f88-b734ab67badd" (UID: "a9f366e3-403e-4dad-9f88-b734ab67badd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.395139 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a9f366e3-403e-4dad-9f88-b734ab67badd" (UID: "a9f366e3-403e-4dad-9f88-b734ab67badd"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.410687 4843 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.410730 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.410743 4843 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.410755 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.410763 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75rj2\" (UniqueName: \"kubernetes.io/projected/a9f366e3-403e-4dad-9f88-b734ab67badd-kube-api-access-75rj2\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.411210 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7dc96fcc9b-lt524" podUID="8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:52344->10.217.0.154:8443: read: connection reset by peer" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.411502 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-config" (OuterVolumeSpecName: "config") pod "a9f366e3-403e-4dad-9f88-b734ab67badd" (UID: "a9f366e3-403e-4dad-9f88-b734ab67badd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.512666 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9f366e3-403e-4dad-9f88-b734ab67badd-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.560699 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-zztvz"] Mar 18 12:33:21 crc kubenswrapper[4843]: I0318 12:33:21.571325 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-zztvz"] Mar 18 12:33:22 crc kubenswrapper[4843]: I0318 12:33:22.246407 4843 generic.go:334] "Generic (PLEG): container finished" podID="8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" containerID="cc85cd1c0e532d46fd65865f18713d5bac3f4038811619aedb9a024c01f92940" exitCode=0 Mar 18 12:33:22 crc kubenswrapper[4843]: I0318 12:33:22.246492 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dc96fcc9b-lt524" event={"ID":"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434","Type":"ContainerDied","Data":"cc85cd1c0e532d46fd65865f18713d5bac3f4038811619aedb9a024c01f92940"} Mar 18 12:33:22 crc kubenswrapper[4843]: I0318 12:33:22.996300 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9f366e3-403e-4dad-9f88-b734ab67badd" path="/var/lib/kubelet/pods/a9f366e3-403e-4dad-9f88-b734ab67badd/volumes" Mar 18 12:33:24 crc kubenswrapper[4843]: I0318 12:33:24.271254 4843 generic.go:334] "Generic (PLEG): container finished" podID="7321460c-9acb-47b8-b14c-d3f17cd937ec" containerID="de5293daf4d4080c399e13311d5f89c19ae291442390ec8def56b3ccb3ab78c6" exitCode=137 Mar 18 12:33:24 crc kubenswrapper[4843]: I0318 12:33:24.271456 4843 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fb7d6bc54-x5f24" event={"ID":"7321460c-9acb-47b8-b14c-d3f17cd937ec","Type":"ContainerDied","Data":"de5293daf4d4080c399e13311d5f89c19ae291442390ec8def56b3ccb3ab78c6"} Mar 18 12:33:24 crc kubenswrapper[4843]: I0318 12:33:24.274169 4843 generic.go:334] "Generic (PLEG): container finished" podID="45d711fe-44a0-4167-8155-e31794ab10e5" containerID="6edfbde974f14038b9c73570c2850cb9073586496f8ea48614074132ddb0afd7" exitCode=0 Mar 18 12:33:24 crc kubenswrapper[4843]: I0318 12:33:24.274376 4843 generic.go:334] "Generic (PLEG): container finished" podID="45d711fe-44a0-4167-8155-e31794ab10e5" containerID="aebb2469b89b8930f6cfa9e90cb71b157196270c2b73c90bec57a7aeb42fdaf8" exitCode=0 Mar 18 12:33:24 crc kubenswrapper[4843]: I0318 12:33:24.274325 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"45d711fe-44a0-4167-8155-e31794ab10e5","Type":"ContainerDied","Data":"6edfbde974f14038b9c73570c2850cb9073586496f8ea48614074132ddb0afd7"} Mar 18 12:33:24 crc kubenswrapper[4843]: I0318 12:33:24.274417 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"45d711fe-44a0-4167-8155-e31794ab10e5","Type":"ContainerDied","Data":"aebb2469b89b8930f6cfa9e90cb71b157196270c2b73c90bec57a7aeb42fdaf8"} Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.693646 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-76c6d69747-h572n"] Mar 18 12:33:25 crc kubenswrapper[4843]: E0318 12:33:25.694213 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f366e3-403e-4dad-9f88-b734ab67badd" containerName="dnsmasq-dns" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.694225 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f366e3-403e-4dad-9f88-b734ab67badd" containerName="dnsmasq-dns" Mar 18 12:33:25 crc kubenswrapper[4843]: E0318 12:33:25.694255 4843 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba" containerName="placement-log" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.694261 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba" containerName="placement-log" Mar 18 12:33:25 crc kubenswrapper[4843]: E0318 12:33:25.694273 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba" containerName="placement-api" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.694279 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba" containerName="placement-api" Mar 18 12:33:25 crc kubenswrapper[4843]: E0318 12:33:25.694293 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f366e3-403e-4dad-9f88-b734ab67badd" containerName="init" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.694299 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f366e3-403e-4dad-9f88-b734ab67badd" containerName="init" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.694480 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f366e3-403e-4dad-9f88-b734ab67badd" containerName="dnsmasq-dns" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.694500 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba" containerName="placement-log" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.694511 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddcf10fb-ae65-41fb-ac27-7cf9f3adf6ba" containerName="placement-api" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.695419 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.698298 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.698507 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.698702 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.711035 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-76c6d69747-h572n"] Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.808177 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r47g\" (UniqueName: \"kubernetes.io/projected/5001539a-ee9d-44c9-bcab-58e3720323ae-kube-api-access-4r47g\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.808372 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5001539a-ee9d-44c9-bcab-58e3720323ae-etc-swift\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.808402 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5001539a-ee9d-44c9-bcab-58e3720323ae-log-httpd\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: 
I0318 12:33:25.808433 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5001539a-ee9d-44c9-bcab-58e3720323ae-combined-ca-bundle\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.808472 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5001539a-ee9d-44c9-bcab-58e3720323ae-run-httpd\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.808506 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5001539a-ee9d-44c9-bcab-58e3720323ae-public-tls-certs\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.809428 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5001539a-ee9d-44c9-bcab-58e3720323ae-internal-tls-certs\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.809554 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5001539a-ee9d-44c9-bcab-58e3720323ae-config-data\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 
12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.911328 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5001539a-ee9d-44c9-bcab-58e3720323ae-combined-ca-bundle\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.911393 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5001539a-ee9d-44c9-bcab-58e3720323ae-run-httpd\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.911444 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5001539a-ee9d-44c9-bcab-58e3720323ae-public-tls-certs\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.911512 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5001539a-ee9d-44c9-bcab-58e3720323ae-internal-tls-certs\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.911877 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5001539a-ee9d-44c9-bcab-58e3720323ae-config-data\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.911955 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r47g\" (UniqueName: \"kubernetes.io/projected/5001539a-ee9d-44c9-bcab-58e3720323ae-kube-api-access-4r47g\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.912044 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5001539a-ee9d-44c9-bcab-58e3720323ae-run-httpd\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.913025 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5001539a-ee9d-44c9-bcab-58e3720323ae-etc-swift\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.913078 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5001539a-ee9d-44c9-bcab-58e3720323ae-log-httpd\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.913532 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5001539a-ee9d-44c9-bcab-58e3720323ae-log-httpd\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.919148 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/5001539a-ee9d-44c9-bcab-58e3720323ae-internal-tls-certs\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.920564 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5001539a-ee9d-44c9-bcab-58e3720323ae-public-tls-certs\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.921575 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5001539a-ee9d-44c9-bcab-58e3720323ae-etc-swift\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.925218 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5001539a-ee9d-44c9-bcab-58e3720323ae-combined-ca-bundle\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.927290 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5001539a-ee9d-44c9-bcab-58e3720323ae-config-data\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:25 crc kubenswrapper[4843]: I0318 12:33:25.931978 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r47g\" (UniqueName: 
\"kubernetes.io/projected/5001539a-ee9d-44c9-bcab-58e3720323ae-kube-api-access-4r47g\") pod \"swift-proxy-76c6d69747-h572n\" (UID: \"5001539a-ee9d-44c9-bcab-58e3720323ae\") " pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:26 crc kubenswrapper[4843]: I0318 12:33:26.018023 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-76c6d69747-h572n" Mar 18 12:33:26 crc kubenswrapper[4843]: I0318 12:33:26.537892 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fb7d6bc54-x5f24" podUID="7321460c-9acb-47b8-b14c-d3f17cd937ec" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: connect: connection refused" Mar 18 12:33:26 crc kubenswrapper[4843]: I0318 12:33:26.538058 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fb7d6bc54-x5f24" podUID="7321460c-9acb-47b8-b14c-d3f17cd937ec" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: connect: connection refused" Mar 18 12:33:26 crc kubenswrapper[4843]: I0318 12:33:26.655482 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.072966 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.073826 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="faa05482-2d1f-43b3-93a2-94178afbc8a7" containerName="glance-log" containerID="cri-o://e5120395adcb25710ae79cf4e7aca3389748547e1ac64b3de7f3cc8c340582b9" gracePeriod=30 Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.073893 4843 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="faa05482-2d1f-43b3-93a2-94178afbc8a7" containerName="glance-httpd" containerID="cri-o://e952b22fa5cad4c50fae86de9f9e0bb81b48b450d35116902db80c66fe04e6c6" gracePeriod=30 Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.394206 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"45d711fe-44a0-4167-8155-e31794ab10e5","Type":"ContainerDied","Data":"625a3945d7baf48522ba2fc18c27ab240baa1eacd1330a614301e10381f6270f"} Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.394677 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="625a3945d7baf48522ba2fc18c27ab240baa1eacd1330a614301e10381f6270f" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.417395 4843 generic.go:334] "Generic (PLEG): container finished" podID="faa05482-2d1f-43b3-93a2-94178afbc8a7" containerID="e5120395adcb25710ae79cf4e7aca3389748547e1ac64b3de7f3cc8c340582b9" exitCode=143 Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.417472 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"faa05482-2d1f-43b3-93a2-94178afbc8a7","Type":"ContainerDied","Data":"e5120395adcb25710ae79cf4e7aca3389748547e1ac64b3de7f3cc8c340582b9"} Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.465690 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.503691 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sq52\" (UniqueName: \"kubernetes.io/projected/45d711fe-44a0-4167-8155-e31794ab10e5-kube-api-access-8sq52\") pod \"45d711fe-44a0-4167-8155-e31794ab10e5\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.503781 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-scripts\") pod \"45d711fe-44a0-4167-8155-e31794ab10e5\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.503817 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-combined-ca-bundle\") pod \"45d711fe-44a0-4167-8155-e31794ab10e5\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.503866 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-config-data\") pod \"45d711fe-44a0-4167-8155-e31794ab10e5\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.503900 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45d711fe-44a0-4167-8155-e31794ab10e5-etc-machine-id\") pod \"45d711fe-44a0-4167-8155-e31794ab10e5\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.503924 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-config-data-custom\") pod \"45d711fe-44a0-4167-8155-e31794ab10e5\" (UID: \"45d711fe-44a0-4167-8155-e31794ab10e5\") " Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.505826 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45d711fe-44a0-4167-8155-e31794ab10e5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "45d711fe-44a0-4167-8155-e31794ab10e5" (UID: "45d711fe-44a0-4167-8155-e31794ab10e5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.511579 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45d711fe-44a0-4167-8155-e31794ab10e5-kube-api-access-8sq52" (OuterVolumeSpecName: "kube-api-access-8sq52") pod "45d711fe-44a0-4167-8155-e31794ab10e5" (UID: "45d711fe-44a0-4167-8155-e31794ab10e5"). InnerVolumeSpecName "kube-api-access-8sq52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.512099 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "45d711fe-44a0-4167-8155-e31794ab10e5" (UID: "45d711fe-44a0-4167-8155-e31794ab10e5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.516095 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-scripts" (OuterVolumeSpecName: "scripts") pod "45d711fe-44a0-4167-8155-e31794ab10e5" (UID: "45d711fe-44a0-4167-8155-e31794ab10e5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.590923 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45d711fe-44a0-4167-8155-e31794ab10e5" (UID: "45d711fe-44a0-4167-8155-e31794ab10e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.608144 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.608181 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.608194 4843 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45d711fe-44a0-4167-8155-e31794ab10e5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.608203 4843 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.608213 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sq52\" (UniqueName: \"kubernetes.io/projected/45d711fe-44a0-4167-8155-e31794ab10e5-kube-api-access-8sq52\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.635711 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-config-data" (OuterVolumeSpecName: "config-data") pod "45d711fe-44a0-4167-8155-e31794ab10e5" (UID: "45d711fe-44a0-4167-8155-e31794ab10e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.682037 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6fb7d6bc54-x5f24" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.709778 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45d711fe-44a0-4167-8155-e31794ab10e5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.811875 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-7ggtj"] Mar 18 12:33:30 crc kubenswrapper[4843]: E0318 12:33:30.812441 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d711fe-44a0-4167-8155-e31794ab10e5" containerName="probe" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.812452 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d711fe-44a0-4167-8155-e31794ab10e5" containerName="probe" Mar 18 12:33:30 crc kubenswrapper[4843]: E0318 12:33:30.812486 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7321460c-9acb-47b8-b14c-d3f17cd937ec" containerName="barbican-api-log" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.812492 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7321460c-9acb-47b8-b14c-d3f17cd937ec" containerName="barbican-api-log" Mar 18 12:33:30 crc kubenswrapper[4843]: E0318 12:33:30.812505 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d711fe-44a0-4167-8155-e31794ab10e5" containerName="cinder-scheduler" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.812511 4843 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="45d711fe-44a0-4167-8155-e31794ab10e5" containerName="cinder-scheduler" Mar 18 12:33:30 crc kubenswrapper[4843]: E0318 12:33:30.812520 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7321460c-9acb-47b8-b14c-d3f17cd937ec" containerName="barbican-api" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.812526 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7321460c-9acb-47b8-b14c-d3f17cd937ec" containerName="barbican-api" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.812707 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="7321460c-9acb-47b8-b14c-d3f17cd937ec" containerName="barbican-api" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.812716 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="7321460c-9acb-47b8-b14c-d3f17cd937ec" containerName="barbican-api-log" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.812742 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d711fe-44a0-4167-8155-e31794ab10e5" containerName="probe" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.812748 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d711fe-44a0-4167-8155-e31794ab10e5" containerName="cinder-scheduler" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.813398 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-7ggtj" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.814462 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7321460c-9acb-47b8-b14c-d3f17cd937ec-config-data-custom\") pod \"7321460c-9acb-47b8-b14c-d3f17cd937ec\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.814568 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mbsx\" (UniqueName: \"kubernetes.io/projected/7321460c-9acb-47b8-b14c-d3f17cd937ec-kube-api-access-6mbsx\") pod \"7321460c-9acb-47b8-b14c-d3f17cd937ec\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.814701 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7321460c-9acb-47b8-b14c-d3f17cd937ec-combined-ca-bundle\") pod \"7321460c-9acb-47b8-b14c-d3f17cd937ec\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.814730 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7321460c-9acb-47b8-b14c-d3f17cd937ec-logs\") pod \"7321460c-9acb-47b8-b14c-d3f17cd937ec\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.814775 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7321460c-9acb-47b8-b14c-d3f17cd937ec-config-data\") pod \"7321460c-9acb-47b8-b14c-d3f17cd937ec\" (UID: \"7321460c-9acb-47b8-b14c-d3f17cd937ec\") " Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.815564 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7321460c-9acb-47b8-b14c-d3f17cd937ec-logs" (OuterVolumeSpecName: "logs") pod "7321460c-9acb-47b8-b14c-d3f17cd937ec" (UID: "7321460c-9acb-47b8-b14c-d3f17cd937ec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.821920 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7321460c-9acb-47b8-b14c-d3f17cd937ec-kube-api-access-6mbsx" (OuterVolumeSpecName: "kube-api-access-6mbsx") pod "7321460c-9acb-47b8-b14c-d3f17cd937ec" (UID: "7321460c-9acb-47b8-b14c-d3f17cd937ec"). InnerVolumeSpecName "kube-api-access-6mbsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.824197 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7321460c-9acb-47b8-b14c-d3f17cd937ec-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7321460c-9acb-47b8-b14c-d3f17cd937ec" (UID: "7321460c-9acb-47b8-b14c-d3f17cd937ec"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.854450 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7ggtj"] Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.856021 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7321460c-9acb-47b8-b14c-d3f17cd937ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7321460c-9acb-47b8-b14c-d3f17cd937ec" (UID: "7321460c-9acb-47b8-b14c-d3f17cd937ec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.897454 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-76c6d69747-h572n"] Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.916792 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tvqb\" (UniqueName: \"kubernetes.io/projected/97556db6-d486-4ca2-9218-13e7691ae6ee-kube-api-access-8tvqb\") pod \"nova-api-db-create-7ggtj\" (UID: \"97556db6-d486-4ca2-9218-13e7691ae6ee\") " pod="openstack/nova-api-db-create-7ggtj" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.916841 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97556db6-d486-4ca2-9218-13e7691ae6ee-operator-scripts\") pod \"nova-api-db-create-7ggtj\" (UID: \"97556db6-d486-4ca2-9218-13e7691ae6ee\") " pod="openstack/nova-api-db-create-7ggtj" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.916924 4843 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7321460c-9acb-47b8-b14c-d3f17cd937ec-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.916937 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mbsx\" (UniqueName: \"kubernetes.io/projected/7321460c-9acb-47b8-b14c-d3f17cd937ec-kube-api-access-6mbsx\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.916947 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7321460c-9acb-47b8-b14c-d3f17cd937ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.916955 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/7321460c-9acb-47b8-b14c-d3f17cd937ec-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.918587 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7321460c-9acb-47b8-b14c-d3f17cd937ec-config-data" (OuterVolumeSpecName: "config-data") pod "7321460c-9acb-47b8-b14c-d3f17cd937ec" (UID: "7321460c-9acb-47b8-b14c-d3f17cd937ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.921739 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-l67jn"] Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.923433 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-l67jn" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.935822 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0dac-account-create-update-ttlb8"] Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.937215 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0dac-account-create-update-ttlb8" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.943530 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0dac-account-create-update-ttlb8"] Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.952339 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 18 12:33:30 crc kubenswrapper[4843]: I0318 12:33:30.954721 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-l67jn"] Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.012948 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-7cjkn"] Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.014305 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7cjkn" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.019204 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qlsb\" (UniqueName: \"kubernetes.io/projected/27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4-kube-api-access-6qlsb\") pod \"nova-cell1-db-create-7cjkn\" (UID: \"27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4\") " pod="openstack/nova-cell1-db-create-7cjkn" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.019277 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4-operator-scripts\") pod \"nova-cell1-db-create-7cjkn\" (UID: \"27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4\") " pod="openstack/nova-cell1-db-create-7cjkn" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.019371 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztqhn\" (UniqueName: 
\"kubernetes.io/projected/479484ef-0791-436d-91df-e50b0b4390b2-kube-api-access-ztqhn\") pod \"nova-cell0-db-create-l67jn\" (UID: \"479484ef-0791-436d-91df-e50b0b4390b2\") " pod="openstack/nova-cell0-db-create-l67jn" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.019434 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tvqb\" (UniqueName: \"kubernetes.io/projected/97556db6-d486-4ca2-9218-13e7691ae6ee-kube-api-access-8tvqb\") pod \"nova-api-db-create-7ggtj\" (UID: \"97556db6-d486-4ca2-9218-13e7691ae6ee\") " pod="openstack/nova-api-db-create-7ggtj" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.019455 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50047908-1fa0-4181-a690-f87f5d4b0a6a-operator-scripts\") pod \"nova-api-0dac-account-create-update-ttlb8\" (UID: \"50047908-1fa0-4181-a690-f87f5d4b0a6a\") " pod="openstack/nova-api-0dac-account-create-update-ttlb8" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.019480 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97556db6-d486-4ca2-9218-13e7691ae6ee-operator-scripts\") pod \"nova-api-db-create-7ggtj\" (UID: \"97556db6-d486-4ca2-9218-13e7691ae6ee\") " pod="openstack/nova-api-db-create-7ggtj" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.019526 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5m5d\" (UniqueName: \"kubernetes.io/projected/50047908-1fa0-4181-a690-f87f5d4b0a6a-kube-api-access-s5m5d\") pod \"nova-api-0dac-account-create-update-ttlb8\" (UID: \"50047908-1fa0-4181-a690-f87f5d4b0a6a\") " pod="openstack/nova-api-0dac-account-create-update-ttlb8" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.019546 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479484ef-0791-436d-91df-e50b0b4390b2-operator-scripts\") pod \"nova-cell0-db-create-l67jn\" (UID: \"479484ef-0791-436d-91df-e50b0b4390b2\") " pod="openstack/nova-cell0-db-create-l67jn" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.019632 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7321460c-9acb-47b8-b14c-d3f17cd937ec-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.026704 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97556db6-d486-4ca2-9218-13e7691ae6ee-operator-scripts\") pod \"nova-api-db-create-7ggtj\" (UID: \"97556db6-d486-4ca2-9218-13e7691ae6ee\") " pod="openstack/nova-api-db-create-7ggtj" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.035388 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7cjkn"] Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.038576 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tvqb\" (UniqueName: \"kubernetes.io/projected/97556db6-d486-4ca2-9218-13e7691ae6ee-kube-api-access-8tvqb\") pod \"nova-api-db-create-7ggtj\" (UID: \"97556db6-d486-4ca2-9218-13e7691ae6ee\") " pod="openstack/nova-api-db-create-7ggtj" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.044052 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7dc96fcc9b-lt524" podUID="8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.114014 4843 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell0-2934-account-create-update-vtchm"] Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.119477 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2934-account-create-update-vtchm" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.120546 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qlsb\" (UniqueName: \"kubernetes.io/projected/27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4-kube-api-access-6qlsb\") pod \"nova-cell1-db-create-7cjkn\" (UID: \"27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4\") " pod="openstack/nova-cell1-db-create-7cjkn" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.120591 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4-operator-scripts\") pod \"nova-cell1-db-create-7cjkn\" (UID: \"27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4\") " pod="openstack/nova-cell1-db-create-7cjkn" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.120645 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztqhn\" (UniqueName: \"kubernetes.io/projected/479484ef-0791-436d-91df-e50b0b4390b2-kube-api-access-ztqhn\") pod \"nova-cell0-db-create-l67jn\" (UID: \"479484ef-0791-436d-91df-e50b0b4390b2\") " pod="openstack/nova-cell0-db-create-l67jn" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.120707 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50047908-1fa0-4181-a690-f87f5d4b0a6a-operator-scripts\") pod \"nova-api-0dac-account-create-update-ttlb8\" (UID: \"50047908-1fa0-4181-a690-f87f5d4b0a6a\") " pod="openstack/nova-api-0dac-account-create-update-ttlb8" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.120757 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s5m5d\" (UniqueName: \"kubernetes.io/projected/50047908-1fa0-4181-a690-f87f5d4b0a6a-kube-api-access-s5m5d\") pod \"nova-api-0dac-account-create-update-ttlb8\" (UID: \"50047908-1fa0-4181-a690-f87f5d4b0a6a\") " pod="openstack/nova-api-0dac-account-create-update-ttlb8" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.120776 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479484ef-0791-436d-91df-e50b0b4390b2-operator-scripts\") pod \"nova-cell0-db-create-l67jn\" (UID: \"479484ef-0791-436d-91df-e50b0b4390b2\") " pod="openstack/nova-cell0-db-create-l67jn" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.121419 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479484ef-0791-436d-91df-e50b0b4390b2-operator-scripts\") pod \"nova-cell0-db-create-l67jn\" (UID: \"479484ef-0791-436d-91df-e50b0b4390b2\") " pod="openstack/nova-cell0-db-create-l67jn" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.122152 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4-operator-scripts\") pod \"nova-cell1-db-create-7cjkn\" (UID: \"27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4\") " pod="openstack/nova-cell1-db-create-7cjkn" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.122917 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50047908-1fa0-4181-a690-f87f5d4b0a6a-operator-scripts\") pod \"nova-api-0dac-account-create-update-ttlb8\" (UID: \"50047908-1fa0-4181-a690-f87f5d4b0a6a\") " pod="openstack/nova-api-0dac-account-create-update-ttlb8" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.127150 4843 reflector.go:368] Caches populated 
for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.140115 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2934-account-create-update-vtchm"] Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.143099 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7ggtj" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.144144 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5m5d\" (UniqueName: \"kubernetes.io/projected/50047908-1fa0-4181-a690-f87f5d4b0a6a-kube-api-access-s5m5d\") pod \"nova-api-0dac-account-create-update-ttlb8\" (UID: \"50047908-1fa0-4181-a690-f87f5d4b0a6a\") " pod="openstack/nova-api-0dac-account-create-update-ttlb8" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.156557 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztqhn\" (UniqueName: \"kubernetes.io/projected/479484ef-0791-436d-91df-e50b0b4390b2-kube-api-access-ztqhn\") pod \"nova-cell0-db-create-l67jn\" (UID: \"479484ef-0791-436d-91df-e50b0b4390b2\") " pod="openstack/nova-cell0-db-create-l67jn" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.161487 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qlsb\" (UniqueName: \"kubernetes.io/projected/27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4-kube-api-access-6qlsb\") pod \"nova-cell1-db-create-7cjkn\" (UID: \"27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4\") " pod="openstack/nova-cell1-db-create-7cjkn" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.186938 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.187696 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="8ed72136-75da-4a94-a22a-c2511b6bb45a" containerName="glance-httpd" containerID="cri-o://80974ee1c1587c80aaa6ca7d23a6ee4bb548ba7b77992ab0e0be1c8c208d2f69" gracePeriod=30 Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.188239 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8ed72136-75da-4a94-a22a-c2511b6bb45a" containerName="glance-log" containerID="cri-o://629d7f5287d74b525ef7341e141dde0610d2ccd86069f9f863b31b01d4ac220d" gracePeriod=30 Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.221712 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ed163bc-a192-464b-9025-43220a0864b7-operator-scripts\") pod \"nova-cell0-2934-account-create-update-vtchm\" (UID: \"7ed163bc-a192-464b-9025-43220a0864b7\") " pod="openstack/nova-cell0-2934-account-create-update-vtchm" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.221977 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v549x\" (UniqueName: \"kubernetes.io/projected/7ed163bc-a192-464b-9025-43220a0864b7-kube-api-access-v549x\") pod \"nova-cell0-2934-account-create-update-vtchm\" (UID: \"7ed163bc-a192-464b-9025-43220a0864b7\") " pod="openstack/nova-cell0-2934-account-create-update-vtchm" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.251795 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-l67jn" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.266536 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0dac-account-create-update-ttlb8" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.312569 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-af08-account-create-update-5q5p4"] Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.317642 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-af08-account-create-update-5q5p4" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.319485 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.324006 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ed163bc-a192-464b-9025-43220a0864b7-operator-scripts\") pod \"nova-cell0-2934-account-create-update-vtchm\" (UID: \"7ed163bc-a192-464b-9025-43220a0864b7\") " pod="openstack/nova-cell0-2934-account-create-update-vtchm" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.324042 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf-operator-scripts\") pod \"nova-cell1-af08-account-create-update-5q5p4\" (UID: \"3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf\") " pod="openstack/nova-cell1-af08-account-create-update-5q5p4" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.324066 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flx8z\" (UniqueName: \"kubernetes.io/projected/3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf-kube-api-access-flx8z\") pod \"nova-cell1-af08-account-create-update-5q5p4\" (UID: \"3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf\") " pod="openstack/nova-cell1-af08-account-create-update-5q5p4" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 
12:33:31.324124 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v549x\" (UniqueName: \"kubernetes.io/projected/7ed163bc-a192-464b-9025-43220a0864b7-kube-api-access-v549x\") pod \"nova-cell0-2934-account-create-update-vtchm\" (UID: \"7ed163bc-a192-464b-9025-43220a0864b7\") " pod="openstack/nova-cell0-2934-account-create-update-vtchm" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.325139 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ed163bc-a192-464b-9025-43220a0864b7-operator-scripts\") pod \"nova-cell0-2934-account-create-update-vtchm\" (UID: \"7ed163bc-a192-464b-9025-43220a0864b7\") " pod="openstack/nova-cell0-2934-account-create-update-vtchm" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.356918 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7cjkn" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.364851 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v549x\" (UniqueName: \"kubernetes.io/projected/7ed163bc-a192-464b-9025-43220a0864b7-kube-api-access-v549x\") pod \"nova-cell0-2934-account-create-update-vtchm\" (UID: \"7ed163bc-a192-464b-9025-43220a0864b7\") " pod="openstack/nova-cell0-2934-account-create-update-vtchm" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.376431 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-af08-account-create-update-5q5p4"] Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.427212 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf-operator-scripts\") pod \"nova-cell1-af08-account-create-update-5q5p4\" (UID: \"3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf\") " 
pod="openstack/nova-cell1-af08-account-create-update-5q5p4" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.427251 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flx8z\" (UniqueName: \"kubernetes.io/projected/3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf-kube-api-access-flx8z\") pod \"nova-cell1-af08-account-create-update-5q5p4\" (UID: \"3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf\") " pod="openstack/nova-cell1-af08-account-create-update-5q5p4" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.431554 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf-operator-scripts\") pod \"nova-cell1-af08-account-create-update-5q5p4\" (UID: \"3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf\") " pod="openstack/nova-cell1-af08-account-create-update-5q5p4" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.451070 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flx8z\" (UniqueName: \"kubernetes.io/projected/3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf-kube-api-access-flx8z\") pod \"nova-cell1-af08-account-create-update-5q5p4\" (UID: \"3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf\") " pod="openstack/nova-cell1-af08-account-create-update-5q5p4" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.460175 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b7b95404-5989-4c88-969b-4e9afaaebe8e","Type":"ContainerStarted","Data":"94b05c3809b37b1ae967d2a94eed22f360f7c39e69fa12fc20fc336abcedc40e"} Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.464297 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2934-account-create-update-vtchm" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.486364 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76c6d69747-h572n" event={"ID":"5001539a-ee9d-44c9-bcab-58e3720323ae","Type":"ContainerStarted","Data":"2a46ada32c0c6ed205f27ef98c0462a9b94c25ce6b4736952f252176fc6e97d8"} Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.486419 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76c6d69747-h572n" event={"ID":"5001539a-ee9d-44c9-bcab-58e3720323ae","Type":"ContainerStarted","Data":"8b42058afd3a7a020fe04d2580f7df651795c7773a90ee26f6bdcbf9cb1b851f"} Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.498791 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.628944681 podStartE2EDuration="15.498770895s" podCreationTimestamp="2026-03-18 12:33:16 +0000 UTC" firstStartedPulling="2026-03-18 12:33:17.428301769 +0000 UTC m=+1431.144127293" lastFinishedPulling="2026-03-18 12:33:30.298127983 +0000 UTC m=+1444.013953507" observedRunningTime="2026-03-18 12:33:31.480481206 +0000 UTC m=+1445.196306730" watchObservedRunningTime="2026-03-18 12:33:31.498770895 +0000 UTC m=+1445.214596419" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.511891 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fb7d6bc54-x5f24" event={"ID":"7321460c-9acb-47b8-b14c-d3f17cd937ec","Type":"ContainerDied","Data":"cec48c31b364e0ce8d41d58b2bdaf09f90d03dbef86461b69479992bcbecda31"} Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.511965 4843 scope.go:117] "RemoveContainer" containerID="de5293daf4d4080c399e13311d5f89c19ae291442390ec8def56b3ccb3ab78c6" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.512134 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6fb7d6bc54-x5f24" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.534979 4843 generic.go:334] "Generic (PLEG): container finished" podID="8ed72136-75da-4a94-a22a-c2511b6bb45a" containerID="629d7f5287d74b525ef7341e141dde0610d2ccd86069f9f863b31b01d4ac220d" exitCode=143 Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.535107 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.535449 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ed72136-75da-4a94-a22a-c2511b6bb45a","Type":"ContainerDied","Data":"629d7f5287d74b525ef7341e141dde0610d2ccd86069f9f863b31b01d4ac220d"} Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.553558 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-af08-account-create-update-5q5p4" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.581345 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6fb7d6bc54-x5f24"] Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.605537 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6fb7d6bc54-x5f24"] Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.609455 4843 scope.go:117] "RemoveContainer" containerID="fa81f9360ad799dfa6ed8d506fc0a45a50aec5c8f0871de611e40c7de41855b6" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.626770 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.660566 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.665172 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 
12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.672530 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.672739 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.689081 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.840385 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/480953e1-6024-4f9e-9f0b-a14f95a047cc-config-data\") pod \"cinder-scheduler-0\" (UID: \"480953e1-6024-4f9e-9f0b-a14f95a047cc\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.840425 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd7rj\" (UniqueName: \"kubernetes.io/projected/480953e1-6024-4f9e-9f0b-a14f95a047cc-kube-api-access-gd7rj\") pod \"cinder-scheduler-0\" (UID: \"480953e1-6024-4f9e-9f0b-a14f95a047cc\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.840489 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/480953e1-6024-4f9e-9f0b-a14f95a047cc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"480953e1-6024-4f9e-9f0b-a14f95a047cc\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.840539 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/480953e1-6024-4f9e-9f0b-a14f95a047cc-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"480953e1-6024-4f9e-9f0b-a14f95a047cc\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.840566 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/480953e1-6024-4f9e-9f0b-a14f95a047cc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"480953e1-6024-4f9e-9f0b-a14f95a047cc\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.840611 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/480953e1-6024-4f9e-9f0b-a14f95a047cc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"480953e1-6024-4f9e-9f0b-a14f95a047cc\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.941694 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/480953e1-6024-4f9e-9f0b-a14f95a047cc-config-data\") pod \"cinder-scheduler-0\" (UID: \"480953e1-6024-4f9e-9f0b-a14f95a047cc\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.941739 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd7rj\" (UniqueName: \"kubernetes.io/projected/480953e1-6024-4f9e-9f0b-a14f95a047cc-kube-api-access-gd7rj\") pod \"cinder-scheduler-0\" (UID: \"480953e1-6024-4f9e-9f0b-a14f95a047cc\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.941811 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/480953e1-6024-4f9e-9f0b-a14f95a047cc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"480953e1-6024-4f9e-9f0b-a14f95a047cc\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:31 crc 
kubenswrapper[4843]: I0318 12:33:31.941857 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/480953e1-6024-4f9e-9f0b-a14f95a047cc-scripts\") pod \"cinder-scheduler-0\" (UID: \"480953e1-6024-4f9e-9f0b-a14f95a047cc\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.941889 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/480953e1-6024-4f9e-9f0b-a14f95a047cc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"480953e1-6024-4f9e-9f0b-a14f95a047cc\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.941926 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/480953e1-6024-4f9e-9f0b-a14f95a047cc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"480953e1-6024-4f9e-9f0b-a14f95a047cc\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.942039 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/480953e1-6024-4f9e-9f0b-a14f95a047cc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"480953e1-6024-4f9e-9f0b-a14f95a047cc\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.947634 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/480953e1-6024-4f9e-9f0b-a14f95a047cc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"480953e1-6024-4f9e-9f0b-a14f95a047cc\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.952559 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/480953e1-6024-4f9e-9f0b-a14f95a047cc-config-data\") pod \"cinder-scheduler-0\" (UID: \"480953e1-6024-4f9e-9f0b-a14f95a047cc\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.952608 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/480953e1-6024-4f9e-9f0b-a14f95a047cc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"480953e1-6024-4f9e-9f0b-a14f95a047cc\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.953636 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/480953e1-6024-4f9e-9f0b-a14f95a047cc-scripts\") pod \"cinder-scheduler-0\" (UID: \"480953e1-6024-4f9e-9f0b-a14f95a047cc\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:31 crc kubenswrapper[4843]: I0318 12:33:31.969762 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd7rj\" (UniqueName: \"kubernetes.io/projected/480953e1-6024-4f9e-9f0b-a14f95a047cc-kube-api-access-gd7rj\") pod \"cinder-scheduler-0\" (UID: \"480953e1-6024-4f9e-9f0b-a14f95a047cc\") " pod="openstack/cinder-scheduler-0" Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.038049 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.038791 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7ggtj"] Mar 18 12:33:32 crc kubenswrapper[4843]: W0318 12:33:32.038962 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97556db6_d486_4ca2_9218_13e7691ae6ee.slice/crio-c5ca7d1bb8703aa28a9d97ecc30286801c4fbf83e762c98a22ea39b87e20f829 WatchSource:0}: Error finding container c5ca7d1bb8703aa28a9d97ecc30286801c4fbf83e762c98a22ea39b87e20f829: Status 404 returned error can't find the container with id c5ca7d1bb8703aa28a9d97ecc30286801c4fbf83e762c98a22ea39b87e20f829 Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.172846 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0dac-account-create-update-ttlb8"] Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.296782 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7cjkn"] Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.303807 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-l67jn"] Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.332724 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2934-account-create-update-vtchm"] Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.437099 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-af08-account-create-update-5q5p4"] Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.548043 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0dac-account-create-update-ttlb8" event={"ID":"50047908-1fa0-4181-a690-f87f5d4b0a6a","Type":"ContainerStarted","Data":"8d928a4c134e013085bf463bbf4b6e935645e2bdea80e0e9e44784c294ec7395"} Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 
12:33:32.548095 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0dac-account-create-update-ttlb8" event={"ID":"50047908-1fa0-4181-a690-f87f5d4b0a6a","Type":"ContainerStarted","Data":"02ecc17bb4b16bc82e4fa7d006c0b7e024221358f6038ffe33efd236da8c973b"}
Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.554485 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76c6d69747-h572n" event={"ID":"5001539a-ee9d-44c9-bcab-58e3720323ae","Type":"ContainerStarted","Data":"795df17092fdf33bbe93ccf50872f93e851bb0526d049eb65c8ef71fbfc7624c"}
Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.555480 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-76c6d69747-h572n"
Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.555503 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-76c6d69747-h572n"
Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.571542 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-l67jn" event={"ID":"479484ef-0791-436d-91df-e50b0b4390b2","Type":"ContainerStarted","Data":"97c7c8c88ad5f8a90ae2f8dada937bc7a1a67fc3ff2835421ef578766a67170e"}
Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.574615 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2934-account-create-update-vtchm" event={"ID":"7ed163bc-a192-464b-9025-43220a0864b7","Type":"ContainerStarted","Data":"2e447dfe76b336042bfa2b2385e2beb6fe82b3b9ba97f987fc3ef244f3c8b1cd"}
Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.575912 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7cjkn" event={"ID":"27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4","Type":"ContainerStarted","Data":"f8d929fe9e019317e2c9a0115bd7339363c4961f0fa67c6e3f82c996fd8db0ad"}
Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.580159 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.580380 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.584128 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7ggtj" event={"ID":"97556db6-d486-4ca2-9218-13e7691ae6ee","Type":"ContainerStarted","Data":"5375cdbdf6aa2c6b49317e91459362258a0b5dea21f6782dc26054d721dd2c4d"}
Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.584152 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7ggtj" event={"ID":"97556db6-d486-4ca2-9218-13e7691ae6ee","Type":"ContainerStarted","Data":"c5ca7d1bb8703aa28a9d97ecc30286801c4fbf83e762c98a22ea39b87e20f829"}
Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.584987 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0dac-account-create-update-ttlb8" podStartSLOduration=2.584966156 podStartE2EDuration="2.584966156s" podCreationTimestamp="2026-03-18 12:33:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:32.562705393 +0000 UTC m=+1446.278530917" watchObservedRunningTime="2026-03-18 12:33:32.584966156 +0000 UTC m=+1446.300791680"
Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.590519 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-af08-account-create-update-5q5p4" event={"ID":"3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf","Type":"ContainerStarted","Data":"060967a71b74949095b279fdf234e60cc82ea5f0bd7c6dd62843a1bb81e1d473"}
Mar 18 12:33:32 crc kubenswrapper[4843]: W0318 12:33:32.608055 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod480953e1_6024_4f9e_9f0b_a14f95a047cc.slice/crio-3e6ef6ccdeab8084310bb04dd70398bc3be7184e3c6cee9370687582d757c052 WatchSource:0}: Error finding container 3e6ef6ccdeab8084310bb04dd70398bc3be7184e3c6cee9370687582d757c052: Status 404 returned error can't find the container with id 3e6ef6ccdeab8084310bb04dd70398bc3be7184e3c6cee9370687582d757c052
Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.608682 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-76c6d69747-h572n" podStartSLOduration=7.608659469 podStartE2EDuration="7.608659469s" podCreationTimestamp="2026-03-18 12:33:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:32.589298449 +0000 UTC m=+1446.305123973" watchObservedRunningTime="2026-03-18 12:33:32.608659469 +0000 UTC m=+1446.324485003"
Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.892559 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7555f69bf7-z6jmw"
Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.958325 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-55f88f498d-gppxz"]
Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.958554 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-55f88f498d-gppxz" podUID="653a3476-1fa6-4afc-a036-df13d5a0c6e6" containerName="neutron-api" containerID="cri-o://dce2758a1c7d1c49b477ae393897a09e8d1a92c6cef613b6518c2f44ceef130b" gracePeriod=30
Mar 18 12:33:32 crc kubenswrapper[4843]: I0318 12:33:32.959240 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-55f88f498d-gppxz" podUID="653a3476-1fa6-4afc-a036-df13d5a0c6e6" containerName="neutron-httpd" containerID="cri-o://a671e7800a3ff26c9e5e699687d50c557282c4272376ce2c7e1b0f481a7ebb3b" gracePeriod=30
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.002331 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45d711fe-44a0-4167-8155-e31794ab10e5" path="/var/lib/kubelet/pods/45d711fe-44a0-4167-8155-e31794ab10e5/volumes"
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.003085 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7321460c-9acb-47b8-b14c-d3f17cd937ec" path="/var/lib/kubelet/pods/7321460c-9acb-47b8-b14c-d3f17cd937ec/volumes"
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.657876 4843 generic.go:334] "Generic (PLEG): container finished" podID="479484ef-0791-436d-91df-e50b0b4390b2" containerID="ce9e4149d7240e54ddab26e5ab4fc65e28f66ffa281e256fa2b07abbbabef878" exitCode=0
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.658037 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-l67jn" event={"ID":"479484ef-0791-436d-91df-e50b0b4390b2","Type":"ContainerDied","Data":"ce9e4149d7240e54ddab26e5ab4fc65e28f66ffa281e256fa2b07abbbabef878"}
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.665343 4843 generic.go:334] "Generic (PLEG): container finished" podID="faa05482-2d1f-43b3-93a2-94178afbc8a7" containerID="e952b22fa5cad4c50fae86de9f9e0bb81b48b450d35116902db80c66fe04e6c6" exitCode=0
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.665403 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"faa05482-2d1f-43b3-93a2-94178afbc8a7","Type":"ContainerDied","Data":"e952b22fa5cad4c50fae86de9f9e0bb81b48b450d35116902db80c66fe04e6c6"}
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.675195 4843 generic.go:334] "Generic (PLEG): container finished" podID="27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4" containerID="20db8d802e7f4d1bdddd8859dd325e7f48989815b7ffbdf8b70d377128356192" exitCode=0
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.675319 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7cjkn" event={"ID":"27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4","Type":"ContainerDied","Data":"20db8d802e7f4d1bdddd8859dd325e7f48989815b7ffbdf8b70d377128356192"}
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.692614 4843 generic.go:334] "Generic (PLEG): container finished" podID="3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf" containerID="cec1fec0f0a4d9106d73fb660bb462b724583e5825f9fb7cf9c0ede2cb623a3e" exitCode=0
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.692694 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-af08-account-create-update-5q5p4" event={"ID":"3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf","Type":"ContainerDied","Data":"cec1fec0f0a4d9106d73fb660bb462b724583e5825f9fb7cf9c0ede2cb623a3e"}
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.701691 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"480953e1-6024-4f9e-9f0b-a14f95a047cc","Type":"ContainerStarted","Data":"b76858acbda6e4dc48920e9957ab0f2f57f4171d8c0177d6040b86663a2dfaa3"}
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.701739 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"480953e1-6024-4f9e-9f0b-a14f95a047cc","Type":"ContainerStarted","Data":"3e6ef6ccdeab8084310bb04dd70398bc3be7184e3c6cee9370687582d757c052"}
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.705231 4843 generic.go:334] "Generic (PLEG): container finished" podID="7ed163bc-a192-464b-9025-43220a0864b7" containerID="5666649acb1afd40e1f7f5b2d8526cac42defb517c4ec93d694206562175521e" exitCode=0
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.705303 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2934-account-create-update-vtchm" event={"ID":"7ed163bc-a192-464b-9025-43220a0864b7","Type":"ContainerDied","Data":"5666649acb1afd40e1f7f5b2d8526cac42defb517c4ec93d694206562175521e"}
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.706332 4843 generic.go:334] "Generic (PLEG): container finished" podID="97556db6-d486-4ca2-9218-13e7691ae6ee" containerID="5375cdbdf6aa2c6b49317e91459362258a0b5dea21f6782dc26054d721dd2c4d" exitCode=0
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.706377 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7ggtj" event={"ID":"97556db6-d486-4ca2-9218-13e7691ae6ee","Type":"ContainerDied","Data":"5375cdbdf6aa2c6b49317e91459362258a0b5dea21f6782dc26054d721dd2c4d"}
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.716939 4843 generic.go:334] "Generic (PLEG): container finished" podID="653a3476-1fa6-4afc-a036-df13d5a0c6e6" containerID="a671e7800a3ff26c9e5e699687d50c557282c4272376ce2c7e1b0f481a7ebb3b" exitCode=0
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.717046 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55f88f498d-gppxz" event={"ID":"653a3476-1fa6-4afc-a036-df13d5a0c6e6","Type":"ContainerDied","Data":"a671e7800a3ff26c9e5e699687d50c557282c4272376ce2c7e1b0f481a7ebb3b"}
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.736118 4843 generic.go:334] "Generic (PLEG): container finished" podID="50047908-1fa0-4181-a690-f87f5d4b0a6a" containerID="8d928a4c134e013085bf463bbf4b6e935645e2bdea80e0e9e44784c294ec7395" exitCode=0
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.737177 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0dac-account-create-update-ttlb8" event={"ID":"50047908-1fa0-4181-a690-f87f5d4b0a6a","Type":"ContainerDied","Data":"8d928a4c134e013085bf463bbf4b6e935645e2bdea80e0e9e44784c294ec7395"}
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.848937 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.995724 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-scripts\") pod \"faa05482-2d1f-43b3-93a2-94178afbc8a7\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") "
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.995866 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faa05482-2d1f-43b3-93a2-94178afbc8a7-logs\") pod \"faa05482-2d1f-43b3-93a2-94178afbc8a7\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") "
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.995927 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-public-tls-certs\") pod \"faa05482-2d1f-43b3-93a2-94178afbc8a7\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") "
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.996048 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwpgt\" (UniqueName: \"kubernetes.io/projected/faa05482-2d1f-43b3-93a2-94178afbc8a7-kube-api-access-qwpgt\") pod \"faa05482-2d1f-43b3-93a2-94178afbc8a7\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") "
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.996102 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"faa05482-2d1f-43b3-93a2-94178afbc8a7\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") "
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.996133 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/faa05482-2d1f-43b3-93a2-94178afbc8a7-httpd-run\") pod \"faa05482-2d1f-43b3-93a2-94178afbc8a7\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") "
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.996185 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-combined-ca-bundle\") pod \"faa05482-2d1f-43b3-93a2-94178afbc8a7\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") "
Mar 18 12:33:33 crc kubenswrapper[4843]: I0318 12:33:33.996306 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-config-data\") pod \"faa05482-2d1f-43b3-93a2-94178afbc8a7\" (UID: \"faa05482-2d1f-43b3-93a2-94178afbc8a7\") "
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.001638 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faa05482-2d1f-43b3-93a2-94178afbc8a7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "faa05482-2d1f-43b3-93a2-94178afbc8a7" (UID: "faa05482-2d1f-43b3-93a2-94178afbc8a7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.002117 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faa05482-2d1f-43b3-93a2-94178afbc8a7-logs" (OuterVolumeSpecName: "logs") pod "faa05482-2d1f-43b3-93a2-94178afbc8a7" (UID: "faa05482-2d1f-43b3-93a2-94178afbc8a7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.005851 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-scripts" (OuterVolumeSpecName: "scripts") pod "faa05482-2d1f-43b3-93a2-94178afbc8a7" (UID: "faa05482-2d1f-43b3-93a2-94178afbc8a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.007874 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faa05482-2d1f-43b3-93a2-94178afbc8a7-kube-api-access-qwpgt" (OuterVolumeSpecName: "kube-api-access-qwpgt") pod "faa05482-2d1f-43b3-93a2-94178afbc8a7" (UID: "faa05482-2d1f-43b3-93a2-94178afbc8a7"). InnerVolumeSpecName "kube-api-access-qwpgt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.010814 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "faa05482-2d1f-43b3-93a2-94178afbc8a7" (UID: "faa05482-2d1f-43b3-93a2-94178afbc8a7"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.063214 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "faa05482-2d1f-43b3-93a2-94178afbc8a7" (UID: "faa05482-2d1f-43b3-93a2-94178afbc8a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.074596 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7ggtj"
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.098744 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "faa05482-2d1f-43b3-93a2-94178afbc8a7" (UID: "faa05482-2d1f-43b3-93a2-94178afbc8a7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.101252 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwpgt\" (UniqueName: \"kubernetes.io/projected/faa05482-2d1f-43b3-93a2-94178afbc8a7-kube-api-access-qwpgt\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.101293 4843 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.101305 4843 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/faa05482-2d1f-43b3-93a2-94178afbc8a7-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.101316 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.101325 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.101334 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faa05482-2d1f-43b3-93a2-94178afbc8a7-logs\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.101342 4843 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.121765 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-config-data" (OuterVolumeSpecName: "config-data") pod "faa05482-2d1f-43b3-93a2-94178afbc8a7" (UID: "faa05482-2d1f-43b3-93a2-94178afbc8a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.149242 4843 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.201751 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97556db6-d486-4ca2-9218-13e7691ae6ee-operator-scripts\") pod \"97556db6-d486-4ca2-9218-13e7691ae6ee\" (UID: \"97556db6-d486-4ca2-9218-13e7691ae6ee\") "
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.201812 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tvqb\" (UniqueName: \"kubernetes.io/projected/97556db6-d486-4ca2-9218-13e7691ae6ee-kube-api-access-8tvqb\") pod \"97556db6-d486-4ca2-9218-13e7691ae6ee\" (UID: \"97556db6-d486-4ca2-9218-13e7691ae6ee\") "
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.202092 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faa05482-2d1f-43b3-93a2-94178afbc8a7-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.202104 4843 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.202903 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97556db6-d486-4ca2-9218-13e7691ae6ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "97556db6-d486-4ca2-9218-13e7691ae6ee" (UID: "97556db6-d486-4ca2-9218-13e7691ae6ee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.205815 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97556db6-d486-4ca2-9218-13e7691ae6ee-kube-api-access-8tvqb" (OuterVolumeSpecName: "kube-api-access-8tvqb") pod "97556db6-d486-4ca2-9218-13e7691ae6ee" (UID: "97556db6-d486-4ca2-9218-13e7691ae6ee"). InnerVolumeSpecName "kube-api-access-8tvqb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.304541 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97556db6-d486-4ca2-9218-13e7691ae6ee-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.304593 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tvqb\" (UniqueName: \"kubernetes.io/projected/97556db6-d486-4ca2-9218-13e7691ae6ee-kube-api-access-8tvqb\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.760389 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"faa05482-2d1f-43b3-93a2-94178afbc8a7","Type":"ContainerDied","Data":"f5741fd3765fa23a1bfe77a07da1e68a853d48f3e3dd17ff299d2f253843414c"}
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.760720 4843 scope.go:117] "RemoveContainer" containerID="e952b22fa5cad4c50fae86de9f9e0bb81b48b450d35116902db80c66fe04e6c6"
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.760897 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.773871 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7ggtj" event={"ID":"97556db6-d486-4ca2-9218-13e7691ae6ee","Type":"ContainerDied","Data":"c5ca7d1bb8703aa28a9d97ecc30286801c4fbf83e762c98a22ea39b87e20f829"}
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.773911 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5ca7d1bb8703aa28a9d97ecc30286801c4fbf83e762c98a22ea39b87e20f829"
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.773971 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7ggtj"
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.808557 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"480953e1-6024-4f9e-9f0b-a14f95a047cc","Type":"ContainerStarted","Data":"3072fc59238b3f0cf0b066d18e3f1d52db9abe6c5dd506fd23c0cc92baea0e5d"}
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.817963 4843 scope.go:117] "RemoveContainer" containerID="e5120395adcb25710ae79cf4e7aca3389748547e1ac64b3de7f3cc8c340582b9"
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.831591 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.835478 4843 generic.go:334] "Generic (PLEG): container finished" podID="8ed72136-75da-4a94-a22a-c2511b6bb45a" containerID="80974ee1c1587c80aaa6ca7d23a6ee4bb548ba7b77992ab0e0be1c8c208d2f69" exitCode=0
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.835703 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ed72136-75da-4a94-a22a-c2511b6bb45a","Type":"ContainerDied","Data":"80974ee1c1587c80aaa6ca7d23a6ee4bb548ba7b77992ab0e0be1c8c208d2f69"}
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.842816 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.857065 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 12:33:34 crc kubenswrapper[4843]: E0318 12:33:34.857881 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa05482-2d1f-43b3-93a2-94178afbc8a7" containerName="glance-httpd"
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.857900 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa05482-2d1f-43b3-93a2-94178afbc8a7" containerName="glance-httpd"
Mar 18 12:33:34 crc kubenswrapper[4843]: E0318 12:33:34.857916 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa05482-2d1f-43b3-93a2-94178afbc8a7" containerName="glance-log"
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.857924 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa05482-2d1f-43b3-93a2-94178afbc8a7" containerName="glance-log"
Mar 18 12:33:34 crc kubenswrapper[4843]: E0318 12:33:34.857941 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97556db6-d486-4ca2-9218-13e7691ae6ee" containerName="mariadb-database-create"
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.857947 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="97556db6-d486-4ca2-9218-13e7691ae6ee" containerName="mariadb-database-create"
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.858120 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="faa05482-2d1f-43b3-93a2-94178afbc8a7" containerName="glance-httpd"
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.858138 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="97556db6-d486-4ca2-9218-13e7691ae6ee" containerName="mariadb-database-create"
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.858153 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="faa05482-2d1f-43b3-93a2-94178afbc8a7" containerName="glance-log"
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.859233 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.872620 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.87259784 podStartE2EDuration="3.87259784s" podCreationTimestamp="2026-03-18 12:33:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:34.854089734 +0000 UTC m=+1448.569915258" watchObservedRunningTime="2026-03-18 12:33:34.87259784 +0000 UTC m=+1448.588423364"
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.879668 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.879935 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.893057 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 12:33:34 crc kubenswrapper[4843]: I0318 12:33:34.934279 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.006366 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faa05482-2d1f-43b3-93a2-94178afbc8a7" path="/var/lib/kubelet/pods/faa05482-2d1f-43b3-93a2-94178afbc8a7/volumes"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.022567 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.022685 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n6rm\" (UniqueName: \"kubernetes.io/projected/f0845b03-51e4-4a41-9b69-895e588930be-kube-api-access-4n6rm\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.022729 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0845b03-51e4-4a41-9b69-895e588930be-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.022745 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0845b03-51e4-4a41-9b69-895e588930be-logs\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.022769 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0845b03-51e4-4a41-9b69-895e588930be-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.022811 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0845b03-51e4-4a41-9b69-895e588930be-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.022842 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0845b03-51e4-4a41-9b69-895e588930be-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.022883 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0845b03-51e4-4a41-9b69-895e588930be-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.124256 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-combined-ca-bundle\") pod \"8ed72136-75da-4a94-a22a-c2511b6bb45a\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") "
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.124371 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d89l5\" (UniqueName: \"kubernetes.io/projected/8ed72136-75da-4a94-a22a-c2511b6bb45a-kube-api-access-d89l5\") pod \"8ed72136-75da-4a94-a22a-c2511b6bb45a\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") "
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.124400 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ed72136-75da-4a94-a22a-c2511b6bb45a-httpd-run\") pod \"8ed72136-75da-4a94-a22a-c2511b6bb45a\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") "
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.124508 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-config-data\") pod \"8ed72136-75da-4a94-a22a-c2511b6bb45a\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") "
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.124546 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-scripts\") pod \"8ed72136-75da-4a94-a22a-c2511b6bb45a\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") "
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.124598 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ed72136-75da-4a94-a22a-c2511b6bb45a-logs\") pod \"8ed72136-75da-4a94-a22a-c2511b6bb45a\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") "
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.124678 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-internal-tls-certs\") pod \"8ed72136-75da-4a94-a22a-c2511b6bb45a\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") "
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.124710 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"8ed72136-75da-4a94-a22a-c2511b6bb45a\" (UID: \"8ed72136-75da-4a94-a22a-c2511b6bb45a\") "
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.124932 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0845b03-51e4-4a41-9b69-895e588930be-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.125883 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0845b03-51e4-4a41-9b69-895e588930be-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.125954 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0845b03-51e4-4a41-9b69-895e588930be-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.125992 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.126097 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n6rm\" (UniqueName: \"kubernetes.io/projected/f0845b03-51e4-4a41-9b69-895e588930be-kube-api-access-4n6rm\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.126128 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0845b03-51e4-4a41-9b69-895e588930be-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.126165 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0845b03-51e4-4a41-9b69-895e588930be-logs\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.126191 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0845b03-51e4-4a41-9b69-895e588930be-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.126974 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ed72136-75da-4a94-a22a-c2511b6bb45a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8ed72136-75da-4a94-a22a-c2511b6bb45a" (UID: "8ed72136-75da-4a94-a22a-c2511b6bb45a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.127182 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ed72136-75da-4a94-a22a-c2511b6bb45a-logs" (OuterVolumeSpecName: "logs") pod "8ed72136-75da-4a94-a22a-c2511b6bb45a" (UID: "8ed72136-75da-4a94-a22a-c2511b6bb45a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.127456 4843 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.134148 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0845b03-51e4-4a41-9b69-895e588930be-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.136541 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0845b03-51e4-4a41-9b69-895e588930be-logs\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.149007 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0845b03-51e4-4a41-9b69-895e588930be-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0"
Mar 18 12:33:35 crc
kubenswrapper[4843]: I0318 12:33:35.167100 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0845b03-51e4-4a41-9b69-895e588930be-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.168580 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ed72136-75da-4a94-a22a-c2511b6bb45a-kube-api-access-d89l5" (OuterVolumeSpecName: "kube-api-access-d89l5") pod "8ed72136-75da-4a94-a22a-c2511b6bb45a" (UID: "8ed72136-75da-4a94-a22a-c2511b6bb45a"). InnerVolumeSpecName "kube-api-access-d89l5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.168882 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "8ed72136-75da-4a94-a22a-c2511b6bb45a" (UID: "8ed72136-75da-4a94-a22a-c2511b6bb45a"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.172474 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0845b03-51e4-4a41-9b69-895e588930be-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.182221 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-scripts" (OuterVolumeSpecName: "scripts") pod "8ed72136-75da-4a94-a22a-c2511b6bb45a" (UID: "8ed72136-75da-4a94-a22a-c2511b6bb45a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.182697 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0845b03-51e4-4a41-9b69-895e588930be-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.206918 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n6rm\" (UniqueName: \"kubernetes.io/projected/f0845b03-51e4-4a41-9b69-895e588930be-kube-api-access-4n6rm\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.228988 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ed72136-75da-4a94-a22a-c2511b6bb45a-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.229030 4843 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.229042 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d89l5\" (UniqueName: \"kubernetes.io/projected/8ed72136-75da-4a94-a22a-c2511b6bb45a-kube-api-access-d89l5\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.229054 4843 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ed72136-75da-4a94-a22a-c2511b6bb45a-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.229062 4843 reconciler_common.go:293] "Volume detached for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.285769 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ed72136-75da-4a94-a22a-c2511b6bb45a" (UID: "8ed72136-75da-4a94-a22a-c2511b6bb45a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.302544 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f0845b03-51e4-4a41-9b69-895e588930be\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.303859 4843 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.344162 4843 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.344350 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.368632 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8ed72136-75da-4a94-a22a-c2511b6bb45a" 
(UID: "8ed72136-75da-4a94-a22a-c2511b6bb45a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.419616 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-config-data" (OuterVolumeSpecName: "config-data") pod "8ed72136-75da-4a94-a22a-c2511b6bb45a" (UID: "8ed72136-75da-4a94-a22a-c2511b6bb45a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.447172 4843 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.447201 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed72136-75da-4a94-a22a-c2511b6bb45a-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.548484 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.561599 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2934-account-create-update-vtchm" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.651625 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v549x\" (UniqueName: \"kubernetes.io/projected/7ed163bc-a192-464b-9025-43220a0864b7-kube-api-access-v549x\") pod \"7ed163bc-a192-464b-9025-43220a0864b7\" (UID: \"7ed163bc-a192-464b-9025-43220a0864b7\") " Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.651724 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ed163bc-a192-464b-9025-43220a0864b7-operator-scripts\") pod \"7ed163bc-a192-464b-9025-43220a0864b7\" (UID: \"7ed163bc-a192-464b-9025-43220a0864b7\") " Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.656937 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ed163bc-a192-464b-9025-43220a0864b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ed163bc-a192-464b-9025-43220a0864b7" (UID: "7ed163bc-a192-464b-9025-43220a0864b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.688843 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed163bc-a192-464b-9025-43220a0864b7-kube-api-access-v549x" (OuterVolumeSpecName: "kube-api-access-v549x") pod "7ed163bc-a192-464b-9025-43220a0864b7" (UID: "7ed163bc-a192-464b-9025-43220a0864b7"). InnerVolumeSpecName "kube-api-access-v549x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.735420 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-l67jn" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.750059 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-af08-account-create-update-5q5p4" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.758939 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v549x\" (UniqueName: \"kubernetes.io/projected/7ed163bc-a192-464b-9025-43220a0864b7-kube-api-access-v549x\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.758966 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ed163bc-a192-464b-9025-43220a0864b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.759838 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0dac-account-create-update-ttlb8" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.787973 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-7cjkn" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.866603 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479484ef-0791-436d-91df-e50b0b4390b2-operator-scripts\") pod \"479484ef-0791-436d-91df-e50b0b4390b2\" (UID: \"479484ef-0791-436d-91df-e50b0b4390b2\") " Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.866688 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf-operator-scripts\") pod \"3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf\" (UID: \"3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf\") " Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.866757 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50047908-1fa0-4181-a690-f87f5d4b0a6a-operator-scripts\") pod \"50047908-1fa0-4181-a690-f87f5d4b0a6a\" (UID: \"50047908-1fa0-4181-a690-f87f5d4b0a6a\") " Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.866804 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qlsb\" (UniqueName: \"kubernetes.io/projected/27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4-kube-api-access-6qlsb\") pod \"27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4\" (UID: \"27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4\") " Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.866825 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flx8z\" (UniqueName: \"kubernetes.io/projected/3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf-kube-api-access-flx8z\") pod \"3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf\" (UID: \"3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf\") " Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.866843 4843 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-ztqhn\" (UniqueName: \"kubernetes.io/projected/479484ef-0791-436d-91df-e50b0b4390b2-kube-api-access-ztqhn\") pod \"479484ef-0791-436d-91df-e50b0b4390b2\" (UID: \"479484ef-0791-436d-91df-e50b0b4390b2\") " Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.866879 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4-operator-scripts\") pod \"27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4\" (UID: \"27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4\") " Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.866966 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5m5d\" (UniqueName: \"kubernetes.io/projected/50047908-1fa0-4181-a690-f87f5d4b0a6a-kube-api-access-s5m5d\") pod \"50047908-1fa0-4181-a690-f87f5d4b0a6a\" (UID: \"50047908-1fa0-4181-a690-f87f5d4b0a6a\") " Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.872265 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479484ef-0791-436d-91df-e50b0b4390b2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "479484ef-0791-436d-91df-e50b0b4390b2" (UID: "479484ef-0791-436d-91df-e50b0b4390b2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.872288 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf" (UID: "3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.872746 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50047908-1fa0-4181-a690-f87f5d4b0a6a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "50047908-1fa0-4181-a690-f87f5d4b0a6a" (UID: "50047908-1fa0-4181-a690-f87f5d4b0a6a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.877709 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/479484ef-0791-436d-91df-e50b0b4390b2-kube-api-access-ztqhn" (OuterVolumeSpecName: "kube-api-access-ztqhn") pod "479484ef-0791-436d-91df-e50b0b4390b2" (UID: "479484ef-0791-436d-91df-e50b0b4390b2"). InnerVolumeSpecName "kube-api-access-ztqhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.880036 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4-kube-api-access-6qlsb" (OuterVolumeSpecName: "kube-api-access-6qlsb") pod "27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4" (UID: "27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4"). InnerVolumeSpecName "kube-api-access-6qlsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.880148 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50047908-1fa0-4181-a690-f87f5d4b0a6a-kube-api-access-s5m5d" (OuterVolumeSpecName: "kube-api-access-s5m5d") pod "50047908-1fa0-4181-a690-f87f5d4b0a6a" (UID: "50047908-1fa0-4181-a690-f87f5d4b0a6a"). InnerVolumeSpecName "kube-api-access-s5m5d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.881121 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf-kube-api-access-flx8z" (OuterVolumeSpecName: "kube-api-access-flx8z") pod "3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf" (UID: "3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf"). InnerVolumeSpecName "kube-api-access-flx8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.881882 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4" (UID: "27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.883925 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0dac-account-create-update-ttlb8" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.884815 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0dac-account-create-update-ttlb8" event={"ID":"50047908-1fa0-4181-a690-f87f5d4b0a6a","Type":"ContainerDied","Data":"02ecc17bb4b16bc82e4fa7d006c0b7e024221358f6038ffe33efd236da8c973b"} Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.884869 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02ecc17bb4b16bc82e4fa7d006c0b7e024221358f6038ffe33efd236da8c973b" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.896478 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-l67jn" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.896645 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-l67jn" event={"ID":"479484ef-0791-436d-91df-e50b0b4390b2","Type":"ContainerDied","Data":"97c7c8c88ad5f8a90ae2f8dada937bc7a1a67fc3ff2835421ef578766a67170e"} Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.896750 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97c7c8c88ad5f8a90ae2f8dada937bc7a1a67fc3ff2835421ef578766a67170e" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.899193 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ed72136-75da-4a94-a22a-c2511b6bb45a","Type":"ContainerDied","Data":"1c50a1f6a680a7d49e079147dd8158d24252e7555ca1e7a8c2cb46214b52a5ba"} Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.899315 4843 scope.go:117] "RemoveContainer" containerID="80974ee1c1587c80aaa6ca7d23a6ee4bb548ba7b77992ab0e0be1c8c208d2f69" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.899500 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.911159 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2934-account-create-update-vtchm" event={"ID":"7ed163bc-a192-464b-9025-43220a0864b7","Type":"ContainerDied","Data":"2e447dfe76b336042bfa2b2385e2beb6fe82b3b9ba97f987fc3ef244f3c8b1cd"} Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.911214 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e447dfe76b336042bfa2b2385e2beb6fe82b3b9ba97f987fc3ef244f3c8b1cd" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.911295 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2934-account-create-update-vtchm" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.936723 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7cjkn" event={"ID":"27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4","Type":"ContainerDied","Data":"f8d929fe9e019317e2c9a0115bd7339363c4961f0fa67c6e3f82c996fd8db0ad"} Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.936765 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8d929fe9e019317e2c9a0115bd7339363c4961f0fa67c6e3f82c996fd8db0ad" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.937033 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7cjkn" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.945717 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.951026 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.951152 4843 scope.go:117] "RemoveContainer" containerID="629d7f5287d74b525ef7341e141dde0610d2ccd86069f9f863b31b01d4ac220d" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.965109 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-af08-account-create-update-5q5p4" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.965138 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-af08-account-create-update-5q5p4" event={"ID":"3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf","Type":"ContainerDied","Data":"060967a71b74949095b279fdf234e60cc82ea5f0bd7c6dd62843a1bb81e1d473"} Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.965191 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="060967a71b74949095b279fdf234e60cc82ea5f0bd7c6dd62843a1bb81e1d473" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.966197 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:33:35 crc kubenswrapper[4843]: E0318 12:33:35.966566 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479484ef-0791-436d-91df-e50b0b4390b2" containerName="mariadb-database-create" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.966578 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="479484ef-0791-436d-91df-e50b0b4390b2" containerName="mariadb-database-create" Mar 18 12:33:35 crc kubenswrapper[4843]: E0318 12:33:35.966592 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ed72136-75da-4a94-a22a-c2511b6bb45a" containerName="glance-log" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.966598 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed72136-75da-4a94-a22a-c2511b6bb45a" containerName="glance-log" Mar 18 12:33:35 crc kubenswrapper[4843]: E0318 12:33:35.966620 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4" containerName="mariadb-database-create" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.966627 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4" 
containerName="mariadb-database-create" Mar 18 12:33:35 crc kubenswrapper[4843]: E0318 12:33:35.966637 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed163bc-a192-464b-9025-43220a0864b7" containerName="mariadb-account-create-update" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.966643 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed163bc-a192-464b-9025-43220a0864b7" containerName="mariadb-account-create-update" Mar 18 12:33:35 crc kubenswrapper[4843]: E0318 12:33:35.966673 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ed72136-75da-4a94-a22a-c2511b6bb45a" containerName="glance-httpd" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.966678 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed72136-75da-4a94-a22a-c2511b6bb45a" containerName="glance-httpd" Mar 18 12:33:35 crc kubenswrapper[4843]: E0318 12:33:35.966691 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf" containerName="mariadb-account-create-update" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.966696 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf" containerName="mariadb-account-create-update" Mar 18 12:33:35 crc kubenswrapper[4843]: E0318 12:33:35.966708 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50047908-1fa0-4181-a690-f87f5d4b0a6a" containerName="mariadb-account-create-update" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.966714 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="50047908-1fa0-4181-a690-f87f5d4b0a6a" containerName="mariadb-account-create-update" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.966880 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="50047908-1fa0-4181-a690-f87f5d4b0a6a" containerName="mariadb-account-create-update" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.967097 4843 
memory_manager.go:354] "RemoveStaleState removing state" podUID="479484ef-0791-436d-91df-e50b0b4390b2" containerName="mariadb-database-create" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.967108 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed163bc-a192-464b-9025-43220a0864b7" containerName="mariadb-account-create-update" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.967119 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ed72136-75da-4a94-a22a-c2511b6bb45a" containerName="glance-log" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.967135 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4" containerName="mariadb-database-create" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.967144 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf" containerName="mariadb-account-create-update" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.967151 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ed72136-75da-4a94-a22a-c2511b6bb45a" containerName="glance-httpd" Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.968269 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.968926 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479484ef-0791-436d-91df-e50b0b4390b2-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.968958 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.968973 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50047908-1fa0-4181-a690-f87f5d4b0a6a-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.968986 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qlsb\" (UniqueName: \"kubernetes.io/projected/27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4-kube-api-access-6qlsb\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.969029 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flx8z\" (UniqueName: \"kubernetes.io/projected/3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf-kube-api-access-flx8z\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.969045 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztqhn\" (UniqueName: \"kubernetes.io/projected/479484ef-0791-436d-91df-e50b0b4390b2-kube-api-access-ztqhn\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.969059 4843 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.969072 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5m5d\" (UniqueName: \"kubernetes.io/projected/50047908-1fa0-4181-a690-f87f5d4b0a6a-kube-api-access-s5m5d\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.971418 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.971588 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 18 12:33:35 crc kubenswrapper[4843]: I0318 12:33:35.986551 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.048228 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-76c6d69747-h572n"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.052035 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-76c6d69747-h572n"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.075740 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d9ce903-7368-4024-bb16-5563248684cc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.075823 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gndb7\" (UniqueName: \"kubernetes.io/projected/3d9ce903-7368-4024-bb16-5563248684cc-kube-api-access-gndb7\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.075864 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d9ce903-7368-4024-bb16-5563248684cc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.075944 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d9ce903-7368-4024-bb16-5563248684cc-logs\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.075984 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d9ce903-7368-4024-bb16-5563248684cc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.076014 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d9ce903-7368-4024-bb16-5563248684cc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.076035 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.076134 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d9ce903-7368-4024-bb16-5563248684cc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.178009 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d9ce903-7368-4024-bb16-5563248684cc-logs\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.178096 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d9ce903-7368-4024-bb16-5563248684cc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.178149 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d9ce903-7368-4024-bb16-5563248684cc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.178175 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.178290 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d9ce903-7368-4024-bb16-5563248684cc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.180336 4843 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.180747 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d9ce903-7368-4024-bb16-5563248684cc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.182589 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d9ce903-7368-4024-bb16-5563248684cc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.183308 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d9ce903-7368-4024-bb16-5563248684cc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.199166 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d9ce903-7368-4024-bb16-5563248684cc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.199276 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gndb7\" (UniqueName: \"kubernetes.io/projected/3d9ce903-7368-4024-bb16-5563248684cc-kube-api-access-gndb7\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.199345 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d9ce903-7368-4024-bb16-5563248684cc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.200032 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d9ce903-7368-4024-bb16-5563248684cc-logs\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.214846 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d9ce903-7368-4024-bb16-5563248684cc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.216432 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d9ce903-7368-4024-bb16-5563248684cc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.224543 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.231639 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gndb7\" (UniqueName: \"kubernetes.io/projected/3d9ce903-7368-4024-bb16-5563248684cc-kube-api-access-gndb7\") pod \"glance-default-internal-api-0\" (UID: \"3d9ce903-7368-4024-bb16-5563248684cc\") " pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: W0318 12:33:36.302916 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0845b03_51e4_4a41_9b69_895e588930be.slice/crio-a3826119e122e26c3de17d71e898a72e3672a45bb4637f9e8c7c2ee6ec93f575 WatchSource:0}: Error finding container a3826119e122e26c3de17d71e898a72e3672a45bb4637f9e8c7c2ee6ec93f575: Status 404 returned error can't find the container with id a3826119e122e26c3de17d71e898a72e3672a45bb4637f9e8c7c2ee6ec93f575
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.310337 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.343122 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.894903 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 12:33:36 crc kubenswrapper[4843]: W0318 12:33:36.894919 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d9ce903_7368_4024_bb16_5563248684cc.slice/crio-d97f9943439cf7dd9cafa8cd4a6b5e6bb5f445b179c280f88a5c51c333a7e96b WatchSource:0}: Error finding container d97f9943439cf7dd9cafa8cd4a6b5e6bb5f445b179c280f88a5c51c333a7e96b: Status 404 returned error can't find the container with id d97f9943439cf7dd9cafa8cd4a6b5e6bb5f445b179c280f88a5c51c333a7e96b
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.979054 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0845b03-51e4-4a41-9b69-895e588930be","Type":"ContainerStarted","Data":"a3826119e122e26c3de17d71e898a72e3672a45bb4637f9e8c7c2ee6ec93f575"}
Mar 18 12:33:36 crc kubenswrapper[4843]: I0318 12:33:36.982980 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d9ce903-7368-4024-bb16-5563248684cc","Type":"ContainerStarted","Data":"d97f9943439cf7dd9cafa8cd4a6b5e6bb5f445b179c280f88a5c51c333a7e96b"}
Mar 18 12:33:37 crc kubenswrapper[4843]: I0318 12:33:37.006222 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ed72136-75da-4a94-a22a-c2511b6bb45a" path="/var/lib/kubelet/pods/8ed72136-75da-4a94-a22a-c2511b6bb45a/volumes"
Mar 18 12:33:37 crc kubenswrapper[4843]: I0318 12:33:37.040765 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 18 12:33:37 crc kubenswrapper[4843]: I0318 12:33:37.714270 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-55f88f498d-gppxz"
Mar 18 12:33:37 crc kubenswrapper[4843]: I0318 12:33:37.828543 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sts2\" (UniqueName: \"kubernetes.io/projected/653a3476-1fa6-4afc-a036-df13d5a0c6e6-kube-api-access-4sts2\") pod \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\" (UID: \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") "
Mar 18 12:33:37 crc kubenswrapper[4843]: I0318 12:33:37.828769 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-combined-ca-bundle\") pod \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\" (UID: \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") "
Mar 18 12:33:37 crc kubenswrapper[4843]: I0318 12:33:37.828811 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-config\") pod \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\" (UID: \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") "
Mar 18 12:33:37 crc kubenswrapper[4843]: I0318 12:33:37.828863 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-ovndb-tls-certs\") pod \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\" (UID: \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") "
Mar 18 12:33:37 crc kubenswrapper[4843]: I0318 12:33:37.828894 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-httpd-config\") pod \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\" (UID: \"653a3476-1fa6-4afc-a036-df13d5a0c6e6\") "
Mar 18 12:33:37 crc kubenswrapper[4843]: I0318 12:33:37.835232 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/653a3476-1fa6-4afc-a036-df13d5a0c6e6-kube-api-access-4sts2" (OuterVolumeSpecName: "kube-api-access-4sts2") pod "653a3476-1fa6-4afc-a036-df13d5a0c6e6" (UID: "653a3476-1fa6-4afc-a036-df13d5a0c6e6"). InnerVolumeSpecName "kube-api-access-4sts2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:33:37 crc kubenswrapper[4843]: I0318 12:33:37.840048 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "653a3476-1fa6-4afc-a036-df13d5a0c6e6" (UID: "653a3476-1fa6-4afc-a036-df13d5a0c6e6"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:33:37 crc kubenswrapper[4843]: I0318 12:33:37.899771 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-config" (OuterVolumeSpecName: "config") pod "653a3476-1fa6-4afc-a036-df13d5a0c6e6" (UID: "653a3476-1fa6-4afc-a036-df13d5a0c6e6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:33:37 crc kubenswrapper[4843]: I0318 12:33:37.922738 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "653a3476-1fa6-4afc-a036-df13d5a0c6e6" (UID: "653a3476-1fa6-4afc-a036-df13d5a0c6e6"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:33:37 crc kubenswrapper[4843]: I0318 12:33:37.925700 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "653a3476-1fa6-4afc-a036-df13d5a0c6e6" (UID: "653a3476-1fa6-4afc-a036-df13d5a0c6e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:33:37 crc kubenswrapper[4843]: I0318 12:33:37.930834 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:37 crc kubenswrapper[4843]: I0318 12:33:37.930906 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-config\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:37 crc kubenswrapper[4843]: I0318 12:33:37.930923 4843 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:37 crc kubenswrapper[4843]: I0318 12:33:37.930935 4843 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/653a3476-1fa6-4afc-a036-df13d5a0c6e6-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:37 crc kubenswrapper[4843]: I0318 12:33:37.930946 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sts2\" (UniqueName: \"kubernetes.io/projected/653a3476-1fa6-4afc-a036-df13d5a0c6e6-kube-api-access-4sts2\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:37 crc kubenswrapper[4843]: I0318 12:33:37.994835 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d9ce903-7368-4024-bb16-5563248684cc","Type":"ContainerStarted","Data":"dc883ec07f48d97ceb84abab976db3b8de85834b3a7179bedbbbc1a288bfde1e"}
Mar 18 12:33:38 crc kubenswrapper[4843]: I0318 12:33:38.000007 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0845b03-51e4-4a41-9b69-895e588930be","Type":"ContainerStarted","Data":"5c9294493eb085da2a3a8bf85f373d53e8e4ad5b16db77bc10c4614adf9b6c9e"}
Mar 18 12:33:38 crc kubenswrapper[4843]: I0318 12:33:38.000055 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0845b03-51e4-4a41-9b69-895e588930be","Type":"ContainerStarted","Data":"08b9d1dcbe7cd65c54fa5fca716f03163f561b6e79785865b856046db8bff749"}
Mar 18 12:33:38 crc kubenswrapper[4843]: I0318 12:33:38.002443 4843 generic.go:334] "Generic (PLEG): container finished" podID="653a3476-1fa6-4afc-a036-df13d5a0c6e6" containerID="dce2758a1c7d1c49b477ae393897a09e8d1a92c6cef613b6518c2f44ceef130b" exitCode=0
Mar 18 12:33:38 crc kubenswrapper[4843]: I0318 12:33:38.002535 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-55f88f498d-gppxz"
Mar 18 12:33:38 crc kubenswrapper[4843]: I0318 12:33:38.002557 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55f88f498d-gppxz" event={"ID":"653a3476-1fa6-4afc-a036-df13d5a0c6e6","Type":"ContainerDied","Data":"dce2758a1c7d1c49b477ae393897a09e8d1a92c6cef613b6518c2f44ceef130b"}
Mar 18 12:33:38 crc kubenswrapper[4843]: I0318 12:33:38.003039 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55f88f498d-gppxz" event={"ID":"653a3476-1fa6-4afc-a036-df13d5a0c6e6","Type":"ContainerDied","Data":"7029e4a052a4a9f682b33d22bad1caba1c43db169686b07d67daca065fe36456"}
Mar 18 12:33:38 crc kubenswrapper[4843]: I0318 12:33:38.003066 4843 scope.go:117] "RemoveContainer" containerID="a671e7800a3ff26c9e5e699687d50c557282c4272376ce2c7e1b0f481a7ebb3b"
Mar 18 12:33:38 crc kubenswrapper[4843]: I0318 12:33:38.040078 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.04005491 podStartE2EDuration="4.04005491s" podCreationTimestamp="2026-03-18 12:33:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:38.019924408 +0000 UTC m=+1451.735749932" watchObservedRunningTime="2026-03-18 12:33:38.04005491 +0000 UTC m=+1451.755880434"
Mar 18 12:33:38 crc kubenswrapper[4843]: I0318 12:33:38.059870 4843 scope.go:117] "RemoveContainer" containerID="dce2758a1c7d1c49b477ae393897a09e8d1a92c6cef613b6518c2f44ceef130b"
Mar 18 12:33:38 crc kubenswrapper[4843]: I0318 12:33:38.065311 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-55f88f498d-gppxz"]
Mar 18 12:33:38 crc kubenswrapper[4843]: I0318 12:33:38.083364 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-55f88f498d-gppxz"]
Mar 18 12:33:38 crc kubenswrapper[4843]: I0318 12:33:38.100264 4843 scope.go:117] "RemoveContainer" containerID="a671e7800a3ff26c9e5e699687d50c557282c4272376ce2c7e1b0f481a7ebb3b"
Mar 18 12:33:38 crc kubenswrapper[4843]: E0318 12:33:38.100824 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a671e7800a3ff26c9e5e699687d50c557282c4272376ce2c7e1b0f481a7ebb3b\": container with ID starting with a671e7800a3ff26c9e5e699687d50c557282c4272376ce2c7e1b0f481a7ebb3b not found: ID does not exist" containerID="a671e7800a3ff26c9e5e699687d50c557282c4272376ce2c7e1b0f481a7ebb3b"
Mar 18 12:33:38 crc kubenswrapper[4843]: I0318 12:33:38.101119 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a671e7800a3ff26c9e5e699687d50c557282c4272376ce2c7e1b0f481a7ebb3b"} err="failed to get container status \"a671e7800a3ff26c9e5e699687d50c557282c4272376ce2c7e1b0f481a7ebb3b\": rpc error: code = NotFound desc = could not find container \"a671e7800a3ff26c9e5e699687d50c557282c4272376ce2c7e1b0f481a7ebb3b\": container with ID starting with a671e7800a3ff26c9e5e699687d50c557282c4272376ce2c7e1b0f481a7ebb3b not found: ID does not exist"
Mar 18 12:33:38 crc kubenswrapper[4843]: I0318 12:33:38.101144 4843 scope.go:117] "RemoveContainer" containerID="dce2758a1c7d1c49b477ae393897a09e8d1a92c6cef613b6518c2f44ceef130b"
Mar 18 12:33:38 crc kubenswrapper[4843]: E0318 12:33:38.105798 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dce2758a1c7d1c49b477ae393897a09e8d1a92c6cef613b6518c2f44ceef130b\": container with ID starting with dce2758a1c7d1c49b477ae393897a09e8d1a92c6cef613b6518c2f44ceef130b not found: ID does not exist" containerID="dce2758a1c7d1c49b477ae393897a09e8d1a92c6cef613b6518c2f44ceef130b"
Mar 18 12:33:38 crc kubenswrapper[4843]: I0318 12:33:38.105821 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce2758a1c7d1c49b477ae393897a09e8d1a92c6cef613b6518c2f44ceef130b"} err="failed to get container status \"dce2758a1c7d1c49b477ae393897a09e8d1a92c6cef613b6518c2f44ceef130b\": rpc error: code = NotFound desc = could not find container \"dce2758a1c7d1c49b477ae393897a09e8d1a92c6cef613b6518c2f44ceef130b\": container with ID starting with dce2758a1c7d1c49b477ae393897a09e8d1a92c6cef613b6518c2f44ceef130b not found: ID does not exist"
Mar 18 12:33:38 crc kubenswrapper[4843]: I0318 12:33:38.997138 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="653a3476-1fa6-4afc-a036-df13d5a0c6e6" path="/var/lib/kubelet/pods/653a3476-1fa6-4afc-a036-df13d5a0c6e6/volumes"
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.048834 4843 generic.go:334] "Generic (PLEG): container finished" podID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerID="0af6944646ea5a3a8abda87fd4574176f10b58646950af77946507c7144622bd" exitCode=137
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.048911 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8648504-98a7-406e-9838-c8dc2d64ffe9","Type":"ContainerDied","Data":"0af6944646ea5a3a8abda87fd4574176f10b58646950af77946507c7144622bd"}
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.052025 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d9ce903-7368-4024-bb16-5563248684cc","Type":"ContainerStarted","Data":"1ce1598820089ecc7786557c52ac0d57bcc0532f2b63981a5e7eef0053099353"}
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.080058 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.080006246 podStartE2EDuration="4.080006246s" podCreationTimestamp="2026-03-18 12:33:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:39.076061914 +0000 UTC m=+1452.791887438" watchObservedRunningTime="2026-03-18 12:33:39.080006246 +0000 UTC m=+1452.795831770"
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.326938 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.368488 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t7lx\" (UniqueName: \"kubernetes.io/projected/e8648504-98a7-406e-9838-c8dc2d64ffe9-kube-api-access-4t7lx\") pod \"e8648504-98a7-406e-9838-c8dc2d64ffe9\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") "
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.368544 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-combined-ca-bundle\") pod \"e8648504-98a7-406e-9838-c8dc2d64ffe9\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") "
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.368595 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8648504-98a7-406e-9838-c8dc2d64ffe9-log-httpd\") pod \"e8648504-98a7-406e-9838-c8dc2d64ffe9\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") "
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.368637 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-config-data\") pod \"e8648504-98a7-406e-9838-c8dc2d64ffe9\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") "
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.368795 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8648504-98a7-406e-9838-c8dc2d64ffe9-run-httpd\") pod \"e8648504-98a7-406e-9838-c8dc2d64ffe9\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") "
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.368857 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-scripts\") pod \"e8648504-98a7-406e-9838-c8dc2d64ffe9\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") "
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.368879 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-sg-core-conf-yaml\") pod \"e8648504-98a7-406e-9838-c8dc2d64ffe9\" (UID: \"e8648504-98a7-406e-9838-c8dc2d64ffe9\") "
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.369319 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8648504-98a7-406e-9838-c8dc2d64ffe9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e8648504-98a7-406e-9838-c8dc2d64ffe9" (UID: "e8648504-98a7-406e-9838-c8dc2d64ffe9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.369510 4843 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8648504-98a7-406e-9838-c8dc2d64ffe9-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.369891 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8648504-98a7-406e-9838-c8dc2d64ffe9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e8648504-98a7-406e-9838-c8dc2d64ffe9" (UID: "e8648504-98a7-406e-9838-c8dc2d64ffe9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.381780 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8648504-98a7-406e-9838-c8dc2d64ffe9-kube-api-access-4t7lx" (OuterVolumeSpecName: "kube-api-access-4t7lx") pod "e8648504-98a7-406e-9838-c8dc2d64ffe9" (UID: "e8648504-98a7-406e-9838-c8dc2d64ffe9"). InnerVolumeSpecName "kube-api-access-4t7lx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.381795 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-scripts" (OuterVolumeSpecName: "scripts") pod "e8648504-98a7-406e-9838-c8dc2d64ffe9" (UID: "e8648504-98a7-406e-9838-c8dc2d64ffe9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.465169 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e8648504-98a7-406e-9838-c8dc2d64ffe9" (UID: "e8648504-98a7-406e-9838-c8dc2d64ffe9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.471562 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.471587 4843 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.471598 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t7lx\" (UniqueName: \"kubernetes.io/projected/e8648504-98a7-406e-9838-c8dc2d64ffe9-kube-api-access-4t7lx\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.471607 4843 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8648504-98a7-406e-9838-c8dc2d64ffe9-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.505491 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8648504-98a7-406e-9838-c8dc2d64ffe9" (UID: "e8648504-98a7-406e-9838-c8dc2d64ffe9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.510107 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-config-data" (OuterVolumeSpecName: "config-data") pod "e8648504-98a7-406e-9838-c8dc2d64ffe9" (UID: "e8648504-98a7-406e-9838-c8dc2d64ffe9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.573477 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:39 crc kubenswrapper[4843]: I0318 12:33:39.573515 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8648504-98a7-406e-9838-c8dc2d64ffe9-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.063670 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8648504-98a7-406e-9838-c8dc2d64ffe9","Type":"ContainerDied","Data":"38ecf85f7b8822ac071c770370295fdbc0474891f992705a24fb353be8594af7"}
Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.063980 4843 scope.go:117] "RemoveContainer" containerID="0af6944646ea5a3a8abda87fd4574176f10b58646950af77946507c7144622bd"
Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.063736 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.098699 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.100867 4843 scope.go:117] "RemoveContainer" containerID="6a017f03a1efe8a6c123adf81e9c6c3551c21cab076b97b0f0a289aba1ab8853"
Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.120919 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.131480 4843 scope.go:117] "RemoveContainer" containerID="b6c155ad15706ebdbe27889daf0cd88a255bb6094c925dde218ff066e1894018"
Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.133896 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 18 12:33:40 crc kubenswrapper[4843]: E0318 12:33:40.134457 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653a3476-1fa6-4afc-a036-df13d5a0c6e6" containerName="neutron-httpd"
Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.134482 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="653a3476-1fa6-4afc-a036-df13d5a0c6e6" containerName="neutron-httpd"
Mar 18 12:33:40 crc kubenswrapper[4843]: E0318 12:33:40.134492 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerName="ceilometer-notification-agent"
Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.134500 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerName="ceilometer-notification-agent"
Mar 18 12:33:40 crc kubenswrapper[4843]: E0318 12:33:40.134543 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653a3476-1fa6-4afc-a036-df13d5a0c6e6" containerName="neutron-api"
Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.134551 4843 state_mem.go:107] "Deleted
CPUSet assignment" podUID="653a3476-1fa6-4afc-a036-df13d5a0c6e6" containerName="neutron-api" Mar 18 12:33:40 crc kubenswrapper[4843]: E0318 12:33:40.134570 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerName="ceilometer-central-agent" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.134582 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerName="ceilometer-central-agent" Mar 18 12:33:40 crc kubenswrapper[4843]: E0318 12:33:40.134599 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerName="proxy-httpd" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.134607 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerName="proxy-httpd" Mar 18 12:33:40 crc kubenswrapper[4843]: E0318 12:33:40.134618 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerName="sg-core" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.134625 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerName="sg-core" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.134851 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerName="proxy-httpd" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.134868 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="653a3476-1fa6-4afc-a036-df13d5a0c6e6" containerName="neutron-httpd" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.134881 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerName="ceilometer-central-agent" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.134896 4843 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerName="sg-core" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.134913 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8648504-98a7-406e-9838-c8dc2d64ffe9" containerName="ceilometer-notification-agent" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.134928 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="653a3476-1fa6-4afc-a036-df13d5a0c6e6" containerName="neutron-api" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.137127 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.139636 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.141505 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.142722 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.179490 4843 scope.go:117] "RemoveContainer" containerID="7c835082341f7ff229787df7ee9fb56ae8248570b01f14e98bd2a1ebbbfb1329" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.183830 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2de25d38-5ad6-4af1-bc18-ea9974156d72-run-httpd\") pod \"ceilometer-0\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.184122 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-scripts\") pod \"ceilometer-0\" (UID: 
\"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.184228 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.184431 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-config-data\") pod \"ceilometer-0\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.184516 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2de25d38-5ad6-4af1-bc18-ea9974156d72-log-httpd\") pod \"ceilometer-0\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.184598 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djmxq\" (UniqueName: \"kubernetes.io/projected/2de25d38-5ad6-4af1-bc18-ea9974156d72-kube-api-access-djmxq\") pod \"ceilometer-0\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.184662 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: 
I0318 12:33:40.286852 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.286922 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2de25d38-5ad6-4af1-bc18-ea9974156d72-run-httpd\") pod \"ceilometer-0\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.286979 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-scripts\") pod \"ceilometer-0\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.287009 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.287076 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-config-data\") pod \"ceilometer-0\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.287112 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2de25d38-5ad6-4af1-bc18-ea9974156d72-log-httpd\") pod \"ceilometer-0\" (UID: 
\"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.287137 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djmxq\" (UniqueName: \"kubernetes.io/projected/2de25d38-5ad6-4af1-bc18-ea9974156d72-kube-api-access-djmxq\") pod \"ceilometer-0\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.288349 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2de25d38-5ad6-4af1-bc18-ea9974156d72-log-httpd\") pod \"ceilometer-0\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.289243 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2de25d38-5ad6-4af1-bc18-ea9974156d72-run-httpd\") pod \"ceilometer-0\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.295394 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-scripts\") pod \"ceilometer-0\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.297500 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.300397 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-config-data\") pod \"ceilometer-0\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.302356 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.313345 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djmxq\" (UniqueName: \"kubernetes.io/projected/2de25d38-5ad6-4af1-bc18-ea9974156d72-kube-api-access-djmxq\") pod \"ceilometer-0\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.469477 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.924890 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:40 crc kubenswrapper[4843]: W0318 12:33:40.928546 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2de25d38_5ad6_4af1_bc18_ea9974156d72.slice/crio-2d9e5e21025c8e80cdf3fae1f471f0e31a5b768482ddb87c9e7942e612232ba6 WatchSource:0}: Error finding container 2d9e5e21025c8e80cdf3fae1f471f0e31a5b768482ddb87c9e7942e612232ba6: Status 404 returned error can't find the container with id 2d9e5e21025c8e80cdf3fae1f471f0e31a5b768482ddb87c9e7942e612232ba6 Mar 18 12:33:40 crc kubenswrapper[4843]: I0318 12:33:40.997984 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8648504-98a7-406e-9838-c8dc2d64ffe9" path="/var/lib/kubelet/pods/e8648504-98a7-406e-9838-c8dc2d64ffe9/volumes" Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.043685 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7dc96fcc9b-lt524" podUID="8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.043831 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.072726 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2de25d38-5ad6-4af1-bc18-ea9974156d72","Type":"ContainerStarted","Data":"2d9e5e21025c8e80cdf3fae1f471f0e31a5b768482ddb87c9e7942e612232ba6"} Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.432801 4843 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-conductor-db-sync-jv79s"] Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.433880 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jv79s" Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.436371 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.436385 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.436541 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gplpb" Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.441930 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jv79s"] Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.513311 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-config-data\") pod \"nova-cell0-conductor-db-sync-jv79s\" (UID: \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\") " pod="openstack/nova-cell0-conductor-db-sync-jv79s" Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.513363 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jv79s\" (UID: \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\") " pod="openstack/nova-cell0-conductor-db-sync-jv79s" Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.513383 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-scripts\") pod \"nova-cell0-conductor-db-sync-jv79s\" (UID: \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\") " pod="openstack/nova-cell0-conductor-db-sync-jv79s" Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.513867 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l585m\" (UniqueName: \"kubernetes.io/projected/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-kube-api-access-l585m\") pod \"nova-cell0-conductor-db-sync-jv79s\" (UID: \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\") " pod="openstack/nova-cell0-conductor-db-sync-jv79s" Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.618757 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l585m\" (UniqueName: \"kubernetes.io/projected/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-kube-api-access-l585m\") pod \"nova-cell0-conductor-db-sync-jv79s\" (UID: \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\") " pod="openstack/nova-cell0-conductor-db-sync-jv79s" Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.618859 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-config-data\") pod \"nova-cell0-conductor-db-sync-jv79s\" (UID: \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\") " pod="openstack/nova-cell0-conductor-db-sync-jv79s" Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.618881 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jv79s\" (UID: \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\") " pod="openstack/nova-cell0-conductor-db-sync-jv79s" Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.618914 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-scripts\") pod \"nova-cell0-conductor-db-sync-jv79s\" (UID: \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\") " pod="openstack/nova-cell0-conductor-db-sync-jv79s" Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.623137 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-scripts\") pod \"nova-cell0-conductor-db-sync-jv79s\" (UID: \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\") " pod="openstack/nova-cell0-conductor-db-sync-jv79s" Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.627405 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jv79s\" (UID: \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\") " pod="openstack/nova-cell0-conductor-db-sync-jv79s" Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.635295 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-config-data\") pod \"nova-cell0-conductor-db-sync-jv79s\" (UID: \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\") " pod="openstack/nova-cell0-conductor-db-sync-jv79s" Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.666638 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l585m\" (UniqueName: \"kubernetes.io/projected/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-kube-api-access-l585m\") pod \"nova-cell0-conductor-db-sync-jv79s\" (UID: \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\") " pod="openstack/nova-cell0-conductor-db-sync-jv79s" Mar 18 12:33:41 crc kubenswrapper[4843]: I0318 12:33:41.766354 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jv79s" Mar 18 12:33:42 crc kubenswrapper[4843]: I0318 12:33:42.081744 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2de25d38-5ad6-4af1-bc18-ea9974156d72","Type":"ContainerStarted","Data":"11ced49d1c8ea729809a0ba672c2549bc85f8f5ba14622efa49efbe5995470f8"} Mar 18 12:33:42 crc kubenswrapper[4843]: W0318 12:33:42.231829 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2e5376f_b6a5_4462_925f_a0daa0d3aa5b.slice/crio-68bd87b620d46adf971adfcff15aed739bd32de12806546ef77d5896a131de1b WatchSource:0}: Error finding container 68bd87b620d46adf971adfcff15aed739bd32de12806546ef77d5896a131de1b: Status 404 returned error can't find the container with id 68bd87b620d46adf971adfcff15aed739bd32de12806546ef77d5896a131de1b Mar 18 12:33:42 crc kubenswrapper[4843]: I0318 12:33:42.233845 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jv79s"] Mar 18 12:33:42 crc kubenswrapper[4843]: I0318 12:33:42.280403 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 12:33:43 crc kubenswrapper[4843]: I0318 12:33:43.111509 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2de25d38-5ad6-4af1-bc18-ea9974156d72","Type":"ContainerStarted","Data":"3d2055534d53e64bc93ec205abefc93c8b4c5db7406291b4fe5075fd4183c17f"} Mar 18 12:33:43 crc kubenswrapper[4843]: I0318 12:33:43.114151 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jv79s" event={"ID":"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b","Type":"ContainerStarted","Data":"68bd87b620d46adf971adfcff15aed739bd32de12806546ef77d5896a131de1b"} Mar 18 12:33:44 crc kubenswrapper[4843]: I0318 12:33:44.126624 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"2de25d38-5ad6-4af1-bc18-ea9974156d72","Type":"ContainerStarted","Data":"ec4a1644b314d51d3b4129e2436f0c3fbedf4e19e20cc8b993b44cf537ecb707"} Mar 18 12:33:45 crc kubenswrapper[4843]: I0318 12:33:45.549567 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 12:33:45 crc kubenswrapper[4843]: I0318 12:33:45.550964 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 12:33:45 crc kubenswrapper[4843]: I0318 12:33:45.603803 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 12:33:45 crc kubenswrapper[4843]: I0318 12:33:45.603869 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 12:33:45 crc kubenswrapper[4843]: I0318 12:33:45.662048 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:46 crc kubenswrapper[4843]: I0318 12:33:46.157693 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2de25d38-5ad6-4af1-bc18-ea9974156d72","Type":"ContainerStarted","Data":"7afb85411ad7b241b5a6db23adc0b716fdec211dfa4eab4ff60b7c5fc2167676"} Mar 18 12:33:46 crc kubenswrapper[4843]: I0318 12:33:46.158279 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 12:33:46 crc kubenswrapper[4843]: I0318 12:33:46.158330 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 12:33:46 crc kubenswrapper[4843]: I0318 12:33:46.187081 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.282308526 podStartE2EDuration="6.1870561s" podCreationTimestamp="2026-03-18 12:33:40 +0000 
UTC" firstStartedPulling="2026-03-18 12:33:40.931329742 +0000 UTC m=+1454.647155276" lastFinishedPulling="2026-03-18 12:33:44.836077336 +0000 UTC m=+1458.551902850" observedRunningTime="2026-03-18 12:33:46.17754531 +0000 UTC m=+1459.893370834" watchObservedRunningTime="2026-03-18 12:33:46.1870561 +0000 UTC m=+1459.902881624" Mar 18 12:33:46 crc kubenswrapper[4843]: I0318 12:33:46.343921 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:46 crc kubenswrapper[4843]: I0318 12:33:46.343983 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:46 crc kubenswrapper[4843]: I0318 12:33:46.399878 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:46 crc kubenswrapper[4843]: I0318 12:33:46.410902 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:47 crc kubenswrapper[4843]: I0318 12:33:47.166069 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 12:33:47 crc kubenswrapper[4843]: I0318 12:33:47.166500 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerName="ceilometer-central-agent" containerID="cri-o://11ced49d1c8ea729809a0ba672c2549bc85f8f5ba14622efa49efbe5995470f8" gracePeriod=30 Mar 18 12:33:47 crc kubenswrapper[4843]: I0318 12:33:47.166538 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:47 crc kubenswrapper[4843]: I0318 12:33:47.166585 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:47 crc kubenswrapper[4843]: I0318 12:33:47.166611 
4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerName="ceilometer-notification-agent" containerID="cri-o://3d2055534d53e64bc93ec205abefc93c8b4c5db7406291b4fe5075fd4183c17f" gracePeriod=30 Mar 18 12:33:47 crc kubenswrapper[4843]: I0318 12:33:47.166599 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerName="sg-core" containerID="cri-o://ec4a1644b314d51d3b4129e2436f0c3fbedf4e19e20cc8b993b44cf537ecb707" gracePeriod=30 Mar 18 12:33:47 crc kubenswrapper[4843]: I0318 12:33:47.166570 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerName="proxy-httpd" containerID="cri-o://7afb85411ad7b241b5a6db23adc0b716fdec211dfa4eab4ff60b7c5fc2167676" gracePeriod=30 Mar 18 12:33:48 crc kubenswrapper[4843]: I0318 12:33:48.152394 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 12:33:48 crc kubenswrapper[4843]: I0318 12:33:48.155810 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 12:33:48 crc kubenswrapper[4843]: I0318 12:33:48.198898 4843 generic.go:334] "Generic (PLEG): container finished" podID="8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" containerID="3556ca049e4c4ff3eb611a388edd9d3f28839b1285421ed541909b3b6403983e" exitCode=137 Mar 18 12:33:48 crc kubenswrapper[4843]: I0318 12:33:48.198943 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dc96fcc9b-lt524" event={"ID":"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434","Type":"ContainerDied","Data":"3556ca049e4c4ff3eb611a388edd9d3f28839b1285421ed541909b3b6403983e"} Mar 18 12:33:48 crc kubenswrapper[4843]: I0318 12:33:48.221942 4843 generic.go:334] 
"Generic (PLEG): container finished" podID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerID="7afb85411ad7b241b5a6db23adc0b716fdec211dfa4eab4ff60b7c5fc2167676" exitCode=0 Mar 18 12:33:48 crc kubenswrapper[4843]: I0318 12:33:48.221969 4843 generic.go:334] "Generic (PLEG): container finished" podID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerID="ec4a1644b314d51d3b4129e2436f0c3fbedf4e19e20cc8b993b44cf537ecb707" exitCode=2 Mar 18 12:33:48 crc kubenswrapper[4843]: I0318 12:33:48.221996 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2de25d38-5ad6-4af1-bc18-ea9974156d72","Type":"ContainerDied","Data":"7afb85411ad7b241b5a6db23adc0b716fdec211dfa4eab4ff60b7c5fc2167676"} Mar 18 12:33:48 crc kubenswrapper[4843]: I0318 12:33:48.222026 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2de25d38-5ad6-4af1-bc18-ea9974156d72","Type":"ContainerDied","Data":"ec4a1644b314d51d3b4129e2436f0c3fbedf4e19e20cc8b993b44cf537ecb707"} Mar 18 12:33:48 crc kubenswrapper[4843]: I0318 12:33:48.222037 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2de25d38-5ad6-4af1-bc18-ea9974156d72","Type":"ContainerDied","Data":"3d2055534d53e64bc93ec205abefc93c8b4c5db7406291b4fe5075fd4183c17f"} Mar 18 12:33:48 crc kubenswrapper[4843]: I0318 12:33:48.222055 4843 generic.go:334] "Generic (PLEG): container finished" podID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerID="3d2055534d53e64bc93ec205abefc93c8b4c5db7406291b4fe5075fd4183c17f" exitCode=0 Mar 18 12:33:49 crc kubenswrapper[4843]: I0318 12:33:49.552773 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:49 crc kubenswrapper[4843]: I0318 12:33:49.553181 4843 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:33:49 crc kubenswrapper[4843]: I0318 12:33:49.614490 4843 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:50 crc kubenswrapper[4843]: I0318 12:33:50.034443 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:33:50 crc kubenswrapper[4843]: I0318 12:33:50.034494 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:33:51 crc kubenswrapper[4843]: I0318 12:33:51.043577 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7dc96fcc9b-lt524" podUID="8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.626318 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.735644 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-horizon-secret-key\") pod \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.735738 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-horizon-tls-certs\") pod \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.735784 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-logs\") pod \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.735868 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-config-data\") pod \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.735961 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-scripts\") pod \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.735999 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mckf7\" (UniqueName: 
\"kubernetes.io/projected/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-kube-api-access-mckf7\") pod \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.736070 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-combined-ca-bundle\") pod \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\" (UID: \"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434\") " Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.736418 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-logs" (OuterVolumeSpecName: "logs") pod "8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" (UID: "8fc8c3ad-8b5f-45f0-a856-8b97ada7c434"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.740791 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-kube-api-access-mckf7" (OuterVolumeSpecName: "kube-api-access-mckf7") pod "8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" (UID: "8fc8c3ad-8b5f-45f0-a856-8b97ada7c434"). InnerVolumeSpecName "kube-api-access-mckf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.740885 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" (UID: "8fc8c3ad-8b5f-45f0-a856-8b97ada7c434"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.759339 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-scripts" (OuterVolumeSpecName: "scripts") pod "8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" (UID: "8fc8c3ad-8b5f-45f0-a856-8b97ada7c434"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.760586 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-config-data" (OuterVolumeSpecName: "config-data") pod "8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" (UID: "8fc8c3ad-8b5f-45f0-a856-8b97ada7c434"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.763176 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" (UID: "8fc8c3ad-8b5f-45f0-a856-8b97ada7c434"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.790076 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" (UID: "8fc8c3ad-8b5f-45f0-a856-8b97ada7c434"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.837776 4843 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.838050 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.838184 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.838257 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.838328 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mckf7\" (UniqueName: \"kubernetes.io/projected/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-kube-api-access-mckf7\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.838395 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:52 crc kubenswrapper[4843]: I0318 12:33:52.838453 4843 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.275431 4843 generic.go:334] 
"Generic (PLEG): container finished" podID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerID="11ced49d1c8ea729809a0ba672c2549bc85f8f5ba14622efa49efbe5995470f8" exitCode=0 Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.275515 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2de25d38-5ad6-4af1-bc18-ea9974156d72","Type":"ContainerDied","Data":"11ced49d1c8ea729809a0ba672c2549bc85f8f5ba14622efa49efbe5995470f8"} Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.277475 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jv79s" event={"ID":"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b","Type":"ContainerStarted","Data":"a298a077e860eb61e48e3f862d74c7029a9d42aac32c62c5faff068cf825f077"} Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.280853 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dc96fcc9b-lt524" event={"ID":"8fc8c3ad-8b5f-45f0-a856-8b97ada7c434","Type":"ContainerDied","Data":"c6e05db2a8d042c71ae1d32e7d0214f89e5655e3fb6937991c21daa07e7c0fdc"} Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.280930 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7dc96fcc9b-lt524" Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.281039 4843 scope.go:117] "RemoveContainer" containerID="cc85cd1c0e532d46fd65865f18713d5bac3f4038811619aedb9a024c01f92940" Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.296987 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-jv79s" podStartSLOduration=2.059015881 podStartE2EDuration="12.296957427s" podCreationTimestamp="2026-03-18 12:33:41 +0000 UTC" firstStartedPulling="2026-03-18 12:33:42.234029254 +0000 UTC m=+1455.949854778" lastFinishedPulling="2026-03-18 12:33:52.4719708 +0000 UTC m=+1466.187796324" observedRunningTime="2026-03-18 12:33:53.295987959 +0000 UTC m=+1467.011813493" watchObservedRunningTime="2026-03-18 12:33:53.296957427 +0000 UTC m=+1467.012782951" Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.328191 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7dc96fcc9b-lt524"] Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.337698 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7dc96fcc9b-lt524"] Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.487934 4843 scope.go:117] "RemoveContainer" containerID="3556ca049e4c4ff3eb611a388edd9d3f28839b1285421ed541909b3b6403983e" Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.610822 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.753865 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2de25d38-5ad6-4af1-bc18-ea9974156d72-log-httpd\") pod \"2de25d38-5ad6-4af1-bc18-ea9974156d72\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.753908 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-scripts\") pod \"2de25d38-5ad6-4af1-bc18-ea9974156d72\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.753957 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-combined-ca-bundle\") pod \"2de25d38-5ad6-4af1-bc18-ea9974156d72\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.754098 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2de25d38-5ad6-4af1-bc18-ea9974156d72-run-httpd\") pod \"2de25d38-5ad6-4af1-bc18-ea9974156d72\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.754141 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-sg-core-conf-yaml\") pod \"2de25d38-5ad6-4af1-bc18-ea9974156d72\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.754155 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-config-data\") pod \"2de25d38-5ad6-4af1-bc18-ea9974156d72\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.754186 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djmxq\" (UniqueName: \"kubernetes.io/projected/2de25d38-5ad6-4af1-bc18-ea9974156d72-kube-api-access-djmxq\") pod \"2de25d38-5ad6-4af1-bc18-ea9974156d72\" (UID: \"2de25d38-5ad6-4af1-bc18-ea9974156d72\") " Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.755369 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2de25d38-5ad6-4af1-bc18-ea9974156d72-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2de25d38-5ad6-4af1-bc18-ea9974156d72" (UID: "2de25d38-5ad6-4af1-bc18-ea9974156d72"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.755763 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2de25d38-5ad6-4af1-bc18-ea9974156d72-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2de25d38-5ad6-4af1-bc18-ea9974156d72" (UID: "2de25d38-5ad6-4af1-bc18-ea9974156d72"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.760285 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-scripts" (OuterVolumeSpecName: "scripts") pod "2de25d38-5ad6-4af1-bc18-ea9974156d72" (UID: "2de25d38-5ad6-4af1-bc18-ea9974156d72"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.761063 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de25d38-5ad6-4af1-bc18-ea9974156d72-kube-api-access-djmxq" (OuterVolumeSpecName: "kube-api-access-djmxq") pod "2de25d38-5ad6-4af1-bc18-ea9974156d72" (UID: "2de25d38-5ad6-4af1-bc18-ea9974156d72"). InnerVolumeSpecName "kube-api-access-djmxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.817841 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2de25d38-5ad6-4af1-bc18-ea9974156d72" (UID: "2de25d38-5ad6-4af1-bc18-ea9974156d72"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.830579 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2de25d38-5ad6-4af1-bc18-ea9974156d72" (UID: "2de25d38-5ad6-4af1-bc18-ea9974156d72"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.855895 4843 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2de25d38-5ad6-4af1-bc18-ea9974156d72-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.855931 4843 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.855945 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djmxq\" (UniqueName: \"kubernetes.io/projected/2de25d38-5ad6-4af1-bc18-ea9974156d72-kube-api-access-djmxq\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.855954 4843 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2de25d38-5ad6-4af1-bc18-ea9974156d72-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.855972 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.855980 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.922847 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-config-data" (OuterVolumeSpecName: "config-data") pod "2de25d38-5ad6-4af1-bc18-ea9974156d72" (UID: "2de25d38-5ad6-4af1-bc18-ea9974156d72"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:53 crc kubenswrapper[4843]: I0318 12:33:53.957930 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de25d38-5ad6-4af1-bc18-ea9974156d72-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.292999 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.293011 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2de25d38-5ad6-4af1-bc18-ea9974156d72","Type":"ContainerDied","Data":"2d9e5e21025c8e80cdf3fae1f471f0e31a5b768482ddb87c9e7942e612232ba6"} Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.293423 4843 scope.go:117] "RemoveContainer" containerID="7afb85411ad7b241b5a6db23adc0b716fdec211dfa4eab4ff60b7c5fc2167676" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.332341 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.340701 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.353904 4843 scope.go:117] "RemoveContainer" containerID="ec4a1644b314d51d3b4129e2436f0c3fbedf4e19e20cc8b993b44cf537ecb707" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.372790 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:54 crc kubenswrapper[4843]: E0318 12:33:54.373199 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" containerName="horizon" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.373213 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" containerName="horizon" 
Mar 18 12:33:54 crc kubenswrapper[4843]: E0318 12:33:54.373225 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerName="ceilometer-central-agent" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.373232 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerName="ceilometer-central-agent" Mar 18 12:33:54 crc kubenswrapper[4843]: E0318 12:33:54.373245 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerName="proxy-httpd" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.373251 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerName="proxy-httpd" Mar 18 12:33:54 crc kubenswrapper[4843]: E0318 12:33:54.373264 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" containerName="horizon-log" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.373269 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" containerName="horizon-log" Mar 18 12:33:54 crc kubenswrapper[4843]: E0318 12:33:54.373282 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerName="sg-core" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.373288 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerName="sg-core" Mar 18 12:33:54 crc kubenswrapper[4843]: E0318 12:33:54.373303 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerName="ceilometer-notification-agent" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.373310 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerName="ceilometer-notification-agent" 
Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.373476 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" containerName="horizon-log" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.373484 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerName="ceilometer-central-agent" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.373499 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerName="ceilometer-notification-agent" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.373509 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" containerName="horizon" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.373518 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerName="sg-core" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.373527 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de25d38-5ad6-4af1-bc18-ea9974156d72" containerName="proxy-httpd" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.375301 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.380498 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.380759 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.402853 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.403935 4843 scope.go:117] "RemoveContainer" containerID="3d2055534d53e64bc93ec205abefc93c8b4c5db7406291b4fe5075fd4183c17f" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.426059 4843 scope.go:117] "RemoveContainer" containerID="11ced49d1c8ea729809a0ba672c2549bc85f8f5ba14622efa49efbe5995470f8" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.471839 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-scripts\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.471952 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.471985 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0870ff56-ede8-4001-a8c5-5762c3087bb5-log-httpd\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 
18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.472172 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.472236 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-config-data\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.472267 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0870ff56-ede8-4001-a8c5-5762c3087bb5-run-httpd\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.472306 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5ktb\" (UniqueName: \"kubernetes.io/projected/0870ff56-ede8-4001-a8c5-5762c3087bb5-kube-api-access-f5ktb\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.573886 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.573942 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0870ff56-ede8-4001-a8c5-5762c3087bb5-run-httpd\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.573962 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-config-data\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.573989 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5ktb\" (UniqueName: \"kubernetes.io/projected/0870ff56-ede8-4001-a8c5-5762c3087bb5-kube-api-access-f5ktb\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.574009 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-scripts\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.574058 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.574077 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0870ff56-ede8-4001-a8c5-5762c3087bb5-log-httpd\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc 
kubenswrapper[4843]: I0318 12:33:54.574564 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0870ff56-ede8-4001-a8c5-5762c3087bb5-log-httpd\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.575207 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0870ff56-ede8-4001-a8c5-5762c3087bb5-run-httpd\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.578304 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.578487 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-scripts\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.579036 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.587565 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-config-data\") pod \"ceilometer-0\" (UID: 
\"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.593166 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5ktb\" (UniqueName: \"kubernetes.io/projected/0870ff56-ede8-4001-a8c5-5762c3087bb5-kube-api-access-f5ktb\") pod \"ceilometer-0\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.704463 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.996845 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de25d38-5ad6-4af1-bc18-ea9974156d72" path="/var/lib/kubelet/pods/2de25d38-5ad6-4af1-bc18-ea9974156d72/volumes" Mar 18 12:33:54 crc kubenswrapper[4843]: I0318 12:33:54.997931 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fc8c3ad-8b5f-45f0-a856-8b97ada7c434" path="/var/lib/kubelet/pods/8fc8c3ad-8b5f-45f0-a856-8b97ada7c434/volumes" Mar 18 12:33:55 crc kubenswrapper[4843]: I0318 12:33:55.036896 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:55 crc kubenswrapper[4843]: W0318 12:33:55.037086 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0870ff56_ede8_4001_a8c5_5762c3087bb5.slice/crio-b54cfcc4392d8be535873c4a6bf795ec09a9472a8dd87b21b433659e48c35a38 WatchSource:0}: Error finding container b54cfcc4392d8be535873c4a6bf795ec09a9472a8dd87b21b433659e48c35a38: Status 404 returned error can't find the container with id b54cfcc4392d8be535873c4a6bf795ec09a9472a8dd87b21b433659e48c35a38 Mar 18 12:33:55 crc kubenswrapper[4843]: I0318 12:33:55.310844 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0870ff56-ede8-4001-a8c5-5762c3087bb5","Type":"ContainerStarted","Data":"b54cfcc4392d8be535873c4a6bf795ec09a9472a8dd87b21b433659e48c35a38"} Mar 18 12:33:57 crc kubenswrapper[4843]: I0318 12:33:57.337244 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0870ff56-ede8-4001-a8c5-5762c3087bb5","Type":"ContainerStarted","Data":"cdb79bda8def5db80af2192407f731840060aac30a6a2bd00ea2437fd4a2a517"} Mar 18 12:33:58 crc kubenswrapper[4843]: I0318 12:33:58.350767 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0870ff56-ede8-4001-a8c5-5762c3087bb5","Type":"ContainerStarted","Data":"2fba52ad94e4122e7be6cf582cd61aeb95c2e76c026795e9110bb5fbae40c87e"} Mar 18 12:33:59 crc kubenswrapper[4843]: I0318 12:33:59.363962 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0870ff56-ede8-4001-a8c5-5762c3087bb5","Type":"ContainerStarted","Data":"632100624b01bdc422015a977032f1557c7e8528f47def71502a96f31366437c"} Mar 18 12:34:00 crc kubenswrapper[4843]: I0318 12:34:00.160244 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563954-gnjzm"] Mar 18 12:34:00 crc kubenswrapper[4843]: I0318 12:34:00.162640 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563954-gnjzm" Mar 18 12:34:00 crc kubenswrapper[4843]: I0318 12:34:00.166407 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563954-gnjzm"] Mar 18 12:34:00 crc kubenswrapper[4843]: I0318 12:34:00.201000 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:34:00 crc kubenswrapper[4843]: I0318 12:34:00.201941 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 12:34:00 crc kubenswrapper[4843]: I0318 12:34:00.202199 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:34:00 crc kubenswrapper[4843]: I0318 12:34:00.301277 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cwdt\" (UniqueName: \"kubernetes.io/projected/df17c133-1ad9-4e20-884e-99a07055c280-kube-api-access-5cwdt\") pod \"auto-csr-approver-29563954-gnjzm\" (UID: \"df17c133-1ad9-4e20-884e-99a07055c280\") " pod="openshift-infra/auto-csr-approver-29563954-gnjzm" Mar 18 12:34:00 crc kubenswrapper[4843]: I0318 12:34:00.403476 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cwdt\" (UniqueName: \"kubernetes.io/projected/df17c133-1ad9-4e20-884e-99a07055c280-kube-api-access-5cwdt\") pod \"auto-csr-approver-29563954-gnjzm\" (UID: \"df17c133-1ad9-4e20-884e-99a07055c280\") " pod="openshift-infra/auto-csr-approver-29563954-gnjzm" Mar 18 12:34:00 crc kubenswrapper[4843]: I0318 12:34:00.423678 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cwdt\" (UniqueName: \"kubernetes.io/projected/df17c133-1ad9-4e20-884e-99a07055c280-kube-api-access-5cwdt\") pod \"auto-csr-approver-29563954-gnjzm\" (UID: \"df17c133-1ad9-4e20-884e-99a07055c280\") " 
pod="openshift-infra/auto-csr-approver-29563954-gnjzm" Mar 18 12:34:00 crc kubenswrapper[4843]: I0318 12:34:00.515432 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563954-gnjzm" Mar 18 12:34:00 crc kubenswrapper[4843]: I0318 12:34:00.994177 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563954-gnjzm"] Mar 18 12:34:01 crc kubenswrapper[4843]: W0318 12:34:01.003295 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf17c133_1ad9_4e20_884e_99a07055c280.slice/crio-2eb83fa4cb4e416f2125a2aaab4d1f5e6e16740bb70e4cb019831358dbcb2305 WatchSource:0}: Error finding container 2eb83fa4cb4e416f2125a2aaab4d1f5e6e16740bb70e4cb019831358dbcb2305: Status 404 returned error can't find the container with id 2eb83fa4cb4e416f2125a2aaab4d1f5e6e16740bb70e4cb019831358dbcb2305 Mar 18 12:34:01 crc kubenswrapper[4843]: I0318 12:34:01.387535 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563954-gnjzm" event={"ID":"df17c133-1ad9-4e20-884e-99a07055c280","Type":"ContainerStarted","Data":"2eb83fa4cb4e416f2125a2aaab4d1f5e6e16740bb70e4cb019831358dbcb2305"} Mar 18 12:34:02 crc kubenswrapper[4843]: I0318 12:34:02.410507 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563954-gnjzm" event={"ID":"df17c133-1ad9-4e20-884e-99a07055c280","Type":"ContainerStarted","Data":"0d33d648e5242a45903289afc6259d2db51ccddffcb0d0c95cac93496b0218a4"} Mar 18 12:34:02 crc kubenswrapper[4843]: I0318 12:34:02.420345 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0870ff56-ede8-4001-a8c5-5762c3087bb5","Type":"ContainerStarted","Data":"5f433f776c8222bbacc1f74bef839843e52579e6e709abeb778a6f420e4f8be6"} Mar 18 12:34:02 crc kubenswrapper[4843]: I0318 12:34:02.421045 4843 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 12:34:02 crc kubenswrapper[4843]: I0318 12:34:02.442148 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563954-gnjzm" podStartSLOduration=1.374589106 podStartE2EDuration="2.442117286s" podCreationTimestamp="2026-03-18 12:34:00 +0000 UTC" firstStartedPulling="2026-03-18 12:34:01.007794891 +0000 UTC m=+1474.723620415" lastFinishedPulling="2026-03-18 12:34:02.075323071 +0000 UTC m=+1475.791148595" observedRunningTime="2026-03-18 12:34:02.431304538 +0000 UTC m=+1476.147130062" watchObservedRunningTime="2026-03-18 12:34:02.442117286 +0000 UTC m=+1476.157942810" Mar 18 12:34:02 crc kubenswrapper[4843]: I0318 12:34:02.458026 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.13199277 podStartE2EDuration="8.458005897s" podCreationTimestamp="2026-03-18 12:33:54 +0000 UTC" firstStartedPulling="2026-03-18 12:33:55.039138831 +0000 UTC m=+1468.754964355" lastFinishedPulling="2026-03-18 12:34:01.365151938 +0000 UTC m=+1475.080977482" observedRunningTime="2026-03-18 12:34:02.455674761 +0000 UTC m=+1476.171500285" watchObservedRunningTime="2026-03-18 12:34:02.458005897 +0000 UTC m=+1476.173831421" Mar 18 12:34:03 crc kubenswrapper[4843]: I0318 12:34:03.432803 4843 generic.go:334] "Generic (PLEG): container finished" podID="c2e5376f-b6a5-4462-925f-a0daa0d3aa5b" containerID="a298a077e860eb61e48e3f862d74c7029a9d42aac32c62c5faff068cf825f077" exitCode=0 Mar 18 12:34:03 crc kubenswrapper[4843]: I0318 12:34:03.433070 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jv79s" event={"ID":"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b","Type":"ContainerDied","Data":"a298a077e860eb61e48e3f862d74c7029a9d42aac32c62c5faff068cf825f077"} Mar 18 12:34:03 crc kubenswrapper[4843]: I0318 12:34:03.435852 4843 generic.go:334] "Generic (PLEG): 
container finished" podID="df17c133-1ad9-4e20-884e-99a07055c280" containerID="0d33d648e5242a45903289afc6259d2db51ccddffcb0d0c95cac93496b0218a4" exitCode=0 Mar 18 12:34:03 crc kubenswrapper[4843]: I0318 12:34:03.435983 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563954-gnjzm" event={"ID":"df17c133-1ad9-4e20-884e-99a07055c280","Type":"ContainerDied","Data":"0d33d648e5242a45903289afc6259d2db51ccddffcb0d0c95cac93496b0218a4"} Mar 18 12:34:04 crc kubenswrapper[4843]: I0318 12:34:04.853706 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563954-gnjzm" Mar 18 12:34:04 crc kubenswrapper[4843]: I0318 12:34:04.953692 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jv79s" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.000255 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cwdt\" (UniqueName: \"kubernetes.io/projected/df17c133-1ad9-4e20-884e-99a07055c280-kube-api-access-5cwdt\") pod \"df17c133-1ad9-4e20-884e-99a07055c280\" (UID: \"df17c133-1ad9-4e20-884e-99a07055c280\") " Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.006596 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df17c133-1ad9-4e20-884e-99a07055c280-kube-api-access-5cwdt" (OuterVolumeSpecName: "kube-api-access-5cwdt") pod "df17c133-1ad9-4e20-884e-99a07055c280" (UID: "df17c133-1ad9-4e20-884e-99a07055c280"). InnerVolumeSpecName "kube-api-access-5cwdt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.101769 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l585m\" (UniqueName: \"kubernetes.io/projected/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-kube-api-access-l585m\") pod \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\" (UID: \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\") " Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.102166 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-combined-ca-bundle\") pod \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\" (UID: \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\") " Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.102204 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-scripts\") pod \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\" (UID: \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\") " Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.102238 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-config-data\") pod \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\" (UID: \"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b\") " Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.102781 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cwdt\" (UniqueName: \"kubernetes.io/projected/df17c133-1ad9-4e20-884e-99a07055c280-kube-api-access-5cwdt\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.106953 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-kube-api-access-l585m" (OuterVolumeSpecName: 
"kube-api-access-l585m") pod "c2e5376f-b6a5-4462-925f-a0daa0d3aa5b" (UID: "c2e5376f-b6a5-4462-925f-a0daa0d3aa5b"). InnerVolumeSpecName "kube-api-access-l585m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.109016 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-scripts" (OuterVolumeSpecName: "scripts") pod "c2e5376f-b6a5-4462-925f-a0daa0d3aa5b" (UID: "c2e5376f-b6a5-4462-925f-a0daa0d3aa5b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.127534 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2e5376f-b6a5-4462-925f-a0daa0d3aa5b" (UID: "c2e5376f-b6a5-4462-925f-a0daa0d3aa5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.132003 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-config-data" (OuterVolumeSpecName: "config-data") pod "c2e5376f-b6a5-4462-925f-a0daa0d3aa5b" (UID: "c2e5376f-b6a5-4462-925f-a0daa0d3aa5b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.204995 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.205048 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.205066 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.205084 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l585m\" (UniqueName: \"kubernetes.io/projected/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b-kube-api-access-l585m\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.458225 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jv79s" event={"ID":"c2e5376f-b6a5-4462-925f-a0daa0d3aa5b","Type":"ContainerDied","Data":"68bd87b620d46adf971adfcff15aed739bd32de12806546ef77d5896a131de1b"} Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.458298 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68bd87b620d46adf971adfcff15aed739bd32de12806546ef77d5896a131de1b" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.458240 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jv79s" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.460350 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563954-gnjzm" event={"ID":"df17c133-1ad9-4e20-884e-99a07055c280","Type":"ContainerDied","Data":"2eb83fa4cb4e416f2125a2aaab4d1f5e6e16740bb70e4cb019831358dbcb2305"} Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.460519 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eb83fa4cb4e416f2125a2aaab4d1f5e6e16740bb70e4cb019831358dbcb2305" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.460452 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563954-gnjzm" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.572780 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563948-mh955"] Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.581582 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563948-mh955"] Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.601919 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 12:34:05 crc kubenswrapper[4843]: E0318 12:34:05.602351 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2e5376f-b6a5-4462-925f-a0daa0d3aa5b" containerName="nova-cell0-conductor-db-sync" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.602372 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e5376f-b6a5-4462-925f-a0daa0d3aa5b" containerName="nova-cell0-conductor-db-sync" Mar 18 12:34:05 crc kubenswrapper[4843]: E0318 12:34:05.602395 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df17c133-1ad9-4e20-884e-99a07055c280" containerName="oc" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.602407 
4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="df17c133-1ad9-4e20-884e-99a07055c280" containerName="oc" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.602618 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2e5376f-b6a5-4462-925f-a0daa0d3aa5b" containerName="nova-cell0-conductor-db-sync" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.602645 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="df17c133-1ad9-4e20-884e-99a07055c280" containerName="oc" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.603444 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.607371 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.607608 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gplpb" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.614556 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.722303 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwg7s\" (UniqueName: \"kubernetes.io/projected/926dc012-259a-44d0-a0b7-b88fe703dfae-kube-api-access-qwg7s\") pod \"nova-cell0-conductor-0\" (UID: \"926dc012-259a-44d0-a0b7-b88fe703dfae\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.722406 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/926dc012-259a-44d0-a0b7-b88fe703dfae-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"926dc012-259a-44d0-a0b7-b88fe703dfae\") " 
pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.722573 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926dc012-259a-44d0-a0b7-b88fe703dfae-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"926dc012-259a-44d0-a0b7-b88fe703dfae\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.824476 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926dc012-259a-44d0-a0b7-b88fe703dfae-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"926dc012-259a-44d0-a0b7-b88fe703dfae\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.824559 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwg7s\" (UniqueName: \"kubernetes.io/projected/926dc012-259a-44d0-a0b7-b88fe703dfae-kube-api-access-qwg7s\") pod \"nova-cell0-conductor-0\" (UID: \"926dc012-259a-44d0-a0b7-b88fe703dfae\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.824610 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/926dc012-259a-44d0-a0b7-b88fe703dfae-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"926dc012-259a-44d0-a0b7-b88fe703dfae\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.829853 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926dc012-259a-44d0-a0b7-b88fe703dfae-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"926dc012-259a-44d0-a0b7-b88fe703dfae\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:05 crc kubenswrapper[4843]: 
I0318 12:34:05.830421 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/926dc012-259a-44d0-a0b7-b88fe703dfae-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"926dc012-259a-44d0-a0b7-b88fe703dfae\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.849626 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwg7s\" (UniqueName: \"kubernetes.io/projected/926dc012-259a-44d0-a0b7-b88fe703dfae-kube-api-access-qwg7s\") pod \"nova-cell0-conductor-0\" (UID: \"926dc012-259a-44d0-a0b7-b88fe703dfae\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:05 crc kubenswrapper[4843]: I0318 12:34:05.924758 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:06 crc kubenswrapper[4843]: I0318 12:34:06.415424 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 12:34:06 crc kubenswrapper[4843]: I0318 12:34:06.483226 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"926dc012-259a-44d0-a0b7-b88fe703dfae","Type":"ContainerStarted","Data":"c31a4ccd82cfc25af4b3d19b2f774ce947e43bda91c9eb99b8fdc0365567b32e"} Mar 18 12:34:07 crc kubenswrapper[4843]: I0318 12:34:07.000211 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6ff159c-ced7-4e9d-8aac-a6789348ca55" path="/var/lib/kubelet/pods/c6ff159c-ced7-4e9d-8aac-a6789348ca55/volumes" Mar 18 12:34:07 crc kubenswrapper[4843]: I0318 12:34:07.494745 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"926dc012-259a-44d0-a0b7-b88fe703dfae","Type":"ContainerStarted","Data":"870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98"} Mar 18 12:34:07 crc kubenswrapper[4843]: I0318 12:34:07.494830 4843 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:07 crc kubenswrapper[4843]: I0318 12:34:07.516458 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.516435938 podStartE2EDuration="2.516435938s" podCreationTimestamp="2026-03-18 12:34:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:07.51579675 +0000 UTC m=+1481.231622284" watchObservedRunningTime="2026-03-18 12:34:07.516435938 +0000 UTC m=+1481.232261462" Mar 18 12:34:08 crc kubenswrapper[4843]: I0318 12:34:08.648547 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 12:34:09 crc kubenswrapper[4843]: I0318 12:34:09.517873 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="926dc012-259a-44d0-a0b7-b88fe703dfae" containerName="nova-cell0-conductor-conductor" containerID="cri-o://870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98" gracePeriod=30 Mar 18 12:34:10 crc kubenswrapper[4843]: I0318 12:34:10.692868 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:10 crc kubenswrapper[4843]: I0318 12:34:10.693160 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerName="ceilometer-central-agent" containerID="cri-o://cdb79bda8def5db80af2192407f731840060aac30a6a2bd00ea2437fd4a2a517" gracePeriod=30 Mar 18 12:34:10 crc kubenswrapper[4843]: I0318 12:34:10.693210 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerName="sg-core" 
containerID="cri-o://632100624b01bdc422015a977032f1557c7e8528f47def71502a96f31366437c" gracePeriod=30 Mar 18 12:34:10 crc kubenswrapper[4843]: I0318 12:34:10.693225 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerName="ceilometer-notification-agent" containerID="cri-o://2fba52ad94e4122e7be6cf582cd61aeb95c2e76c026795e9110bb5fbae40c87e" gracePeriod=30 Mar 18 12:34:10 crc kubenswrapper[4843]: I0318 12:34:10.693234 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerName="proxy-httpd" containerID="cri-o://5f433f776c8222bbacc1f74bef839843e52579e6e709abeb778a6f420e4f8be6" gracePeriod=30 Mar 18 12:34:11 crc kubenswrapper[4843]: I0318 12:34:11.673647 4843 generic.go:334] "Generic (PLEG): container finished" podID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerID="5f433f776c8222bbacc1f74bef839843e52579e6e709abeb778a6f420e4f8be6" exitCode=0 Mar 18 12:34:11 crc kubenswrapper[4843]: I0318 12:34:11.673690 4843 generic.go:334] "Generic (PLEG): container finished" podID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerID="632100624b01bdc422015a977032f1557c7e8528f47def71502a96f31366437c" exitCode=2 Mar 18 12:34:11 crc kubenswrapper[4843]: I0318 12:34:11.673700 4843 generic.go:334] "Generic (PLEG): container finished" podID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerID="cdb79bda8def5db80af2192407f731840060aac30a6a2bd00ea2437fd4a2a517" exitCode=0 Mar 18 12:34:11 crc kubenswrapper[4843]: I0318 12:34:11.673689 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0870ff56-ede8-4001-a8c5-5762c3087bb5","Type":"ContainerDied","Data":"5f433f776c8222bbacc1f74bef839843e52579e6e709abeb778a6f420e4f8be6"} Mar 18 12:34:11 crc kubenswrapper[4843]: I0318 12:34:11.673735 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"0870ff56-ede8-4001-a8c5-5762c3087bb5","Type":"ContainerDied","Data":"632100624b01bdc422015a977032f1557c7e8528f47def71502a96f31366437c"} Mar 18 12:34:11 crc kubenswrapper[4843]: I0318 12:34:11.673746 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0870ff56-ede8-4001-a8c5-5762c3087bb5","Type":"ContainerDied","Data":"cdb79bda8def5db80af2192407f731840060aac30a6a2bd00ea2437fd4a2a517"} Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.467511 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.497503 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5ktb\" (UniqueName: \"kubernetes.io/projected/0870ff56-ede8-4001-a8c5-5762c3087bb5-kube-api-access-f5ktb\") pod \"0870ff56-ede8-4001-a8c5-5762c3087bb5\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.497619 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-combined-ca-bundle\") pod \"0870ff56-ede8-4001-a8c5-5762c3087bb5\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.497706 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0870ff56-ede8-4001-a8c5-5762c3087bb5-log-httpd\") pod \"0870ff56-ede8-4001-a8c5-5762c3087bb5\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.497782 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0870ff56-ede8-4001-a8c5-5762c3087bb5-run-httpd\") pod 
\"0870ff56-ede8-4001-a8c5-5762c3087bb5\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.497841 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-config-data\") pod \"0870ff56-ede8-4001-a8c5-5762c3087bb5\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.497890 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-scripts\") pod \"0870ff56-ede8-4001-a8c5-5762c3087bb5\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.497972 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-sg-core-conf-yaml\") pod \"0870ff56-ede8-4001-a8c5-5762c3087bb5\" (UID: \"0870ff56-ede8-4001-a8c5-5762c3087bb5\") " Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.498676 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0870ff56-ede8-4001-a8c5-5762c3087bb5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0870ff56-ede8-4001-a8c5-5762c3087bb5" (UID: "0870ff56-ede8-4001-a8c5-5762c3087bb5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.499480 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0870ff56-ede8-4001-a8c5-5762c3087bb5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0870ff56-ede8-4001-a8c5-5762c3087bb5" (UID: "0870ff56-ede8-4001-a8c5-5762c3087bb5"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.504932 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0870ff56-ede8-4001-a8c5-5762c3087bb5-kube-api-access-f5ktb" (OuterVolumeSpecName: "kube-api-access-f5ktb") pod "0870ff56-ede8-4001-a8c5-5762c3087bb5" (UID: "0870ff56-ede8-4001-a8c5-5762c3087bb5"). InnerVolumeSpecName "kube-api-access-f5ktb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.505929 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-scripts" (OuterVolumeSpecName: "scripts") pod "0870ff56-ede8-4001-a8c5-5762c3087bb5" (UID: "0870ff56-ede8-4001-a8c5-5762c3087bb5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.555728 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0870ff56-ede8-4001-a8c5-5762c3087bb5" (UID: "0870ff56-ede8-4001-a8c5-5762c3087bb5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.599303 4843 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0870ff56-ede8-4001-a8c5-5762c3087bb5-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.599344 4843 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0870ff56-ede8-4001-a8c5-5762c3087bb5-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.599356 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.599368 4843 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.599381 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5ktb\" (UniqueName: \"kubernetes.io/projected/0870ff56-ede8-4001-a8c5-5762c3087bb5-kube-api-access-f5ktb\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.610424 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0870ff56-ede8-4001-a8c5-5762c3087bb5" (UID: "0870ff56-ede8-4001-a8c5-5762c3087bb5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.642469 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-config-data" (OuterVolumeSpecName: "config-data") pod "0870ff56-ede8-4001-a8c5-5762c3087bb5" (UID: "0870ff56-ede8-4001-a8c5-5762c3087bb5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.703508 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.703560 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0870ff56-ede8-4001-a8c5-5762c3087bb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.754518 4843 generic.go:334] "Generic (PLEG): container finished" podID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerID="2fba52ad94e4122e7be6cf582cd61aeb95c2e76c026795e9110bb5fbae40c87e" exitCode=0
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.754566 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0870ff56-ede8-4001-a8c5-5762c3087bb5","Type":"ContainerDied","Data":"2fba52ad94e4122e7be6cf582cd61aeb95c2e76c026795e9110bb5fbae40c87e"}
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.754592 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0870ff56-ede8-4001-a8c5-5762c3087bb5","Type":"ContainerDied","Data":"b54cfcc4392d8be535873c4a6bf795ec09a9472a8dd87b21b433659e48c35a38"}
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.754609 4843 scope.go:117] "RemoveContainer" containerID="5f433f776c8222bbacc1f74bef839843e52579e6e709abeb778a6f420e4f8be6"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.754768 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.790240 4843 scope.go:117] "RemoveContainer" containerID="632100624b01bdc422015a977032f1557c7e8528f47def71502a96f31366437c"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.792098 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.801926 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.818262 4843 scope.go:117] "RemoveContainer" containerID="2fba52ad94e4122e7be6cf582cd61aeb95c2e76c026795e9110bb5fbae40c87e"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.818525 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 18 12:34:13 crc kubenswrapper[4843]: E0318 12:34:13.818983 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerName="ceilometer-central-agent"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.819008 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerName="ceilometer-central-agent"
Mar 18 12:34:13 crc kubenswrapper[4843]: E0318 12:34:13.819032 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerName="ceilometer-notification-agent"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.819043 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerName="ceilometer-notification-agent"
Mar 18 12:34:13 crc kubenswrapper[4843]: E0318 12:34:13.819066 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerName="proxy-httpd"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.819074 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerName="proxy-httpd"
Mar 18 12:34:13 crc kubenswrapper[4843]: E0318 12:34:13.819085 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerName="sg-core"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.819094 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerName="sg-core"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.819338 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerName="ceilometer-notification-agent"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.819359 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerName="sg-core"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.819400 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerName="proxy-httpd"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.819417 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="0870ff56-ede8-4001-a8c5-5762c3087bb5" containerName="ceilometer-central-agent"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.821986 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.830452 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.832156 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.845120 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.856243 4843 scope.go:117] "RemoveContainer" containerID="cdb79bda8def5db80af2192407f731840060aac30a6a2bd00ea2437fd4a2a517"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.878002 4843 scope.go:117] "RemoveContainer" containerID="5f433f776c8222bbacc1f74bef839843e52579e6e709abeb778a6f420e4f8be6"
Mar 18 12:34:13 crc kubenswrapper[4843]: E0318 12:34:13.880744 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f433f776c8222bbacc1f74bef839843e52579e6e709abeb778a6f420e4f8be6\": container with ID starting with 5f433f776c8222bbacc1f74bef839843e52579e6e709abeb778a6f420e4f8be6 not found: ID does not exist" containerID="5f433f776c8222bbacc1f74bef839843e52579e6e709abeb778a6f420e4f8be6"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.881444 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f433f776c8222bbacc1f74bef839843e52579e6e709abeb778a6f420e4f8be6"} err="failed to get container status \"5f433f776c8222bbacc1f74bef839843e52579e6e709abeb778a6f420e4f8be6\": rpc error: code = NotFound desc = could not find container \"5f433f776c8222bbacc1f74bef839843e52579e6e709abeb778a6f420e4f8be6\": container with ID starting with 5f433f776c8222bbacc1f74bef839843e52579e6e709abeb778a6f420e4f8be6 not found: ID does not exist"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.881596 4843 scope.go:117] "RemoveContainer" containerID="632100624b01bdc422015a977032f1557c7e8528f47def71502a96f31366437c"
Mar 18 12:34:13 crc kubenswrapper[4843]: E0318 12:34:13.882204 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"632100624b01bdc422015a977032f1557c7e8528f47def71502a96f31366437c\": container with ID starting with 632100624b01bdc422015a977032f1557c7e8528f47def71502a96f31366437c not found: ID does not exist" containerID="632100624b01bdc422015a977032f1557c7e8528f47def71502a96f31366437c"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.882250 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632100624b01bdc422015a977032f1557c7e8528f47def71502a96f31366437c"} err="failed to get container status \"632100624b01bdc422015a977032f1557c7e8528f47def71502a96f31366437c\": rpc error: code = NotFound desc = could not find container \"632100624b01bdc422015a977032f1557c7e8528f47def71502a96f31366437c\": container with ID starting with 632100624b01bdc422015a977032f1557c7e8528f47def71502a96f31366437c not found: ID does not exist"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.882281 4843 scope.go:117] "RemoveContainer" containerID="2fba52ad94e4122e7be6cf582cd61aeb95c2e76c026795e9110bb5fbae40c87e"
Mar 18 12:34:13 crc kubenswrapper[4843]: E0318 12:34:13.883053 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fba52ad94e4122e7be6cf582cd61aeb95c2e76c026795e9110bb5fbae40c87e\": container with ID starting with 2fba52ad94e4122e7be6cf582cd61aeb95c2e76c026795e9110bb5fbae40c87e not found: ID does not exist" containerID="2fba52ad94e4122e7be6cf582cd61aeb95c2e76c026795e9110bb5fbae40c87e"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.883095 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fba52ad94e4122e7be6cf582cd61aeb95c2e76c026795e9110bb5fbae40c87e"} err="failed to get container status \"2fba52ad94e4122e7be6cf582cd61aeb95c2e76c026795e9110bb5fbae40c87e\": rpc error: code = NotFound desc = could not find container \"2fba52ad94e4122e7be6cf582cd61aeb95c2e76c026795e9110bb5fbae40c87e\": container with ID starting with 2fba52ad94e4122e7be6cf582cd61aeb95c2e76c026795e9110bb5fbae40c87e not found: ID does not exist"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.883122 4843 scope.go:117] "RemoveContainer" containerID="cdb79bda8def5db80af2192407f731840060aac30a6a2bd00ea2437fd4a2a517"
Mar 18 12:34:13 crc kubenswrapper[4843]: E0318 12:34:13.883696 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdb79bda8def5db80af2192407f731840060aac30a6a2bd00ea2437fd4a2a517\": container with ID starting with cdb79bda8def5db80af2192407f731840060aac30a6a2bd00ea2437fd4a2a517 not found: ID does not exist" containerID="cdb79bda8def5db80af2192407f731840060aac30a6a2bd00ea2437fd4a2a517"
Mar 18 12:34:13 crc kubenswrapper[4843]: I0318 12:34:13.883720 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdb79bda8def5db80af2192407f731840060aac30a6a2bd00ea2437fd4a2a517"} err="failed to get container status \"cdb79bda8def5db80af2192407f731840060aac30a6a2bd00ea2437fd4a2a517\": rpc error: code = NotFound desc = could not find container \"cdb79bda8def5db80af2192407f731840060aac30a6a2bd00ea2437fd4a2a517\": container with ID starting with cdb79bda8def5db80af2192407f731840060aac30a6a2bd00ea2437fd4a2a517 not found: ID does not exist"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.012197 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-scripts\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.013209 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-config-data\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.013318 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-log-httpd\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.014143 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czth8\" (UniqueName: \"kubernetes.io/projected/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-kube-api-access-czth8\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.014208 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.014268 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-run-httpd\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.014930 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.116354 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.116447 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-scripts\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.116524 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-config-data\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.116571 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-log-httpd\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.116638 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czth8\" (UniqueName: \"kubernetes.io/projected/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-kube-api-access-czth8\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.116696 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.116731 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-run-httpd\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.117835 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-log-httpd\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.118347 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-run-httpd\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.120879 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.126319 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-scripts\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.136615 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-config-data\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.138528 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.140828 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czth8\" (UniqueName: \"kubernetes.io/projected/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-kube-api-access-czth8\") pod \"ceilometer-0\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.161401 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 12:34:14 crc kubenswrapper[4843]: W0318 12:34:14.717508 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8d8bc6b_6618_434a_b6ef_7dc30a6bf10f.slice/crio-b0290ccc3241f3100d7cfa338599a620a76cd27c8e607d2796a6a1da0d0334d1 WatchSource:0}: Error finding container b0290ccc3241f3100d7cfa338599a620a76cd27c8e607d2796a6a1da0d0334d1: Status 404 returned error can't find the container with id b0290ccc3241f3100d7cfa338599a620a76cd27c8e607d2796a6a1da0d0334d1
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.721084 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.809976 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f","Type":"ContainerStarted","Data":"b0290ccc3241f3100d7cfa338599a620a76cd27c8e607d2796a6a1da0d0334d1"}
Mar 18 12:34:14 crc kubenswrapper[4843]: I0318 12:34:14.996566 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0870ff56-ede8-4001-a8c5-5762c3087bb5" path="/var/lib/kubelet/pods/0870ff56-ede8-4001-a8c5-5762c3087bb5/volumes"
Mar 18 12:34:15 crc kubenswrapper[4843]: I0318 12:34:15.823880 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f","Type":"ContainerStarted","Data":"242bdb078536ad139af43372b4bcff1b3a6e50e688d921467e85214c1c3c5a8b"}
Mar 18 12:34:15 crc kubenswrapper[4843]: E0318 12:34:15.927308 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 18 12:34:15 crc kubenswrapper[4843]: E0318 12:34:15.934843 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 18 12:34:15 crc kubenswrapper[4843]: E0318 12:34:15.937308 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 18 12:34:15 crc kubenswrapper[4843]: E0318 12:34:15.937358 4843 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="926dc012-259a-44d0-a0b7-b88fe703dfae" containerName="nova-cell0-conductor-conductor"
Mar 18 12:34:16 crc kubenswrapper[4843]: I0318 12:34:16.835051 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f","Type":"ContainerStarted","Data":"60f8cb4a75020c06e17cc0f40b66c0206eacb8e7a7a5acadc004c83992e7dadd"}
Mar 18 12:34:17 crc kubenswrapper[4843]: I0318 12:34:17.846036 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f","Type":"ContainerStarted","Data":"746b17b8b8c29b68c9ac367403f45f76cdf14afcedada006b312fa5e60908a57"}
Mar 18 12:34:19 crc kubenswrapper[4843]: I0318 12:34:19.905066 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f","Type":"ContainerStarted","Data":"2591d1d68ad2ff3882e9f91f045f8e7eda55f9ac54a6986a5e98a08876db7a1d"}
Mar 18 12:34:19 crc kubenswrapper[4843]: I0318 12:34:19.905586 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 18 12:34:19 crc kubenswrapper[4843]: I0318 12:34:19.983185 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.763334087 podStartE2EDuration="6.983156166s" podCreationTimestamp="2026-03-18 12:34:13 +0000 UTC" firstStartedPulling="2026-03-18 12:34:14.720601253 +0000 UTC m=+1488.436426777" lastFinishedPulling="2026-03-18 12:34:18.940423332 +0000 UTC m=+1492.656248856" observedRunningTime="2026-03-18 12:34:19.96673615 +0000 UTC m=+1493.682561674" watchObservedRunningTime="2026-03-18 12:34:19.983156166 +0000 UTC m=+1493.698981690"
Mar 18 12:34:20 crc kubenswrapper[4843]: I0318 12:34:20.036707 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:34:20 crc kubenswrapper[4843]: I0318 12:34:20.036769 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:34:20 crc kubenswrapper[4843]: I0318 12:34:20.036833 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq"
Mar 18 12:34:20 crc kubenswrapper[4843]: I0318 12:34:20.037683 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c1df8135718fa79548f8df435226cda37a6730cb41d0cf14ca133c83dba65e7"} pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 12:34:20 crc kubenswrapper[4843]: I0318 12:34:20.037749 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" containerID="cri-o://6c1df8135718fa79548f8df435226cda37a6730cb41d0cf14ca133c83dba65e7" gracePeriod=600
Mar 18 12:34:20 crc kubenswrapper[4843]: E0318 12:34:20.928976 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 18 12:34:20 crc kubenswrapper[4843]: E0318 12:34:20.972504 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 18 12:34:20 crc kubenswrapper[4843]: E0318 12:34:20.976955 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 18 12:34:20 crc kubenswrapper[4843]: E0318 12:34:20.977058 4843 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="926dc012-259a-44d0-a0b7-b88fe703dfae" containerName="nova-cell0-conductor-conductor"
Mar 18 12:34:20 crc kubenswrapper[4843]: I0318 12:34:20.981286 4843 generic.go:334] "Generic (PLEG): container finished" podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="6c1df8135718fa79548f8df435226cda37a6730cb41d0cf14ca133c83dba65e7" exitCode=0
Mar 18 12:34:20 crc kubenswrapper[4843]: I0318 12:34:20.982190 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"6c1df8135718fa79548f8df435226cda37a6730cb41d0cf14ca133c83dba65e7"}
Mar 18 12:34:20 crc kubenswrapper[4843]: I0318 12:34:20.982259 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24"}
Mar 18 12:34:20 crc kubenswrapper[4843]: I0318 12:34:20.982284 4843 scope.go:117] "RemoveContainer" containerID="7fcc44fd473fc2d97d7be9aa8e61f5a92c58d2a0df082678596236e3adb17e3e"
Mar 18 12:34:25 crc kubenswrapper[4843]: E0318 12:34:25.941799 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 18 12:34:25 crc kubenswrapper[4843]: E0318 12:34:25.944705 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 18 12:34:25 crc kubenswrapper[4843]: E0318 12:34:25.946899 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 18 12:34:25 crc kubenswrapper[4843]: E0318 12:34:25.946973 4843 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="926dc012-259a-44d0-a0b7-b88fe703dfae" containerName="nova-cell0-conductor-conductor"
Mar 18 12:34:30 crc kubenswrapper[4843]: E0318 12:34:30.928609 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 18 12:34:30 crc kubenswrapper[4843]: E0318 12:34:30.934749 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 18 12:34:30 crc kubenswrapper[4843]: E0318 12:34:30.936799 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 18 12:34:30 crc kubenswrapper[4843]: E0318 12:34:30.936888 4843 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="926dc012-259a-44d0-a0b7-b88fe703dfae" containerName="nova-cell0-conductor-conductor"
Mar 18 12:34:35 crc kubenswrapper[4843]: E0318 12:34:35.928272 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 18 12:34:35 crc kubenswrapper[4843]: E0318 12:34:35.930799 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 18 12:34:35 crc kubenswrapper[4843]: E0318 12:34:35.933583 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 18 12:34:35 crc kubenswrapper[4843]: E0318 12:34:35.933705 4843 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="926dc012-259a-44d0-a0b7-b88fe703dfae" containerName="nova-cell0-conductor-conductor"
Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.318804 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.356620 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwg7s\" (UniqueName: \"kubernetes.io/projected/926dc012-259a-44d0-a0b7-b88fe703dfae-kube-api-access-qwg7s\") pod \"926dc012-259a-44d0-a0b7-b88fe703dfae\" (UID: \"926dc012-259a-44d0-a0b7-b88fe703dfae\") "
Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.356995 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/926dc012-259a-44d0-a0b7-b88fe703dfae-config-data\") pod \"926dc012-259a-44d0-a0b7-b88fe703dfae\" (UID: \"926dc012-259a-44d0-a0b7-b88fe703dfae\") "
Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.357031 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926dc012-259a-44d0-a0b7-b88fe703dfae-combined-ca-bundle\") pod \"926dc012-259a-44d0-a0b7-b88fe703dfae\" (UID: \"926dc012-259a-44d0-a0b7-b88fe703dfae\") "
Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.374266 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/926dc012-259a-44d0-a0b7-b88fe703dfae-kube-api-access-qwg7s" (OuterVolumeSpecName: "kube-api-access-qwg7s") pod "926dc012-259a-44d0-a0b7-b88fe703dfae" (UID: "926dc012-259a-44d0-a0b7-b88fe703dfae"). InnerVolumeSpecName "kube-api-access-qwg7s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.406102 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/926dc012-259a-44d0-a0b7-b88fe703dfae-config-data" (OuterVolumeSpecName: "config-data") pod "926dc012-259a-44d0-a0b7-b88fe703dfae" (UID: "926dc012-259a-44d0-a0b7-b88fe703dfae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.410784 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/926dc012-259a-44d0-a0b7-b88fe703dfae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "926dc012-259a-44d0-a0b7-b88fe703dfae" (UID: "926dc012-259a-44d0-a0b7-b88fe703dfae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.460061 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/926dc012-259a-44d0-a0b7-b88fe703dfae-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.460105 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926dc012-259a-44d0-a0b7-b88fe703dfae-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.460126 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwg7s\" (UniqueName: \"kubernetes.io/projected/926dc012-259a-44d0-a0b7-b88fe703dfae-kube-api-access-qwg7s\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.521119 4843 generic.go:334] "Generic (PLEG): container finished" podID="926dc012-259a-44d0-a0b7-b88fe703dfae" containerID="870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98"
exitCode=137 Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.521148 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.521181 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"926dc012-259a-44d0-a0b7-b88fe703dfae","Type":"ContainerDied","Data":"870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98"} Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.521213 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"926dc012-259a-44d0-a0b7-b88fe703dfae","Type":"ContainerDied","Data":"c31a4ccd82cfc25af4b3d19b2f774ce947e43bda91c9eb99b8fdc0365567b32e"} Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.521243 4843 scope.go:117] "RemoveContainer" containerID="870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98" Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.550596 4843 scope.go:117] "RemoveContainer" containerID="870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98" Mar 18 12:34:40 crc kubenswrapper[4843]: E0318 12:34:40.551269 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98\": container with ID starting with 870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98 not found: ID does not exist" containerID="870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98" Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.551322 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98"} err="failed to get container status \"870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98\": rpc error: code = NotFound 
desc = could not find container \"870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98\": container with ID starting with 870f060142d6263b7309629b2c60564d83e5118285b1259bff8d73fba8ecbc98 not found: ID does not exist" Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.563449 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.573353 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.598707 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 12:34:40 crc kubenswrapper[4843]: E0318 12:34:40.602540 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="926dc012-259a-44d0-a0b7-b88fe703dfae" containerName="nova-cell0-conductor-conductor" Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.602584 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="926dc012-259a-44d0-a0b7-b88fe703dfae" containerName="nova-cell0-conductor-conductor" Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.603252 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="926dc012-259a-44d0-a0b7-b88fe703dfae" containerName="nova-cell0-conductor-conductor" Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.605865 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.610812 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.610969 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gplpb" Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.622068 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.664422 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9v9k\" (UniqueName: \"kubernetes.io/projected/e4769626-cdcb-498c-a5e1-0743378e318e-kube-api-access-w9v9k\") pod \"nova-cell0-conductor-0\" (UID: \"e4769626-cdcb-498c-a5e1-0743378e318e\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.664964 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4769626-cdcb-498c-a5e1-0743378e318e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e4769626-cdcb-498c-a5e1-0743378e318e\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.665108 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4769626-cdcb-498c-a5e1-0743378e318e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e4769626-cdcb-498c-a5e1-0743378e318e\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.766509 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e4769626-cdcb-498c-a5e1-0743378e318e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e4769626-cdcb-498c-a5e1-0743378e318e\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.766565 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4769626-cdcb-498c-a5e1-0743378e318e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e4769626-cdcb-498c-a5e1-0743378e318e\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.766618 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9v9k\" (UniqueName: \"kubernetes.io/projected/e4769626-cdcb-498c-a5e1-0743378e318e-kube-api-access-w9v9k\") pod \"nova-cell0-conductor-0\" (UID: \"e4769626-cdcb-498c-a5e1-0743378e318e\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.772587 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4769626-cdcb-498c-a5e1-0743378e318e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e4769626-cdcb-498c-a5e1-0743378e318e\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.772591 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4769626-cdcb-498c-a5e1-0743378e318e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e4769626-cdcb-498c-a5e1-0743378e318e\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.790946 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9v9k\" (UniqueName: \"kubernetes.io/projected/e4769626-cdcb-498c-a5e1-0743378e318e-kube-api-access-w9v9k\") pod \"nova-cell0-conductor-0\" 
(UID: \"e4769626-cdcb-498c-a5e1-0743378e318e\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:40 crc kubenswrapper[4843]: I0318 12:34:40.940633 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:41 crc kubenswrapper[4843]: I0318 12:34:41.000775 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="926dc012-259a-44d0-a0b7-b88fe703dfae" path="/var/lib/kubelet/pods/926dc012-259a-44d0-a0b7-b88fe703dfae/volumes" Mar 18 12:34:41 crc kubenswrapper[4843]: I0318 12:34:41.716741 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 12:34:41 crc kubenswrapper[4843]: W0318 12:34:41.721186 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4769626_cdcb_498c_a5e1_0743378e318e.slice/crio-3b5bd1486d153a2ac1524f501f9c7a34ecbb3773c7bf72ef44faf5fa2d5b3b73 WatchSource:0}: Error finding container 3b5bd1486d153a2ac1524f501f9c7a34ecbb3773c7bf72ef44faf5fa2d5b3b73: Status 404 returned error can't find the container with id 3b5bd1486d153a2ac1524f501f9c7a34ecbb3773c7bf72ef44faf5fa2d5b3b73 Mar 18 12:34:42 crc kubenswrapper[4843]: I0318 12:34:42.543164 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e4769626-cdcb-498c-a5e1-0743378e318e","Type":"ContainerStarted","Data":"e70a9fd2a86dd7c277da28acc59da519da90eaae37f1b8927381261ae92dbdbf"} Mar 18 12:34:42 crc kubenswrapper[4843]: I0318 12:34:42.543464 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e4769626-cdcb-498c-a5e1-0743378e318e","Type":"ContainerStarted","Data":"3b5bd1486d153a2ac1524f501f9c7a34ecbb3773c7bf72ef44faf5fa2d5b3b73"} Mar 18 12:34:42 crc kubenswrapper[4843]: I0318 12:34:42.543479 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:42 crc kubenswrapper[4843]: I0318 12:34:42.571356 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.57131639 podStartE2EDuration="2.57131639s" podCreationTimestamp="2026-03-18 12:34:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:42.557091286 +0000 UTC m=+1516.272916810" watchObservedRunningTime="2026-03-18 12:34:42.57131639 +0000 UTC m=+1516.287141934" Mar 18 12:34:44 crc kubenswrapper[4843]: I0318 12:34:44.170306 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 12:34:48 crc kubenswrapper[4843]: I0318 12:34:48.210101 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:34:48 crc kubenswrapper[4843]: I0318 12:34:48.210964 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="77409830-08b4-4a50-8f8e-0e0c3ad009b4" containerName="kube-state-metrics" containerID="cri-o://d4ee15718d98e85f33e33c745b4353fb34c13c57f9f8badd3ea05863c06d680a" gracePeriod=30 Mar 18 12:34:48 crc kubenswrapper[4843]: I0318 12:34:48.605558 4843 generic.go:334] "Generic (PLEG): container finished" podID="77409830-08b4-4a50-8f8e-0e0c3ad009b4" containerID="d4ee15718d98e85f33e33c745b4353fb34c13c57f9f8badd3ea05863c06d680a" exitCode=2 Mar 18 12:34:48 crc kubenswrapper[4843]: I0318 12:34:48.605631 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"77409830-08b4-4a50-8f8e-0e0c3ad009b4","Type":"ContainerDied","Data":"d4ee15718d98e85f33e33c745b4353fb34c13c57f9f8badd3ea05863c06d680a"} Mar 18 12:34:48 crc kubenswrapper[4843]: I0318 12:34:48.713118 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 12:34:48 crc kubenswrapper[4843]: I0318 12:34:48.760732 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-526cc\" (UniqueName: \"kubernetes.io/projected/77409830-08b4-4a50-8f8e-0e0c3ad009b4-kube-api-access-526cc\") pod \"77409830-08b4-4a50-8f8e-0e0c3ad009b4\" (UID: \"77409830-08b4-4a50-8f8e-0e0c3ad009b4\") " Mar 18 12:34:48 crc kubenswrapper[4843]: I0318 12:34:48.769799 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77409830-08b4-4a50-8f8e-0e0c3ad009b4-kube-api-access-526cc" (OuterVolumeSpecName: "kube-api-access-526cc") pod "77409830-08b4-4a50-8f8e-0e0c3ad009b4" (UID: "77409830-08b4-4a50-8f8e-0e0c3ad009b4"). InnerVolumeSpecName "kube-api-access-526cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:48 crc kubenswrapper[4843]: I0318 12:34:48.863362 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-526cc\" (UniqueName: \"kubernetes.io/projected/77409830-08b4-4a50-8f8e-0e0c3ad009b4-kube-api-access-526cc\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.619904 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"77409830-08b4-4a50-8f8e-0e0c3ad009b4","Type":"ContainerDied","Data":"918700776d2f73584e77237f38e2c06bde7de969d96c4609594a06efde2023cc"} Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.620090 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.620301 4843 scope.go:117] "RemoveContainer" containerID="d4ee15718d98e85f33e33c745b4353fb34c13c57f9f8badd3ea05863c06d680a" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.653035 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.674905 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.683555 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:34:49 crc kubenswrapper[4843]: E0318 12:34:49.684092 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77409830-08b4-4a50-8f8e-0e0c3ad009b4" containerName="kube-state-metrics" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.684129 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="77409830-08b4-4a50-8f8e-0e0c3ad009b4" containerName="kube-state-metrics" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.684368 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="77409830-08b4-4a50-8f8e-0e0c3ad009b4" containerName="kube-state-metrics" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.685185 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.692188 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.692542 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.692823 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.781903 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c8ecee00-1ed4-4fce-9705-e5e513a922cc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c8ecee00-1ed4-4fce-9705-e5e513a922cc\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.781985 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc8jf\" (UniqueName: \"kubernetes.io/projected/c8ecee00-1ed4-4fce-9705-e5e513a922cc-kube-api-access-pc8jf\") pod \"kube-state-metrics-0\" (UID: \"c8ecee00-1ed4-4fce-9705-e5e513a922cc\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.782214 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ecee00-1ed4-4fce-9705-e5e513a922cc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c8ecee00-1ed4-4fce-9705-e5e513a922cc\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.782270 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c8ecee00-1ed4-4fce-9705-e5e513a922cc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c8ecee00-1ed4-4fce-9705-e5e513a922cc\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.884075 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c8ecee00-1ed4-4fce-9705-e5e513a922cc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c8ecee00-1ed4-4fce-9705-e5e513a922cc\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.884155 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc8jf\" (UniqueName: \"kubernetes.io/projected/c8ecee00-1ed4-4fce-9705-e5e513a922cc-kube-api-access-pc8jf\") pod \"kube-state-metrics-0\" (UID: \"c8ecee00-1ed4-4fce-9705-e5e513a922cc\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.884375 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ecee00-1ed4-4fce-9705-e5e513a922cc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c8ecee00-1ed4-4fce-9705-e5e513a922cc\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.884433 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ecee00-1ed4-4fce-9705-e5e513a922cc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c8ecee00-1ed4-4fce-9705-e5e513a922cc\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.888360 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/c8ecee00-1ed4-4fce-9705-e5e513a922cc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c8ecee00-1ed4-4fce-9705-e5e513a922cc\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.889476 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ecee00-1ed4-4fce-9705-e5e513a922cc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c8ecee00-1ed4-4fce-9705-e5e513a922cc\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.890295 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ecee00-1ed4-4fce-9705-e5e513a922cc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c8ecee00-1ed4-4fce-9705-e5e513a922cc\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:49 crc kubenswrapper[4843]: I0318 12:34:49.904526 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc8jf\" (UniqueName: \"kubernetes.io/projected/c8ecee00-1ed4-4fce-9705-e5e513a922cc-kube-api-access-pc8jf\") pod \"kube-state-metrics-0\" (UID: \"c8ecee00-1ed4-4fce-9705-e5e513a922cc\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:50 crc kubenswrapper[4843]: I0318 12:34:50.007539 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 12:34:50 crc kubenswrapper[4843]: I0318 12:34:50.249744 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:50 crc kubenswrapper[4843]: I0318 12:34:50.250492 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerName="ceilometer-central-agent" containerID="cri-o://242bdb078536ad139af43372b4bcff1b3a6e50e688d921467e85214c1c3c5a8b" gracePeriod=30 Mar 18 12:34:50 crc kubenswrapper[4843]: I0318 12:34:50.250711 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerName="proxy-httpd" containerID="cri-o://2591d1d68ad2ff3882e9f91f045f8e7eda55f9ac54a6986a5e98a08876db7a1d" gracePeriod=30 Mar 18 12:34:50 crc kubenswrapper[4843]: I0318 12:34:50.250834 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerName="ceilometer-notification-agent" containerID="cri-o://60f8cb4a75020c06e17cc0f40b66c0206eacb8e7a7a5acadc004c83992e7dadd" gracePeriod=30 Mar 18 12:34:50 crc kubenswrapper[4843]: I0318 12:34:50.250976 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerName="sg-core" containerID="cri-o://746b17b8b8c29b68c9ac367403f45f76cdf14afcedada006b312fa5e60908a57" gracePeriod=30 Mar 18 12:34:50 crc kubenswrapper[4843]: E0318 12:34:50.399555 4843 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8d8bc6b_6618_434a_b6ef_7dc30a6bf10f.slice/crio-conmon-746b17b8b8c29b68c9ac367403f45f76cdf14afcedada006b312fa5e60908a57.scope\": RecentStats: unable 
to find data in memory cache]" Mar 18 12:34:50 crc kubenswrapper[4843]: I0318 12:34:50.583238 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:34:50 crc kubenswrapper[4843]: I0318 12:34:50.629567 4843 generic.go:334] "Generic (PLEG): container finished" podID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerID="2591d1d68ad2ff3882e9f91f045f8e7eda55f9ac54a6986a5e98a08876db7a1d" exitCode=0 Mar 18 12:34:50 crc kubenswrapper[4843]: I0318 12:34:50.629596 4843 generic.go:334] "Generic (PLEG): container finished" podID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerID="746b17b8b8c29b68c9ac367403f45f76cdf14afcedada006b312fa5e60908a57" exitCode=2 Mar 18 12:34:50 crc kubenswrapper[4843]: I0318 12:34:50.629645 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f","Type":"ContainerDied","Data":"2591d1d68ad2ff3882e9f91f045f8e7eda55f9ac54a6986a5e98a08876db7a1d"} Mar 18 12:34:50 crc kubenswrapper[4843]: I0318 12:34:50.629715 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f","Type":"ContainerDied","Data":"746b17b8b8c29b68c9ac367403f45f76cdf14afcedada006b312fa5e60908a57"} Mar 18 12:34:50 crc kubenswrapper[4843]: I0318 12:34:50.630787 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c8ecee00-1ed4-4fce-9705-e5e513a922cc","Type":"ContainerStarted","Data":"804be9f606ea3c058aa5676f5ed0f286ee9f53e68aeb95b0e07fbdf2245ed39e"} Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.000841 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77409830-08b4-4a50-8f8e-0e0c3ad009b4" path="/var/lib/kubelet/pods/77409830-08b4-4a50-8f8e-0e0c3ad009b4/volumes" Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.001954 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-cell0-conductor-0" Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.524805 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-7sn7w"] Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.528227 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7sn7w" Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.545272 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.545540 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.565524 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7sn7w"] Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.625105 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d981e7da-a5a9-4d42-94f4-f78aefb2a660-scripts\") pod \"nova-cell0-cell-mapping-7sn7w\" (UID: \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\") " pod="openstack/nova-cell0-cell-mapping-7sn7w" Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.625557 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pklc\" (UniqueName: \"kubernetes.io/projected/d981e7da-a5a9-4d42-94f4-f78aefb2a660-kube-api-access-6pklc\") pod \"nova-cell0-cell-mapping-7sn7w\" (UID: \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\") " pod="openstack/nova-cell0-cell-mapping-7sn7w" Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.625609 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d981e7da-a5a9-4d42-94f4-f78aefb2a660-config-data\") pod 
\"nova-cell0-cell-mapping-7sn7w\" (UID: \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\") " pod="openstack/nova-cell0-cell-mapping-7sn7w"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.626200 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d981e7da-a5a9-4d42-94f4-f78aefb2a660-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7sn7w\" (UID: \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\") " pod="openstack/nova-cell0-cell-mapping-7sn7w"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.697577 4843 generic.go:334] "Generic (PLEG): container finished" podID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerID="242bdb078536ad139af43372b4bcff1b3a6e50e688d921467e85214c1c3c5a8b" exitCode=0
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.698839 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f","Type":"ContainerDied","Data":"242bdb078536ad139af43372b4bcff1b3a6e50e688d921467e85214c1c3c5a8b"}
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.699158 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.700942 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.701753 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c8ecee00-1ed4-4fce-9705-e5e513a922cc","Type":"ContainerStarted","Data":"0cfe54ca6ac2e746480f28fcf5d71e22184f657b6d0b937f4ba656a4e10d16b6"}
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.702567 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.705008 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.710000 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.727444 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pklc\" (UniqueName: \"kubernetes.io/projected/d981e7da-a5a9-4d42-94f4-f78aefb2a660-kube-api-access-6pklc\") pod \"nova-cell0-cell-mapping-7sn7w\" (UID: \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\") " pod="openstack/nova-cell0-cell-mapping-7sn7w"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.727511 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d981e7da-a5a9-4d42-94f4-f78aefb2a660-config-data\") pod \"nova-cell0-cell-mapping-7sn7w\" (UID: \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\") " pod="openstack/nova-cell0-cell-mapping-7sn7w"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.727569 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d981e7da-a5a9-4d42-94f4-f78aefb2a660-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7sn7w\" (UID: \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\") " pod="openstack/nova-cell0-cell-mapping-7sn7w"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.727617 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\") " pod="openstack/nova-api-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.727636 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-logs\") pod \"nova-api-0\" (UID: \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\") " pod="openstack/nova-api-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.727692 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-config-data\") pod \"nova-api-0\" (UID: \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\") " pod="openstack/nova-api-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.727754 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d981e7da-a5a9-4d42-94f4-f78aefb2a660-scripts\") pod \"nova-cell0-cell-mapping-7sn7w\" (UID: \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\") " pod="openstack/nova-cell0-cell-mapping-7sn7w"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.727830 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76hpv\" (UniqueName: \"kubernetes.io/projected/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-kube-api-access-76hpv\") pod \"nova-api-0\" (UID: \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\") " pod="openstack/nova-api-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.733371 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d981e7da-a5a9-4d42-94f4-f78aefb2a660-config-data\") pod \"nova-cell0-cell-mapping-7sn7w\" (UID: \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\") " pod="openstack/nova-cell0-cell-mapping-7sn7w"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.740468 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d981e7da-a5a9-4d42-94f4-f78aefb2a660-scripts\") pod \"nova-cell0-cell-mapping-7sn7w\" (UID: \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\") " pod="openstack/nova-cell0-cell-mapping-7sn7w"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.743164 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d981e7da-a5a9-4d42-94f4-f78aefb2a660-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7sn7w\" (UID: \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\") " pod="openstack/nova-cell0-cell-mapping-7sn7w"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.765460 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pklc\" (UniqueName: \"kubernetes.io/projected/d981e7da-a5a9-4d42-94f4-f78aefb2a660-kube-api-access-6pklc\") pod \"nova-cell0-cell-mapping-7sn7w\" (UID: \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\") " pod="openstack/nova-cell0-cell-mapping-7sn7w"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.796287 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.797705 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.801955 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.806193 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.814466 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.339145993 podStartE2EDuration="2.814420565s" podCreationTimestamp="2026-03-18 12:34:49 +0000 UTC" firstStartedPulling="2026-03-18 12:34:50.593692702 +0000 UTC m=+1524.309518226" lastFinishedPulling="2026-03-18 12:34:51.068967274 +0000 UTC m=+1524.784792798" observedRunningTime="2026-03-18 12:34:51.775799609 +0000 UTC m=+1525.491625133" watchObservedRunningTime="2026-03-18 12:34:51.814420565 +0000 UTC m=+1525.530246089"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.829217 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76hpv\" (UniqueName: \"kubernetes.io/projected/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-kube-api-access-76hpv\") pod \"nova-api-0\" (UID: \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\") " pod="openstack/nova-api-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.829267 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6599567-b95e-4948-a56f-eba713e3e77f-logs\") pod \"nova-metadata-0\" (UID: \"d6599567-b95e-4948-a56f-eba713e3e77f\") " pod="openstack/nova-metadata-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.829300 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6599567-b95e-4948-a56f-eba713e3e77f-config-data\") pod \"nova-metadata-0\" (UID: \"d6599567-b95e-4948-a56f-eba713e3e77f\") " pod="openstack/nova-metadata-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.829341 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8zvp\" (UniqueName: \"kubernetes.io/projected/d6599567-b95e-4948-a56f-eba713e3e77f-kube-api-access-c8zvp\") pod \"nova-metadata-0\" (UID: \"d6599567-b95e-4948-a56f-eba713e3e77f\") " pod="openstack/nova-metadata-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.829394 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\") " pod="openstack/nova-api-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.829414 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-logs\") pod \"nova-api-0\" (UID: \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\") " pod="openstack/nova-api-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.829436 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-config-data\") pod \"nova-api-0\" (UID: \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\") " pod="openstack/nova-api-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.829472 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6599567-b95e-4948-a56f-eba713e3e77f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d6599567-b95e-4948-a56f-eba713e3e77f\") " pod="openstack/nova-metadata-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.834007 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-logs\") pod \"nova-api-0\" (UID: \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\") " pod="openstack/nova-api-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.834989 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\") " pod="openstack/nova-api-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.838192 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-config-data\") pod \"nova-api-0\" (UID: \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\") " pod="openstack/nova-api-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.879326 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76hpv\" (UniqueName: \"kubernetes.io/projected/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-kube-api-access-76hpv\") pod \"nova-api-0\" (UID: \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\") " pod="openstack/nova-api-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.882609 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.884005 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.892107 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.905344 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7sn7w"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.907253 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-tldf6"]
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.909263 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.931988 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22b4d\" (UniqueName: \"kubernetes.io/projected/30ce1050-b96d-4fc4-8280-2e6234cf9c9d-kube-api-access-22b4d\") pod \"nova-scheduler-0\" (UID: \"30ce1050-b96d-4fc4-8280-2e6234cf9c9d\") " pod="openstack/nova-scheduler-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.932053 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-tldf6\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.932094 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-tldf6\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.932137 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6599567-b95e-4948-a56f-eba713e3e77f-logs\") pod \"nova-metadata-0\" (UID: \"d6599567-b95e-4948-a56f-eba713e3e77f\") " pod="openstack/nova-metadata-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.932172 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-config\") pod \"dnsmasq-dns-865f5d856f-tldf6\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.932197 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6599567-b95e-4948-a56f-eba713e3e77f-config-data\") pod \"nova-metadata-0\" (UID: \"d6599567-b95e-4948-a56f-eba713e3e77f\") " pod="openstack/nova-metadata-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.932232 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cvj7\" (UniqueName: \"kubernetes.io/projected/6dd87f99-b9bf-481e-87f4-219e09ca9998-kube-api-access-4cvj7\") pod \"dnsmasq-dns-865f5d856f-tldf6\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.932252 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8zvp\" (UniqueName: \"kubernetes.io/projected/d6599567-b95e-4948-a56f-eba713e3e77f-kube-api-access-c8zvp\") pod \"nova-metadata-0\" (UID: \"d6599567-b95e-4948-a56f-eba713e3e77f\") " pod="openstack/nova-metadata-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.932270 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-tldf6\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.932317 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ce1050-b96d-4fc4-8280-2e6234cf9c9d-config-data\") pod \"nova-scheduler-0\" (UID: \"30ce1050-b96d-4fc4-8280-2e6234cf9c9d\") " pod="openstack/nova-scheduler-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.932368 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-dns-svc\") pod \"dnsmasq-dns-865f5d856f-tldf6\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.932420 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6599567-b95e-4948-a56f-eba713e3e77f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d6599567-b95e-4948-a56f-eba713e3e77f\") " pod="openstack/nova-metadata-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.932450 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ce1050-b96d-4fc4-8280-2e6234cf9c9d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30ce1050-b96d-4fc4-8280-2e6234cf9c9d\") " pod="openstack/nova-scheduler-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.933041 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6599567-b95e-4948-a56f-eba713e3e77f-logs\") pod \"nova-metadata-0\" (UID: \"d6599567-b95e-4948-a56f-eba713e3e77f\") " pod="openstack/nova-metadata-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.938752 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.950866 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-tldf6"]
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.956027 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6599567-b95e-4948-a56f-eba713e3e77f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d6599567-b95e-4948-a56f-eba713e3e77f\") " pod="openstack/nova-metadata-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.960999 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.962407 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.977386 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6599567-b95e-4948-a56f-eba713e3e77f-config-data\") pod \"nova-metadata-0\" (UID: \"d6599567-b95e-4948-a56f-eba713e3e77f\") " pod="openstack/nova-metadata-0"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.977818 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 18 12:34:51 crc kubenswrapper[4843]: I0318 12:34:51.983308 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8zvp\" (UniqueName: \"kubernetes.io/projected/d6599567-b95e-4948-a56f-eba713e3e77f-kube-api-access-c8zvp\") pod \"nova-metadata-0\" (UID: \"d6599567-b95e-4948-a56f-eba713e3e77f\") " pod="openstack/nova-metadata-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.000789 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.030280 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.035094 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22b4d\" (UniqueName: \"kubernetes.io/projected/30ce1050-b96d-4fc4-8280-2e6234cf9c9d-kube-api-access-22b4d\") pod \"nova-scheduler-0\" (UID: \"30ce1050-b96d-4fc4-8280-2e6234cf9c9d\") " pod="openstack/nova-scheduler-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.035182 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-tldf6\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.035235 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-tldf6\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.035291 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eaf3a3f-c91a-4012-af55-268aa29869cf-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4eaf3a3f-c91a-4012-af55-268aa29869cf\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.035310 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eaf3a3f-c91a-4012-af55-268aa29869cf-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4eaf3a3f-c91a-4012-af55-268aa29869cf\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.035330 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-config\") pod \"dnsmasq-dns-865f5d856f-tldf6\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.035388 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cvj7\" (UniqueName: \"kubernetes.io/projected/6dd87f99-b9bf-481e-87f4-219e09ca9998-kube-api-access-4cvj7\") pod \"dnsmasq-dns-865f5d856f-tldf6\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.035414 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-tldf6\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.035452 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ce1050-b96d-4fc4-8280-2e6234cf9c9d-config-data\") pod \"nova-scheduler-0\" (UID: \"30ce1050-b96d-4fc4-8280-2e6234cf9c9d\") " pod="openstack/nova-scheduler-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.035521 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwnhd\" (UniqueName: \"kubernetes.io/projected/4eaf3a3f-c91a-4012-af55-268aa29869cf-kube-api-access-fwnhd\") pod \"nova-cell1-novncproxy-0\" (UID: \"4eaf3a3f-c91a-4012-af55-268aa29869cf\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.035550 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-dns-svc\") pod \"dnsmasq-dns-865f5d856f-tldf6\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.035612 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ce1050-b96d-4fc4-8280-2e6234cf9c9d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30ce1050-b96d-4fc4-8280-2e6234cf9c9d\") " pod="openstack/nova-scheduler-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.037450 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-config\") pod \"dnsmasq-dns-865f5d856f-tldf6\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.037491 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-tldf6\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.037491 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-tldf6\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.038034 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-tldf6\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.040183 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-dns-svc\") pod \"dnsmasq-dns-865f5d856f-tldf6\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.042122 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ce1050-b96d-4fc4-8280-2e6234cf9c9d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30ce1050-b96d-4fc4-8280-2e6234cf9c9d\") " pod="openstack/nova-scheduler-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.042140 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ce1050-b96d-4fc4-8280-2e6234cf9c9d-config-data\") pod \"nova-scheduler-0\" (UID: \"30ce1050-b96d-4fc4-8280-2e6234cf9c9d\") " pod="openstack/nova-scheduler-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.059377 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22b4d\" (UniqueName: \"kubernetes.io/projected/30ce1050-b96d-4fc4-8280-2e6234cf9c9d-kube-api-access-22b4d\") pod \"nova-scheduler-0\" (UID: \"30ce1050-b96d-4fc4-8280-2e6234cf9c9d\") " pod="openstack/nova-scheduler-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.065307 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cvj7\" (UniqueName: \"kubernetes.io/projected/6dd87f99-b9bf-481e-87f4-219e09ca9998-kube-api-access-4cvj7\") pod \"dnsmasq-dns-865f5d856f-tldf6\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.129464 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.137819 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eaf3a3f-c91a-4012-af55-268aa29869cf-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4eaf3a3f-c91a-4012-af55-268aa29869cf\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.137855 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eaf3a3f-c91a-4012-af55-268aa29869cf-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4eaf3a3f-c91a-4012-af55-268aa29869cf\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.137935 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwnhd\" (UniqueName: \"kubernetes.io/projected/4eaf3a3f-c91a-4012-af55-268aa29869cf-kube-api-access-fwnhd\") pod \"nova-cell1-novncproxy-0\" (UID: \"4eaf3a3f-c91a-4012-af55-268aa29869cf\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.139489 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.143393 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eaf3a3f-c91a-4012-af55-268aa29869cf-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4eaf3a3f-c91a-4012-af55-268aa29869cf\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.145280 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eaf3a3f-c91a-4012-af55-268aa29869cf-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4eaf3a3f-c91a-4012-af55-268aa29869cf\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.157598 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwnhd\" (UniqueName: \"kubernetes.io/projected/4eaf3a3f-c91a-4012-af55-268aa29869cf-kube-api-access-fwnhd\") pod \"nova-cell1-novncproxy-0\" (UID: \"4eaf3a3f-c91a-4012-af55-268aa29869cf\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.161214 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-tldf6"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.175533 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.296018 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7sn7w"]
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.566547 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 12:34:52 crc kubenswrapper[4843]: W0318 12:34:52.664112 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4eaf3a3f_c91a_4012_af55_268aa29869cf.slice/crio-9970267ff8bf1abc142ced0b3273337f30db091771015a60b685a77fd574bd8c WatchSource:0}: Error finding container 9970267ff8bf1abc142ced0b3273337f30db091771015a60b685a77fd574bd8c: Status 404 returned error can't find the container with id 9970267ff8bf1abc142ced0b3273337f30db091771015a60b685a77fd574bd8c
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.664411 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.705132 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-tldf6"]
Mar 18 12:34:52 crc kubenswrapper[4843]: W0318 12:34:52.717915 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6599567_b95e_4948_a56f_eba713e3e77f.slice/crio-f881eac98ea0a960dad43dff765a88633502ab6a8b998f2614f04ed8e28a5bcb WatchSource:0}: Error finding container f881eac98ea0a960dad43dff765a88633502ab6a8b998f2614f04ed8e28a5bcb: Status 404 returned error can't find the container with id f881eac98ea0a960dad43dff765a88633502ab6a8b998f2614f04ed8e28a5bcb
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.722680 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.730129 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 12:34:52 crc kubenswrapper[4843]: W0318 12:34:52.730321 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dd87f99_b9bf_481e_87f4_219e09ca9998.slice/crio-db6d1b4e0d09efa9814cf3f5145e9ecc6677cf1421f099d45bbe92574af49d4b WatchSource:0}: Error finding container db6d1b4e0d09efa9814cf3f5145e9ecc6677cf1421f099d45bbe92574af49d4b: Status 404 returned error can't find the container with id db6d1b4e0d09efa9814cf3f5145e9ecc6677cf1421f099d45bbe92574af49d4b
Mar 18 12:34:52 crc kubenswrapper[4843]: W0318 12:34:52.730669 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30ce1050_b96d_4fc4_8280_2e6234cf9c9d.slice/crio-7f7e5f44c50a8722fdfb76be5e69ecce5d3a13f9bae3ba9a8c356ba8a547f575 WatchSource:0}: Error finding container 7f7e5f44c50a8722fdfb76be5e69ecce5d3a13f9bae3ba9a8c356ba8a547f575: Status 404 returned error can't find the container with id 7f7e5f44c50a8722fdfb76be5e69ecce5d3a13f9bae3ba9a8c356ba8a547f575
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.745958 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7sn7w" event={"ID":"d981e7da-a5a9-4d42-94f4-f78aefb2a660","Type":"ContainerStarted","Data":"dc622d8f50c34a532c7b1d0b1f69be3eaa1d5bf177a148c1c9d1ab8243b1e839"}
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.746148 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7sn7w" event={"ID":"d981e7da-a5a9-4d42-94f4-f78aefb2a660","Type":"ContainerStarted","Data":"245cb6e2cb40107eda017353ab0a6b5fc3a3080826a33f1217ef3e79318514ea"}
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.748124 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4eaf3a3f-c91a-4012-af55-268aa29869cf","Type":"ContainerStarted","Data":"9970267ff8bf1abc142ced0b3273337f30db091771015a60b685a77fd574bd8c"}
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.758222 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee","Type":"ContainerStarted","Data":"db096a37d398725bfb3172478c0e3a673fb5b2aa5f31db532709add96cd73976"}
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.769011 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-7sn7w" podStartSLOduration=1.768993253 podStartE2EDuration="1.768993253s" podCreationTimestamp="2026-03-18 12:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:52.765283018 +0000 UTC m=+1526.481108542" watchObservedRunningTime="2026-03-18 12:34:52.768993253 +0000 UTC m=+1526.484818777"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.794609 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rkttr"]
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.796278 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rkttr"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.812481 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.813030 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.824889 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rkttr"]
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.859311 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f78c4600-3dac-4168-9aa0-a1be34986ba5-scripts\") pod \"nova-cell1-conductor-db-sync-rkttr\" (UID: \"f78c4600-3dac-4168-9aa0-a1be34986ba5\") " pod="openstack/nova-cell1-conductor-db-sync-rkttr"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.862046 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78c4600-3dac-4168-9aa0-a1be34986ba5-config-data\") pod \"nova-cell1-conductor-db-sync-rkttr\" (UID: \"f78c4600-3dac-4168-9aa0-a1be34986ba5\") " pod="openstack/nova-cell1-conductor-db-sync-rkttr"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.862361 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x628t\" (UniqueName: \"kubernetes.io/projected/f78c4600-3dac-4168-9aa0-a1be34986ba5-kube-api-access-x628t\") pod \"nova-cell1-conductor-db-sync-rkttr\" (UID: \"f78c4600-3dac-4168-9aa0-a1be34986ba5\") " pod="openstack/nova-cell1-conductor-db-sync-rkttr"
Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.862405 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78c4600-3dac-4168-9aa0-a1be34986ba5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rkttr\" (UID: \"f78c4600-3dac-4168-9aa0-a1be34986ba5\") " pod="openstack/nova-cell1-conductor-db-sync-rkttr" Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.966090 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x628t\" (UniqueName: \"kubernetes.io/projected/f78c4600-3dac-4168-9aa0-a1be34986ba5-kube-api-access-x628t\") pod \"nova-cell1-conductor-db-sync-rkttr\" (UID: \"f78c4600-3dac-4168-9aa0-a1be34986ba5\") " pod="openstack/nova-cell1-conductor-db-sync-rkttr" Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.966145 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78c4600-3dac-4168-9aa0-a1be34986ba5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rkttr\" (UID: \"f78c4600-3dac-4168-9aa0-a1be34986ba5\") " pod="openstack/nova-cell1-conductor-db-sync-rkttr" Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.966252 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f78c4600-3dac-4168-9aa0-a1be34986ba5-scripts\") pod \"nova-cell1-conductor-db-sync-rkttr\" (UID: \"f78c4600-3dac-4168-9aa0-a1be34986ba5\") " pod="openstack/nova-cell1-conductor-db-sync-rkttr" Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.966320 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78c4600-3dac-4168-9aa0-a1be34986ba5-config-data\") pod \"nova-cell1-conductor-db-sync-rkttr\" (UID: \"f78c4600-3dac-4168-9aa0-a1be34986ba5\") " pod="openstack/nova-cell1-conductor-db-sync-rkttr" Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.972337 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f78c4600-3dac-4168-9aa0-a1be34986ba5-scripts\") pod \"nova-cell1-conductor-db-sync-rkttr\" (UID: \"f78c4600-3dac-4168-9aa0-a1be34986ba5\") " pod="openstack/nova-cell1-conductor-db-sync-rkttr" Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.972455 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78c4600-3dac-4168-9aa0-a1be34986ba5-config-data\") pod \"nova-cell1-conductor-db-sync-rkttr\" (UID: \"f78c4600-3dac-4168-9aa0-a1be34986ba5\") " pod="openstack/nova-cell1-conductor-db-sync-rkttr" Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.973358 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78c4600-3dac-4168-9aa0-a1be34986ba5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rkttr\" (UID: \"f78c4600-3dac-4168-9aa0-a1be34986ba5\") " pod="openstack/nova-cell1-conductor-db-sync-rkttr" Mar 18 12:34:52 crc kubenswrapper[4843]: I0318 12:34:52.989587 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x628t\" (UniqueName: \"kubernetes.io/projected/f78c4600-3dac-4168-9aa0-a1be34986ba5-kube-api-access-x628t\") pod \"nova-cell1-conductor-db-sync-rkttr\" (UID: \"f78c4600-3dac-4168-9aa0-a1be34986ba5\") " pod="openstack/nova-cell1-conductor-db-sync-rkttr" Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.150111 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rkttr" Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.665357 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.671124 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rkttr"] Mar 18 12:34:53 crc kubenswrapper[4843]: W0318 12:34:53.676422 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf78c4600_3dac_4168_9aa0_a1be34986ba5.slice/crio-067007c631f2d815827c2ee70371fc8439868e6f27ed005c0ddb89b876b486d0 WatchSource:0}: Error finding container 067007c631f2d815827c2ee70371fc8439868e6f27ed005c0ddb89b876b486d0: Status 404 returned error can't find the container with id 067007c631f2d815827c2ee70371fc8439868e6f27ed005c0ddb89b876b486d0 Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.788264 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-sg-core-conf-yaml\") pod \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.788333 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czth8\" (UniqueName: \"kubernetes.io/projected/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-kube-api-access-czth8\") pod \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.788399 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-run-httpd\") pod \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.788433 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-config-data\") pod \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.788491 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-combined-ca-bundle\") pod \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.788607 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-scripts\") pod \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.788715 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-log-httpd\") pod \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\" (UID: \"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f\") " Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.790174 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" (UID: "c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.791184 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" (UID: "c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.795952 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-scripts" (OuterVolumeSpecName: "scripts") pod "c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" (UID: "c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.809631 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-kube-api-access-czth8" (OuterVolumeSpecName: "kube-api-access-czth8") pod "c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" (UID: "c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f"). InnerVolumeSpecName "kube-api-access-czth8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.818198 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30ce1050-b96d-4fc4-8280-2e6234cf9c9d","Type":"ContainerStarted","Data":"7f7e5f44c50a8722fdfb76be5e69ecce5d3a13f9bae3ba9a8c356ba8a547f575"} Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.820698 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d6599567-b95e-4948-a56f-eba713e3e77f","Type":"ContainerStarted","Data":"f881eac98ea0a960dad43dff765a88633502ab6a8b998f2614f04ed8e28a5bcb"} Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.844792 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" (UID: "c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.845727 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rkttr" event={"ID":"f78c4600-3dac-4168-9aa0-a1be34986ba5","Type":"ContainerStarted","Data":"067007c631f2d815827c2ee70371fc8439868e6f27ed005c0ddb89b876b486d0"} Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.855945 4843 generic.go:334] "Generic (PLEG): container finished" podID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerID="60f8cb4a75020c06e17cc0f40b66c0206eacb8e7a7a5acadc004c83992e7dadd" exitCode=0 Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.856040 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f","Type":"ContainerDied","Data":"60f8cb4a75020c06e17cc0f40b66c0206eacb8e7a7a5acadc004c83992e7dadd"} Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.856071 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f","Type":"ContainerDied","Data":"b0290ccc3241f3100d7cfa338599a620a76cd27c8e607d2796a6a1da0d0334d1"} Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.856091 4843 scope.go:117] "RemoveContainer" containerID="2591d1d68ad2ff3882e9f91f045f8e7eda55f9ac54a6986a5e98a08876db7a1d" Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.856097 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.866508 4843 generic.go:334] "Generic (PLEG): container finished" podID="6dd87f99-b9bf-481e-87f4-219e09ca9998" containerID="cfc0c4a5c43405f1e18bf72fe7a35a6ff27187cc2e33cbfe0b6c691f35f0d0bc" exitCode=0 Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.866785 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-tldf6" event={"ID":"6dd87f99-b9bf-481e-87f4-219e09ca9998","Type":"ContainerDied","Data":"cfc0c4a5c43405f1e18bf72fe7a35a6ff27187cc2e33cbfe0b6c691f35f0d0bc"} Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.866830 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-tldf6" event={"ID":"6dd87f99-b9bf-481e-87f4-219e09ca9998","Type":"ContainerStarted","Data":"db6d1b4e0d09efa9814cf3f5145e9ecc6677cf1421f099d45bbe92574af49d4b"} Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.890841 4843 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.890871 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.890880 4843 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.890890 4843 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:53 crc 
kubenswrapper[4843]: I0318 12:34:53.890899 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czth8\" (UniqueName: \"kubernetes.io/projected/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-kube-api-access-czth8\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.897104 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" (UID: "c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.900575 4843 scope.go:117] "RemoveContainer" containerID="746b17b8b8c29b68c9ac367403f45f76cdf14afcedada006b312fa5e60908a57" Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.952748 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-config-data" (OuterVolumeSpecName: "config-data") pod "c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" (UID: "c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.995353 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:53 crc kubenswrapper[4843]: I0318 12:34:53.995377 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.057775 4843 scope.go:117] "RemoveContainer" containerID="60f8cb4a75020c06e17cc0f40b66c0206eacb8e7a7a5acadc004c83992e7dadd" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.085867 4843 scope.go:117] "RemoveContainer" containerID="242bdb078536ad139af43372b4bcff1b3a6e50e688d921467e85214c1c3c5a8b" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.204206 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.214304 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.222517 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:54 crc kubenswrapper[4843]: E0318 12:34:54.223036 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerName="proxy-httpd" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.223053 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerName="proxy-httpd" Mar 18 12:34:54 crc kubenswrapper[4843]: E0318 12:34:54.223065 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerName="ceilometer-notification-agent" Mar 18 
12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.223071 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerName="ceilometer-notification-agent" Mar 18 12:34:54 crc kubenswrapper[4843]: E0318 12:34:54.223110 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerName="ceilometer-central-agent" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.223118 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerName="ceilometer-central-agent" Mar 18 12:34:54 crc kubenswrapper[4843]: E0318 12:34:54.223127 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerName="sg-core" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.223133 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerName="sg-core" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.223354 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerName="ceilometer-central-agent" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.223376 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerName="sg-core" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.223391 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerName="proxy-httpd" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.223401 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" containerName="ceilometer-notification-agent" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.225211 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.226804 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.229287 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.239823 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.248395 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.311328 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-config-data\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.311411 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.311448 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fde046e-d3a2-4c52-840b-b8b5f764b80d-log-httpd\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.311477 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/7fde046e-d3a2-4c52-840b-b8b5f764b80d-run-httpd\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.311498 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.311523 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.311543 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-scripts\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.311608 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bswj8\" (UniqueName: \"kubernetes.io/projected/7fde046e-d3a2-4c52-840b-b8b5f764b80d-kube-api-access-bswj8\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.413777 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-scripts\") pod \"ceilometer-0\" (UID: 
\"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.413915 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bswj8\" (UniqueName: \"kubernetes.io/projected/7fde046e-d3a2-4c52-840b-b8b5f764b80d-kube-api-access-bswj8\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.414003 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-config-data\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.414073 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.414114 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fde046e-d3a2-4c52-840b-b8b5f764b80d-log-httpd\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.414726 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fde046e-d3a2-4c52-840b-b8b5f764b80d-log-httpd\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.414757 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/7fde046e-d3a2-4c52-840b-b8b5f764b80d-run-httpd\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.414800 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fde046e-d3a2-4c52-840b-b8b5f764b80d-run-httpd\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.414918 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.415034 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.418403 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-scripts\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.419106 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 
12:34:54.421924 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.422040 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.429192 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-config-data\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.434490 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bswj8\" (UniqueName: \"kubernetes.io/projected/7fde046e-d3a2-4c52-840b-b8b5f764b80d-kube-api-access-bswj8\") pod \"ceilometer-0\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.548671 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.875361 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rkttr" event={"ID":"f78c4600-3dac-4168-9aa0-a1be34986ba5","Type":"ContainerStarted","Data":"926b6c02a1435f03f9a3e3c7259af931bba559a9e5a902d2674188ca909817cd"} Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.885152 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-tldf6" event={"ID":"6dd87f99-b9bf-481e-87f4-219e09ca9998","Type":"ContainerStarted","Data":"5379aa099ec89e4fd5152c0fbe5ad04b4ed7e88fd9118c862f55f1c0883007a3"} Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.885325 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-tldf6" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.896483 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-rkttr" podStartSLOduration=2.896463486 podStartE2EDuration="2.896463486s" podCreationTimestamp="2026-03-18 12:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:54.891296619 +0000 UTC m=+1528.607122143" watchObservedRunningTime="2026-03-18 12:34:54.896463486 +0000 UTC m=+1528.612289010" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.926487 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-tldf6" podStartSLOduration=3.926466907 podStartE2EDuration="3.926466907s" podCreationTimestamp="2026-03-18 12:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:54.917519933 +0000 UTC m=+1528.633345457" watchObservedRunningTime="2026-03-18 12:34:54.926466907 +0000 UTC 
m=+1528.642292431" Mar 18 12:34:54 crc kubenswrapper[4843]: I0318 12:34:54.994561 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f" path="/var/lib/kubelet/pods/c8d8bc6b-6618-434a-b6ef-7dc30a6bf10f/volumes" Mar 18 12:34:55 crc kubenswrapper[4843]: I0318 12:34:55.102340 4843 scope.go:117] "RemoveContainer" containerID="2591d1d68ad2ff3882e9f91f045f8e7eda55f9ac54a6986a5e98a08876db7a1d" Mar 18 12:34:55 crc kubenswrapper[4843]: E0318 12:34:55.103075 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2591d1d68ad2ff3882e9f91f045f8e7eda55f9ac54a6986a5e98a08876db7a1d\": container with ID starting with 2591d1d68ad2ff3882e9f91f045f8e7eda55f9ac54a6986a5e98a08876db7a1d not found: ID does not exist" containerID="2591d1d68ad2ff3882e9f91f045f8e7eda55f9ac54a6986a5e98a08876db7a1d" Mar 18 12:34:55 crc kubenswrapper[4843]: I0318 12:34:55.103101 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2591d1d68ad2ff3882e9f91f045f8e7eda55f9ac54a6986a5e98a08876db7a1d"} err="failed to get container status \"2591d1d68ad2ff3882e9f91f045f8e7eda55f9ac54a6986a5e98a08876db7a1d\": rpc error: code = NotFound desc = could not find container \"2591d1d68ad2ff3882e9f91f045f8e7eda55f9ac54a6986a5e98a08876db7a1d\": container with ID starting with 2591d1d68ad2ff3882e9f91f045f8e7eda55f9ac54a6986a5e98a08876db7a1d not found: ID does not exist" Mar 18 12:34:55 crc kubenswrapper[4843]: I0318 12:34:55.103121 4843 scope.go:117] "RemoveContainer" containerID="746b17b8b8c29b68c9ac367403f45f76cdf14afcedada006b312fa5e60908a57" Mar 18 12:34:55 crc kubenswrapper[4843]: E0318 12:34:55.103441 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"746b17b8b8c29b68c9ac367403f45f76cdf14afcedada006b312fa5e60908a57\": container with ID starting with 
746b17b8b8c29b68c9ac367403f45f76cdf14afcedada006b312fa5e60908a57 not found: ID does not exist" containerID="746b17b8b8c29b68c9ac367403f45f76cdf14afcedada006b312fa5e60908a57" Mar 18 12:34:55 crc kubenswrapper[4843]: I0318 12:34:55.103484 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"746b17b8b8c29b68c9ac367403f45f76cdf14afcedada006b312fa5e60908a57"} err="failed to get container status \"746b17b8b8c29b68c9ac367403f45f76cdf14afcedada006b312fa5e60908a57\": rpc error: code = NotFound desc = could not find container \"746b17b8b8c29b68c9ac367403f45f76cdf14afcedada006b312fa5e60908a57\": container with ID starting with 746b17b8b8c29b68c9ac367403f45f76cdf14afcedada006b312fa5e60908a57 not found: ID does not exist" Mar 18 12:34:55 crc kubenswrapper[4843]: I0318 12:34:55.103500 4843 scope.go:117] "RemoveContainer" containerID="60f8cb4a75020c06e17cc0f40b66c0206eacb8e7a7a5acadc004c83992e7dadd" Mar 18 12:34:55 crc kubenswrapper[4843]: E0318 12:34:55.103754 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60f8cb4a75020c06e17cc0f40b66c0206eacb8e7a7a5acadc004c83992e7dadd\": container with ID starting with 60f8cb4a75020c06e17cc0f40b66c0206eacb8e7a7a5acadc004c83992e7dadd not found: ID does not exist" containerID="60f8cb4a75020c06e17cc0f40b66c0206eacb8e7a7a5acadc004c83992e7dadd" Mar 18 12:34:55 crc kubenswrapper[4843]: I0318 12:34:55.103800 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f8cb4a75020c06e17cc0f40b66c0206eacb8e7a7a5acadc004c83992e7dadd"} err="failed to get container status \"60f8cb4a75020c06e17cc0f40b66c0206eacb8e7a7a5acadc004c83992e7dadd\": rpc error: code = NotFound desc = could not find container \"60f8cb4a75020c06e17cc0f40b66c0206eacb8e7a7a5acadc004c83992e7dadd\": container with ID starting with 60f8cb4a75020c06e17cc0f40b66c0206eacb8e7a7a5acadc004c83992e7dadd not found: ID does not 
exist" Mar 18 12:34:55 crc kubenswrapper[4843]: I0318 12:34:55.103831 4843 scope.go:117] "RemoveContainer" containerID="242bdb078536ad139af43372b4bcff1b3a6e50e688d921467e85214c1c3c5a8b" Mar 18 12:34:55 crc kubenswrapper[4843]: E0318 12:34:55.104235 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"242bdb078536ad139af43372b4bcff1b3a6e50e688d921467e85214c1c3c5a8b\": container with ID starting with 242bdb078536ad139af43372b4bcff1b3a6e50e688d921467e85214c1c3c5a8b not found: ID does not exist" containerID="242bdb078536ad139af43372b4bcff1b3a6e50e688d921467e85214c1c3c5a8b" Mar 18 12:34:55 crc kubenswrapper[4843]: I0318 12:34:55.104258 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"242bdb078536ad139af43372b4bcff1b3a6e50e688d921467e85214c1c3c5a8b"} err="failed to get container status \"242bdb078536ad139af43372b4bcff1b3a6e50e688d921467e85214c1c3c5a8b\": rpc error: code = NotFound desc = could not find container \"242bdb078536ad139af43372b4bcff1b3a6e50e688d921467e85214c1c3c5a8b\": container with ID starting with 242bdb078536ad139af43372b4bcff1b3a6e50e688d921467e85214c1c3c5a8b not found: ID does not exist" Mar 18 12:34:55 crc kubenswrapper[4843]: I0318 12:34:55.833802 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:34:55 crc kubenswrapper[4843]: I0318 12:34:55.848533 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:34:56 crc kubenswrapper[4843]: I0318 12:34:56.825311 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:56 crc kubenswrapper[4843]: I0318 12:34:56.907897 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"30ce1050-b96d-4fc4-8280-2e6234cf9c9d","Type":"ContainerStarted","Data":"bbe16b5e9c7bca3eb6e0930b07a48877ea401ae7739c1701256a2654637a0f32"} Mar 18 12:34:56 crc kubenswrapper[4843]: I0318 12:34:56.909956 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4eaf3a3f-c91a-4012-af55-268aa29869cf","Type":"ContainerStarted","Data":"ccfdca08b7285b390c986f2959ce5f38c57e9ab0692e98dbc77796d676ecb2a5"} Mar 18 12:34:56 crc kubenswrapper[4843]: I0318 12:34:56.910061 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="4eaf3a3f-c91a-4012-af55-268aa29869cf" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://ccfdca08b7285b390c986f2959ce5f38c57e9ab0692e98dbc77796d676ecb2a5" gracePeriod=30 Mar 18 12:34:56 crc kubenswrapper[4843]: I0318 12:34:56.912490 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee","Type":"ContainerStarted","Data":"c1064949e8bf1de76fbcc70d7964e1b7063f315c50d491369bc4899f0b3fbb60"} Mar 18 12:34:56 crc kubenswrapper[4843]: I0318 12:34:56.912621 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee","Type":"ContainerStarted","Data":"86cfdb594d96f7083c8aabe9012dd6f77f579aaf593d43ae9fb66b72770880db"} Mar 18 12:34:56 crc kubenswrapper[4843]: I0318 12:34:56.913813 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fde046e-d3a2-4c52-840b-b8b5f764b80d","Type":"ContainerStarted","Data":"c718e3295a9da16dac456a040b82b9016ce3ec9f308cb52bd5fee6d1e1a2ef0e"} Mar 18 12:34:56 crc kubenswrapper[4843]: I0318 12:34:56.915440 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"d6599567-b95e-4948-a56f-eba713e3e77f","Type":"ContainerStarted","Data":"2d17a0ffa42bf328f6a4f1470ea085f52ee0a3dcb434ffb5f1dc382919457f65"} Mar 18 12:34:56 crc kubenswrapper[4843]: I0318 12:34:56.915479 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d6599567-b95e-4948-a56f-eba713e3e77f","Type":"ContainerStarted","Data":"2cd65047acbf4c25ddaaac1d45399c406950d334f21266de3e64579537b5290e"} Mar 18 12:34:56 crc kubenswrapper[4843]: I0318 12:34:56.915540 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d6599567-b95e-4948-a56f-eba713e3e77f" containerName="nova-metadata-log" containerID="cri-o://2cd65047acbf4c25ddaaac1d45399c406950d334f21266de3e64579537b5290e" gracePeriod=30 Mar 18 12:34:56 crc kubenswrapper[4843]: I0318 12:34:56.915581 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d6599567-b95e-4948-a56f-eba713e3e77f" containerName="nova-metadata-metadata" containerID="cri-o://2d17a0ffa42bf328f6a4f1470ea085f52ee0a3dcb434ffb5f1dc382919457f65" gracePeriod=30 Mar 18 12:34:56 crc kubenswrapper[4843]: I0318 12:34:56.932197 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.50797049 podStartE2EDuration="5.932166113s" podCreationTimestamp="2026-03-18 12:34:51 +0000 UTC" firstStartedPulling="2026-03-18 12:34:52.734213176 +0000 UTC m=+1526.450038700" lastFinishedPulling="2026-03-18 12:34:56.158408799 +0000 UTC m=+1529.874234323" observedRunningTime="2026-03-18 12:34:56.923317962 +0000 UTC m=+1530.639143486" watchObservedRunningTime="2026-03-18 12:34:56.932166113 +0000 UTC m=+1530.647991637" Mar 18 12:34:56 crc kubenswrapper[4843]: I0318 12:34:56.945961 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.452126425 
podStartE2EDuration="5.945938104s" podCreationTimestamp="2026-03-18 12:34:51 +0000 UTC" firstStartedPulling="2026-03-18 12:34:52.666121963 +0000 UTC m=+1526.381947487" lastFinishedPulling="2026-03-18 12:34:56.159933642 +0000 UTC m=+1529.875759166" observedRunningTime="2026-03-18 12:34:56.941047925 +0000 UTC m=+1530.656873459" watchObservedRunningTime="2026-03-18 12:34:56.945938104 +0000 UTC m=+1530.661763628" Mar 18 12:34:56 crc kubenswrapper[4843]: I0318 12:34:56.968513 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.191595959 podStartE2EDuration="5.968491194s" podCreationTimestamp="2026-03-18 12:34:51 +0000 UTC" firstStartedPulling="2026-03-18 12:34:52.581273984 +0000 UTC m=+1526.297099508" lastFinishedPulling="2026-03-18 12:34:56.358169219 +0000 UTC m=+1530.073994743" observedRunningTime="2026-03-18 12:34:56.963153602 +0000 UTC m=+1530.678979126" watchObservedRunningTime="2026-03-18 12:34:56.968491194 +0000 UTC m=+1530.684316728" Mar 18 12:34:56 crc kubenswrapper[4843]: I0318 12:34:56.987773 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.425699295 podStartE2EDuration="5.987749901s" podCreationTimestamp="2026-03-18 12:34:51 +0000 UTC" firstStartedPulling="2026-03-18 12:34:52.733055433 +0000 UTC m=+1526.448880957" lastFinishedPulling="2026-03-18 12:34:56.295106039 +0000 UTC m=+1530.010931563" observedRunningTime="2026-03-18 12:34:56.983000496 +0000 UTC m=+1530.698826020" watchObservedRunningTime="2026-03-18 12:34:56.987749901 +0000 UTC m=+1530.703575425" Mar 18 12:34:57 crc kubenswrapper[4843]: I0318 12:34:57.129763 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 12:34:57 crc kubenswrapper[4843]: I0318 12:34:57.178125 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:57 crc 
kubenswrapper[4843]: I0318 12:34:57.928572 4843 generic.go:334] "Generic (PLEG): container finished" podID="d6599567-b95e-4948-a56f-eba713e3e77f" containerID="2cd65047acbf4c25ddaaac1d45399c406950d334f21266de3e64579537b5290e" exitCode=143 Mar 18 12:34:57 crc kubenswrapper[4843]: I0318 12:34:57.928727 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d6599567-b95e-4948-a56f-eba713e3e77f","Type":"ContainerDied","Data":"2cd65047acbf4c25ddaaac1d45399c406950d334f21266de3e64579537b5290e"} Mar 18 12:34:57 crc kubenswrapper[4843]: I0318 12:34:57.931514 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fde046e-d3a2-4c52-840b-b8b5f764b80d","Type":"ContainerStarted","Data":"ec38f802436989d59df83382131616b2862faef36cbee7f7cbecca57b7c70b4c"} Mar 18 12:34:59 crc kubenswrapper[4843]: I0318 12:34:59.011537 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fde046e-d3a2-4c52-840b-b8b5f764b80d","Type":"ContainerStarted","Data":"89ae07b1f2958ad89dd7f8c6ff71f9b8018bc22ff6163fc4b4fe84d85b27d783"} Mar 18 12:35:00 crc kubenswrapper[4843]: I0318 12:35:00.029175 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 12:35:00 crc kubenswrapper[4843]: I0318 12:35:00.440404 4843 scope.go:117] "RemoveContainer" containerID="659087b443798bf81f6607e29bfccfe43ff01fd2628d51de2aa86bb5afbc54f4" Mar 18 12:35:01 crc kubenswrapper[4843]: I0318 12:35:01.059011 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fde046e-d3a2-4c52-840b-b8b5f764b80d","Type":"ContainerStarted","Data":"9377516e936f59e97b16d09a1fb7a9e745359e4e2027e8a2df21ee17dede124e"} Mar 18 12:35:02 crc kubenswrapper[4843]: I0318 12:35:02.031761 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:35:02 crc kubenswrapper[4843]: I0318 
12:35:02.031837 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:35:02 crc kubenswrapper[4843]: I0318 12:35:02.078664 4843 generic.go:334] "Generic (PLEG): container finished" podID="d981e7da-a5a9-4d42-94f4-f78aefb2a660" containerID="dc622d8f50c34a532c7b1d0b1f69be3eaa1d5bf177a148c1c9d1ab8243b1e839" exitCode=0 Mar 18 12:35:02 crc kubenswrapper[4843]: I0318 12:35:02.078707 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7sn7w" event={"ID":"d981e7da-a5a9-4d42-94f4-f78aefb2a660","Type":"ContainerDied","Data":"dc622d8f50c34a532c7b1d0b1f69be3eaa1d5bf177a148c1c9d1ab8243b1e839"} Mar 18 12:35:02 crc kubenswrapper[4843]: I0318 12:35:02.130558 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 12:35:02 crc kubenswrapper[4843]: I0318 12:35:02.157958 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 12:35:02 crc kubenswrapper[4843]: I0318 12:35:02.162893 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-tldf6" Mar 18 12:35:02 crc kubenswrapper[4843]: I0318 12:35:02.246356 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-nf9gt"] Mar 18 12:35:02 crc kubenswrapper[4843]: I0318 12:35:02.246599 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" podUID="38dcade5-06ff-4d7c-aa4a-d334adf77bd8" containerName="dnsmasq-dns" containerID="cri-o://18db89212e8107d0e2020f9c0ad7557bee08088b25e3ff967885ecb9400e022a" gracePeriod=10 Mar 18 12:35:02 crc kubenswrapper[4843]: I0318 12:35:02.822956 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:35:02 crc kubenswrapper[4843]: I0318 12:35:02.984571 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-ovsdbserver-nb\") pod \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " Mar 18 12:35:02 crc kubenswrapper[4843]: I0318 12:35:02.984724 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-dns-svc\") pod \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " Mar 18 12:35:02 crc kubenswrapper[4843]: I0318 12:35:02.984764 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-ovsdbserver-sb\") pod \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " Mar 18 12:35:02 crc kubenswrapper[4843]: I0318 12:35:02.984793 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-dns-swift-storage-0\") pod \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " Mar 18 12:35:02 crc kubenswrapper[4843]: I0318 12:35:02.984948 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwbwm\" (UniqueName: \"kubernetes.io/projected/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-kube-api-access-cwbwm\") pod \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " Mar 18 12:35:02 crc kubenswrapper[4843]: I0318 12:35:02.985025 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-config\") pod \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\" (UID: \"38dcade5-06ff-4d7c-aa4a-d334adf77bd8\") " Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.007368 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-kube-api-access-cwbwm" (OuterVolumeSpecName: "kube-api-access-cwbwm") pod "38dcade5-06ff-4d7c-aa4a-d334adf77bd8" (UID: "38dcade5-06ff-4d7c-aa4a-d334adf77bd8"). InnerVolumeSpecName "kube-api-access-cwbwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.038745 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "38dcade5-06ff-4d7c-aa4a-d334adf77bd8" (UID: "38dcade5-06ff-4d7c-aa4a-d334adf77bd8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.043338 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "38dcade5-06ff-4d7c-aa4a-d334adf77bd8" (UID: "38dcade5-06ff-4d7c-aa4a-d334adf77bd8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.045393 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "38dcade5-06ff-4d7c-aa4a-d334adf77bd8" (UID: "38dcade5-06ff-4d7c-aa4a-d334adf77bd8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.056919 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "38dcade5-06ff-4d7c-aa4a-d334adf77bd8" (UID: "38dcade5-06ff-4d7c-aa4a-d334adf77bd8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.058212 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-config" (OuterVolumeSpecName: "config") pod "38dcade5-06ff-4d7c-aa4a-d334adf77bd8" (UID: "38dcade5-06ff-4d7c-aa4a-d334adf77bd8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.090763 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.092208 4843 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.092222 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.092234 4843 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:03 crc 
kubenswrapper[4843]: I0318 12:35:03.092250 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwbwm\" (UniqueName: \"kubernetes.io/projected/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-kube-api-access-cwbwm\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.092262 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38dcade5-06ff-4d7c-aa4a-d334adf77bd8-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.108736 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fde046e-d3a2-4c52-840b-b8b5f764b80d","Type":"ContainerStarted","Data":"3a144c653fccb6260a3a8aa40e8ebe83aa0c58e42459c735424e882708f397fe"} Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.108949 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.114036 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.114132 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.126992 4843 generic.go:334] "Generic (PLEG): container finished" podID="f78c4600-3dac-4168-9aa0-a1be34986ba5" containerID="926b6c02a1435f03f9a3e3c7259af931bba559a9e5a902d2674188ca909817cd" exitCode=0 Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 
12:35:03.127067 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rkttr" event={"ID":"f78c4600-3dac-4168-9aa0-a1be34986ba5","Type":"ContainerDied","Data":"926b6c02a1435f03f9a3e3c7259af931bba559a9e5a902d2674188ca909817cd"} Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.129844 4843 generic.go:334] "Generic (PLEG): container finished" podID="38dcade5-06ff-4d7c-aa4a-d334adf77bd8" containerID="18db89212e8107d0e2020f9c0ad7557bee08088b25e3ff967885ecb9400e022a" exitCode=0 Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.131050 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.135870 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" event={"ID":"38dcade5-06ff-4d7c-aa4a-d334adf77bd8","Type":"ContainerDied","Data":"18db89212e8107d0e2020f9c0ad7557bee08088b25e3ff967885ecb9400e022a"} Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.136030 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-nf9gt" event={"ID":"38dcade5-06ff-4d7c-aa4a-d334adf77bd8","Type":"ContainerDied","Data":"8137e04b19a1301dd6f6c76206915a03538024e6062dd9868e4078744edbade4"} Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.136136 4843 scope.go:117] "RemoveContainer" containerID="18db89212e8107d0e2020f9c0ad7557bee08088b25e3ff967885ecb9400e022a" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.136446 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.53975329 podStartE2EDuration="9.136432264s" podCreationTimestamp="2026-03-18 12:34:54 +0000 UTC" firstStartedPulling="2026-03-18 12:34:56.830338412 +0000 UTC m=+1530.546163936" lastFinishedPulling="2026-03-18 12:35:02.427017386 +0000 UTC m=+1536.142842910" observedRunningTime="2026-03-18 
12:35:03.135149308 +0000 UTC m=+1536.850974832" watchObservedRunningTime="2026-03-18 12:35:03.136432264 +0000 UTC m=+1536.852257788" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.171409 4843 scope.go:117] "RemoveContainer" containerID="24299ea65c850ad9d34b9a26e667c5861301df289650f33e929ab27e20f4b900" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.172880 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.191807 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-nf9gt"] Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.212379 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-nf9gt"] Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.215206 4843 scope.go:117] "RemoveContainer" containerID="18db89212e8107d0e2020f9c0ad7557bee08088b25e3ff967885ecb9400e022a" Mar 18 12:35:03 crc kubenswrapper[4843]: E0318 12:35:03.216638 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18db89212e8107d0e2020f9c0ad7557bee08088b25e3ff967885ecb9400e022a\": container with ID starting with 18db89212e8107d0e2020f9c0ad7557bee08088b25e3ff967885ecb9400e022a not found: ID does not exist" containerID="18db89212e8107d0e2020f9c0ad7557bee08088b25e3ff967885ecb9400e022a" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.216774 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18db89212e8107d0e2020f9c0ad7557bee08088b25e3ff967885ecb9400e022a"} err="failed to get container status \"18db89212e8107d0e2020f9c0ad7557bee08088b25e3ff967885ecb9400e022a\": rpc error: code = NotFound desc = could not find container \"18db89212e8107d0e2020f9c0ad7557bee08088b25e3ff967885ecb9400e022a\": container with ID starting with 
18db89212e8107d0e2020f9c0ad7557bee08088b25e3ff967885ecb9400e022a not found: ID does not exist" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.216881 4843 scope.go:117] "RemoveContainer" containerID="24299ea65c850ad9d34b9a26e667c5861301df289650f33e929ab27e20f4b900" Mar 18 12:35:03 crc kubenswrapper[4843]: E0318 12:35:03.217243 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24299ea65c850ad9d34b9a26e667c5861301df289650f33e929ab27e20f4b900\": container with ID starting with 24299ea65c850ad9d34b9a26e667c5861301df289650f33e929ab27e20f4b900 not found: ID does not exist" containerID="24299ea65c850ad9d34b9a26e667c5861301df289650f33e929ab27e20f4b900" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.217264 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24299ea65c850ad9d34b9a26e667c5861301df289650f33e929ab27e20f4b900"} err="failed to get container status \"24299ea65c850ad9d34b9a26e667c5861301df289650f33e929ab27e20f4b900\": rpc error: code = NotFound desc = could not find container \"24299ea65c850ad9d34b9a26e667c5861301df289650f33e929ab27e20f4b900\": container with ID starting with 24299ea65c850ad9d34b9a26e667c5861301df289650f33e929ab27e20f4b900 not found: ID does not exist" Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.637123 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7sn7w"
Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.644374 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d981e7da-a5a9-4d42-94f4-f78aefb2a660-scripts\") pod \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\" (UID: \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\") "
Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.644477 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pklc\" (UniqueName: \"kubernetes.io/projected/d981e7da-a5a9-4d42-94f4-f78aefb2a660-kube-api-access-6pklc\") pod \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\" (UID: \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\") "
Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.644648 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d981e7da-a5a9-4d42-94f4-f78aefb2a660-config-data\") pod \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\" (UID: \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\") "
Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.644830 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d981e7da-a5a9-4d42-94f4-f78aefb2a660-combined-ca-bundle\") pod \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\" (UID: \"d981e7da-a5a9-4d42-94f4-f78aefb2a660\") "
Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.651810 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d981e7da-a5a9-4d42-94f4-f78aefb2a660-kube-api-access-6pklc" (OuterVolumeSpecName: "kube-api-access-6pklc") pod "d981e7da-a5a9-4d42-94f4-f78aefb2a660" (UID: "d981e7da-a5a9-4d42-94f4-f78aefb2a660"). InnerVolumeSpecName "kube-api-access-6pklc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.655576 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d981e7da-a5a9-4d42-94f4-f78aefb2a660-scripts" (OuterVolumeSpecName: "scripts") pod "d981e7da-a5a9-4d42-94f4-f78aefb2a660" (UID: "d981e7da-a5a9-4d42-94f4-f78aefb2a660"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.709686 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d981e7da-a5a9-4d42-94f4-f78aefb2a660-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d981e7da-a5a9-4d42-94f4-f78aefb2a660" (UID: "d981e7da-a5a9-4d42-94f4-f78aefb2a660"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.717486 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d981e7da-a5a9-4d42-94f4-f78aefb2a660-config-data" (OuterVolumeSpecName: "config-data") pod "d981e7da-a5a9-4d42-94f4-f78aefb2a660" (UID: "d981e7da-a5a9-4d42-94f4-f78aefb2a660"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.746721 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d981e7da-a5a9-4d42-94f4-f78aefb2a660-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.746857 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d981e7da-a5a9-4d42-94f4-f78aefb2a660-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.746927 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pklc\" (UniqueName: \"kubernetes.io/projected/d981e7da-a5a9-4d42-94f4-f78aefb2a660-kube-api-access-6pklc\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:03 crc kubenswrapper[4843]: I0318 12:35:03.746994 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d981e7da-a5a9-4d42-94f4-f78aefb2a660-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.148986 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7sn7w" event={"ID":"d981e7da-a5a9-4d42-94f4-f78aefb2a660","Type":"ContainerDied","Data":"245cb6e2cb40107eda017353ab0a6b5fc3a3080826a33f1217ef3e79318514ea"}
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.149234 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="245cb6e2cb40107eda017353ab0a6b5fc3a3080826a33f1217ef3e79318514ea"
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.149001 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7sn7w"
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.358087 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.358313 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee" containerName="nova-api-log" containerID="cri-o://86cfdb594d96f7083c8aabe9012dd6f77f579aaf593d43ae9fb66b72770880db" gracePeriod=30
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.358853 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee" containerName="nova-api-api" containerID="cri-o://c1064949e8bf1de76fbcc70d7964e1b7063f315c50d491369bc4899f0b3fbb60" gracePeriod=30
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.649160 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rkttr"
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.654610 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.673025 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78c4600-3dac-4168-9aa0-a1be34986ba5-combined-ca-bundle\") pod \"f78c4600-3dac-4168-9aa0-a1be34986ba5\" (UID: \"f78c4600-3dac-4168-9aa0-a1be34986ba5\") "
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.673144 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78c4600-3dac-4168-9aa0-a1be34986ba5-config-data\") pod \"f78c4600-3dac-4168-9aa0-a1be34986ba5\" (UID: \"f78c4600-3dac-4168-9aa0-a1be34986ba5\") "
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.673285 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x628t\" (UniqueName: \"kubernetes.io/projected/f78c4600-3dac-4168-9aa0-a1be34986ba5-kube-api-access-x628t\") pod \"f78c4600-3dac-4168-9aa0-a1be34986ba5\" (UID: \"f78c4600-3dac-4168-9aa0-a1be34986ba5\") "
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.673311 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f78c4600-3dac-4168-9aa0-a1be34986ba5-scripts\") pod \"f78c4600-3dac-4168-9aa0-a1be34986ba5\" (UID: \"f78c4600-3dac-4168-9aa0-a1be34986ba5\") "
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.678305 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f78c4600-3dac-4168-9aa0-a1be34986ba5-kube-api-access-x628t" (OuterVolumeSpecName: "kube-api-access-x628t") pod "f78c4600-3dac-4168-9aa0-a1be34986ba5" (UID: "f78c4600-3dac-4168-9aa0-a1be34986ba5"). InnerVolumeSpecName "kube-api-access-x628t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.691740 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78c4600-3dac-4168-9aa0-a1be34986ba5-scripts" (OuterVolumeSpecName: "scripts") pod "f78c4600-3dac-4168-9aa0-a1be34986ba5" (UID: "f78c4600-3dac-4168-9aa0-a1be34986ba5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.708901 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78c4600-3dac-4168-9aa0-a1be34986ba5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f78c4600-3dac-4168-9aa0-a1be34986ba5" (UID: "f78c4600-3dac-4168-9aa0-a1be34986ba5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.879097 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78c4600-3dac-4168-9aa0-a1be34986ba5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.879126 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x628t\" (UniqueName: \"kubernetes.io/projected/f78c4600-3dac-4168-9aa0-a1be34986ba5-kube-api-access-x628t\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.879139 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f78c4600-3dac-4168-9aa0-a1be34986ba5-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.893665 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78c4600-3dac-4168-9aa0-a1be34986ba5-config-data" (OuterVolumeSpecName: "config-data") pod "f78c4600-3dac-4168-9aa0-a1be34986ba5" (UID: "f78c4600-3dac-4168-9aa0-a1be34986ba5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.980519 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78c4600-3dac-4168-9aa0-a1be34986ba5-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:04 crc kubenswrapper[4843]: I0318 12:35:04.994068 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38dcade5-06ff-4d7c-aa4a-d334adf77bd8" path="/var/lib/kubelet/pods/38dcade5-06ff-4d7c-aa4a-d334adf77bd8/volumes"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.165250 4843 generic.go:334] "Generic (PLEG): container finished" podID="8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee" containerID="86cfdb594d96f7083c8aabe9012dd6f77f579aaf593d43ae9fb66b72770880db" exitCode=143
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.165570 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee","Type":"ContainerDied","Data":"86cfdb594d96f7083c8aabe9012dd6f77f579aaf593d43ae9fb66b72770880db"}
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.169687 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="30ce1050-b96d-4fc4-8280-2e6234cf9c9d" containerName="nova-scheduler-scheduler" containerID="cri-o://bbe16b5e9c7bca3eb6e0930b07a48877ea401ae7739c1701256a2654637a0f32" gracePeriod=30
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.170423 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rkttr"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.170630 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rkttr" event={"ID":"f78c4600-3dac-4168-9aa0-a1be34986ba5","Type":"ContainerDied","Data":"067007c631f2d815827c2ee70371fc8439868e6f27ed005c0ddb89b876b486d0"}
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.170756 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="067007c631f2d815827c2ee70371fc8439868e6f27ed005c0ddb89b876b486d0"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.249119 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 18 12:35:05 crc kubenswrapper[4843]: E0318 12:35:05.249579 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38dcade5-06ff-4d7c-aa4a-d334adf77bd8" containerName="dnsmasq-dns"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.249603 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="38dcade5-06ff-4d7c-aa4a-d334adf77bd8" containerName="dnsmasq-dns"
Mar 18 12:35:05 crc kubenswrapper[4843]: E0318 12:35:05.249621 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c4600-3dac-4168-9aa0-a1be34986ba5" containerName="nova-cell1-conductor-db-sync"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.249631 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c4600-3dac-4168-9aa0-a1be34986ba5" containerName="nova-cell1-conductor-db-sync"
Mar 18 12:35:05 crc kubenswrapper[4843]: E0318 12:35:05.249681 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38dcade5-06ff-4d7c-aa4a-d334adf77bd8" containerName="init"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.249690 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="38dcade5-06ff-4d7c-aa4a-d334adf77bd8" containerName="init"
Mar 18 12:35:05 crc kubenswrapper[4843]: E0318 12:35:05.249700 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d981e7da-a5a9-4d42-94f4-f78aefb2a660" containerName="nova-manage"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.249708 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d981e7da-a5a9-4d42-94f4-f78aefb2a660" containerName="nova-manage"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.249882 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="38dcade5-06ff-4d7c-aa4a-d334adf77bd8" containerName="dnsmasq-dns"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.249896 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c4600-3dac-4168-9aa0-a1be34986ba5" containerName="nova-cell1-conductor-db-sync"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.249913 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d981e7da-a5a9-4d42-94f4-f78aefb2a660" containerName="nova-manage"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.250477 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.254012 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.265036 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.390767 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285d4562-f78e-4e15-8802-ad07d22a1e95-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"285d4562-f78e-4e15-8802-ad07d22a1e95\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.390913 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285d4562-f78e-4e15-8802-ad07d22a1e95-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"285d4562-f78e-4e15-8802-ad07d22a1e95\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.390982 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgkfp\" (UniqueName: \"kubernetes.io/projected/285d4562-f78e-4e15-8802-ad07d22a1e95-kube-api-access-zgkfp\") pod \"nova-cell1-conductor-0\" (UID: \"285d4562-f78e-4e15-8802-ad07d22a1e95\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.492339 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285d4562-f78e-4e15-8802-ad07d22a1e95-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"285d4562-f78e-4e15-8802-ad07d22a1e95\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.492447 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgkfp\" (UniqueName: \"kubernetes.io/projected/285d4562-f78e-4e15-8802-ad07d22a1e95-kube-api-access-zgkfp\") pod \"nova-cell1-conductor-0\" (UID: \"285d4562-f78e-4e15-8802-ad07d22a1e95\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.492483 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285d4562-f78e-4e15-8802-ad07d22a1e95-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"285d4562-f78e-4e15-8802-ad07d22a1e95\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.506714 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285d4562-f78e-4e15-8802-ad07d22a1e95-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"285d4562-f78e-4e15-8802-ad07d22a1e95\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.506825 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285d4562-f78e-4e15-8802-ad07d22a1e95-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"285d4562-f78e-4e15-8802-ad07d22a1e95\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.532159 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgkfp\" (UniqueName: \"kubernetes.io/projected/285d4562-f78e-4e15-8802-ad07d22a1e95-kube-api-access-zgkfp\") pod \"nova-cell1-conductor-0\" (UID: \"285d4562-f78e-4e15-8802-ad07d22a1e95\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 12:35:05 crc kubenswrapper[4843]: I0318 12:35:05.569785 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 18 12:35:06 crc kubenswrapper[4843]: I0318 12:35:06.057700 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 18 12:35:06 crc kubenswrapper[4843]: I0318 12:35:06.179495 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"285d4562-f78e-4e15-8802-ad07d22a1e95","Type":"ContainerStarted","Data":"234d0ccb13eb001f21c11d46613fd27a595e3f5b63bbc81f5fa83b0ac7eb2913"}
Mar 18 12:35:07 crc kubenswrapper[4843]: E0318 12:35:07.272199 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bbe16b5e9c7bca3eb6e0930b07a48877ea401ae7739c1701256a2654637a0f32" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 18 12:35:07 crc kubenswrapper[4843]: E0318 12:35:07.276354 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bbe16b5e9c7bca3eb6e0930b07a48877ea401ae7739c1701256a2654637a0f32" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 18 12:35:07 crc kubenswrapper[4843]: E0318 12:35:07.278676 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bbe16b5e9c7bca3eb6e0930b07a48877ea401ae7739c1701256a2654637a0f32" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 18 12:35:07 crc kubenswrapper[4843]: E0318 12:35:07.278719 4843 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="30ce1050-b96d-4fc4-8280-2e6234cf9c9d" containerName="nova-scheduler-scheduler"
Mar 18 12:35:07 crc kubenswrapper[4843]: I0318 12:35:07.280330 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"285d4562-f78e-4e15-8802-ad07d22a1e95","Type":"ContainerStarted","Data":"0bc8265ce5213f1477311b4b84f810ca3ab29ad8ae5a718bae58f8b53cf5dc1d"}
Mar 18 12:35:07 crc kubenswrapper[4843]: I0318 12:35:07.281329 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 18 12:35:07 crc kubenswrapper[4843]: I0318 12:35:07.307887 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.307866389 podStartE2EDuration="2.307866389s" podCreationTimestamp="2026-03-18 12:35:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:07.304376139 +0000 UTC m=+1541.020201673" watchObservedRunningTime="2026-03-18 12:35:07.307866389 +0000 UTC m=+1541.023691913"
Mar 18 12:35:08 crc kubenswrapper[4843]: I0318 12:35:08.782629 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 12:35:08 crc kubenswrapper[4843]: I0318 12:35:08.828315 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ce1050-b96d-4fc4-8280-2e6234cf9c9d-combined-ca-bundle\") pod \"30ce1050-b96d-4fc4-8280-2e6234cf9c9d\" (UID: \"30ce1050-b96d-4fc4-8280-2e6234cf9c9d\") "
Mar 18 12:35:08 crc kubenswrapper[4843]: I0318 12:35:08.828367 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22b4d\" (UniqueName: \"kubernetes.io/projected/30ce1050-b96d-4fc4-8280-2e6234cf9c9d-kube-api-access-22b4d\") pod \"30ce1050-b96d-4fc4-8280-2e6234cf9c9d\" (UID: \"30ce1050-b96d-4fc4-8280-2e6234cf9c9d\") "
Mar 18 12:35:08 crc kubenswrapper[4843]: I0318 12:35:08.828529 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ce1050-b96d-4fc4-8280-2e6234cf9c9d-config-data\") pod \"30ce1050-b96d-4fc4-8280-2e6234cf9c9d\" (UID: \"30ce1050-b96d-4fc4-8280-2e6234cf9c9d\") "
Mar 18 12:35:08 crc kubenswrapper[4843]: I0318 12:35:08.837169 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30ce1050-b96d-4fc4-8280-2e6234cf9c9d-kube-api-access-22b4d" (OuterVolumeSpecName: "kube-api-access-22b4d") pod "30ce1050-b96d-4fc4-8280-2e6234cf9c9d" (UID: "30ce1050-b96d-4fc4-8280-2e6234cf9c9d"). InnerVolumeSpecName "kube-api-access-22b4d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:35:08 crc kubenswrapper[4843]: I0318 12:35:08.862854 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ce1050-b96d-4fc4-8280-2e6234cf9c9d-config-data" (OuterVolumeSpecName: "config-data") pod "30ce1050-b96d-4fc4-8280-2e6234cf9c9d" (UID: "30ce1050-b96d-4fc4-8280-2e6234cf9c9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:08 crc kubenswrapper[4843]: I0318 12:35:08.863265 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ce1050-b96d-4fc4-8280-2e6234cf9c9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30ce1050-b96d-4fc4-8280-2e6234cf9c9d" (UID: "30ce1050-b96d-4fc4-8280-2e6234cf9c9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:08 crc kubenswrapper[4843]: I0318 12:35:08.939511 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ce1050-b96d-4fc4-8280-2e6234cf9c9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:08 crc kubenswrapper[4843]: I0318 12:35:08.939575 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22b4d\" (UniqueName: \"kubernetes.io/projected/30ce1050-b96d-4fc4-8280-2e6234cf9c9d-kube-api-access-22b4d\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:08 crc kubenswrapper[4843]: I0318 12:35:08.939601 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ce1050-b96d-4fc4-8280-2e6234cf9c9d-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.302304 4843 generic.go:334] "Generic (PLEG): container finished" podID="30ce1050-b96d-4fc4-8280-2e6234cf9c9d" containerID="bbe16b5e9c7bca3eb6e0930b07a48877ea401ae7739c1701256a2654637a0f32" exitCode=0
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.302357 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30ce1050-b96d-4fc4-8280-2e6234cf9c9d","Type":"ContainerDied","Data":"bbe16b5e9c7bca3eb6e0930b07a48877ea401ae7739c1701256a2654637a0f32"}
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.302385 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.302404 4843 scope.go:117] "RemoveContainer" containerID="bbe16b5e9c7bca3eb6e0930b07a48877ea401ae7739c1701256a2654637a0f32"
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.302392 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30ce1050-b96d-4fc4-8280-2e6234cf9c9d","Type":"ContainerDied","Data":"7f7e5f44c50a8722fdfb76be5e69ecce5d3a13f9bae3ba9a8c356ba8a547f575"}
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.321018 4843 scope.go:117] "RemoveContainer" containerID="bbe16b5e9c7bca3eb6e0930b07a48877ea401ae7739c1701256a2654637a0f32"
Mar 18 12:35:09 crc kubenswrapper[4843]: E0318 12:35:09.321458 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbe16b5e9c7bca3eb6e0930b07a48877ea401ae7739c1701256a2654637a0f32\": container with ID starting with bbe16b5e9c7bca3eb6e0930b07a48877ea401ae7739c1701256a2654637a0f32 not found: ID does not exist" containerID="bbe16b5e9c7bca3eb6e0930b07a48877ea401ae7739c1701256a2654637a0f32"
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.321496 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe16b5e9c7bca3eb6e0930b07a48877ea401ae7739c1701256a2654637a0f32"} err="failed to get container status \"bbe16b5e9c7bca3eb6e0930b07a48877ea401ae7739c1701256a2654637a0f32\": rpc error: code = NotFound desc = could not find container \"bbe16b5e9c7bca3eb6e0930b07a48877ea401ae7739c1701256a2654637a0f32\": container with ID starting with bbe16b5e9c7bca3eb6e0930b07a48877ea401ae7739c1701256a2654637a0f32 not found: ID does not exist"
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.335002 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.343726 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.352958 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 12:35:09 crc kubenswrapper[4843]: E0318 12:35:09.353396 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ce1050-b96d-4fc4-8280-2e6234cf9c9d" containerName="nova-scheduler-scheduler"
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.353411 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ce1050-b96d-4fc4-8280-2e6234cf9c9d" containerName="nova-scheduler-scheduler"
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.353585 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="30ce1050-b96d-4fc4-8280-2e6234cf9c9d" containerName="nova-scheduler-scheduler"
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.354215 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.356582 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.363181 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.479096 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-756p4\" (UniqueName: \"kubernetes.io/projected/95d1f2b2-faa6-4614-8ad7-023e2285fefc-kube-api-access-756p4\") pod \"nova-scheduler-0\" (UID: \"95d1f2b2-faa6-4614-8ad7-023e2285fefc\") " pod="openstack/nova-scheduler-0"
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.479164 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d1f2b2-faa6-4614-8ad7-023e2285fefc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"95d1f2b2-faa6-4614-8ad7-023e2285fefc\") " pod="openstack/nova-scheduler-0"
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.479218 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d1f2b2-faa6-4614-8ad7-023e2285fefc-config-data\") pod \"nova-scheduler-0\" (UID: \"95d1f2b2-faa6-4614-8ad7-023e2285fefc\") " pod="openstack/nova-scheduler-0"
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.581635 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-756p4\" (UniqueName: \"kubernetes.io/projected/95d1f2b2-faa6-4614-8ad7-023e2285fefc-kube-api-access-756p4\") pod \"nova-scheduler-0\" (UID: \"95d1f2b2-faa6-4614-8ad7-023e2285fefc\") " pod="openstack/nova-scheduler-0"
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.581732 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d1f2b2-faa6-4614-8ad7-023e2285fefc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"95d1f2b2-faa6-4614-8ad7-023e2285fefc\") " pod="openstack/nova-scheduler-0"
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.581782 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d1f2b2-faa6-4614-8ad7-023e2285fefc-config-data\") pod \"nova-scheduler-0\" (UID: \"95d1f2b2-faa6-4614-8ad7-023e2285fefc\") " pod="openstack/nova-scheduler-0"
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.585516 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d1f2b2-faa6-4614-8ad7-023e2285fefc-config-data\") pod \"nova-scheduler-0\" (UID: \"95d1f2b2-faa6-4614-8ad7-023e2285fefc\") " pod="openstack/nova-scheduler-0"
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.585713 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d1f2b2-faa6-4614-8ad7-023e2285fefc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"95d1f2b2-faa6-4614-8ad7-023e2285fefc\") " pod="openstack/nova-scheduler-0"
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.598078 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-756p4\" (UniqueName: \"kubernetes.io/projected/95d1f2b2-faa6-4614-8ad7-023e2285fefc-kube-api-access-756p4\") pod \"nova-scheduler-0\" (UID: \"95d1f2b2-faa6-4614-8ad7-023e2285fefc\") " pod="openstack/nova-scheduler-0"
Mar 18 12:35:09 crc kubenswrapper[4843]: I0318 12:35:09.679754 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.030395 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.030706 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.139939 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.140004 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 18 12:35:10 crc kubenswrapper[4843]: W0318 12:35:10.221670 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95d1f2b2_faa6_4614_8ad7_023e2285fefc.slice/crio-c967654bb1c0a6fcc06a79555820de6aa55a05fc100a4f9c74f653545320c47a WatchSource:0}: Error finding container c967654bb1c0a6fcc06a79555820de6aa55a05fc100a4f9c74f653545320c47a: Status 404 returned error can't find the container with id c967654bb1c0a6fcc06a79555820de6aa55a05fc100a4f9c74f653545320c47a
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.228484 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.241574 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.313132 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"95d1f2b2-faa6-4614-8ad7-023e2285fefc","Type":"ContainerStarted","Data":"c967654bb1c0a6fcc06a79555820de6aa55a05fc100a4f9c74f653545320c47a"}
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.317614 4843 generic.go:334] "Generic (PLEG): container finished" podID="8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee" containerID="c1064949e8bf1de76fbcc70d7964e1b7063f315c50d491369bc4899f0b3fbb60" exitCode=0
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.317674 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee","Type":"ContainerDied","Data":"c1064949e8bf1de76fbcc70d7964e1b7063f315c50d491369bc4899f0b3fbb60"}
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.317736 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee","Type":"ContainerDied","Data":"db096a37d398725bfb3172478c0e3a673fb5b2aa5f31db532709add96cd73976"}
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.317759 4843 scope.go:117] "RemoveContainer" containerID="c1064949e8bf1de76fbcc70d7964e1b7063f315c50d491369bc4899f0b3fbb60"
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.317999 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.346511 4843 scope.go:117] "RemoveContainer" containerID="86cfdb594d96f7083c8aabe9012dd6f77f579aaf593d43ae9fb66b72770880db"
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.373765 4843 scope.go:117] "RemoveContainer" containerID="c1064949e8bf1de76fbcc70d7964e1b7063f315c50d491369bc4899f0b3fbb60"
Mar 18 12:35:10 crc kubenswrapper[4843]: E0318 12:35:10.374217 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1064949e8bf1de76fbcc70d7964e1b7063f315c50d491369bc4899f0b3fbb60\": container with ID starting with c1064949e8bf1de76fbcc70d7964e1b7063f315c50d491369bc4899f0b3fbb60 not found: ID does not exist" containerID="c1064949e8bf1de76fbcc70d7964e1b7063f315c50d491369bc4899f0b3fbb60"
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.374255 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1064949e8bf1de76fbcc70d7964e1b7063f315c50d491369bc4899f0b3fbb60"} err="failed to get container status \"c1064949e8bf1de76fbcc70d7964e1b7063f315c50d491369bc4899f0b3fbb60\": rpc error: code = NotFound desc = could not find container \"c1064949e8bf1de76fbcc70d7964e1b7063f315c50d491369bc4899f0b3fbb60\": container with ID starting with c1064949e8bf1de76fbcc70d7964e1b7063f315c50d491369bc4899f0b3fbb60 not found: ID does not exist"
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.374281 4843 scope.go:117] "RemoveContainer" containerID="86cfdb594d96f7083c8aabe9012dd6f77f579aaf593d43ae9fb66b72770880db"
Mar 18 12:35:10 crc kubenswrapper[4843]: E0318 12:35:10.374690 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86cfdb594d96f7083c8aabe9012dd6f77f579aaf593d43ae9fb66b72770880db\": container with ID starting with 86cfdb594d96f7083c8aabe9012dd6f77f579aaf593d43ae9fb66b72770880db not found: ID does not exist" containerID="86cfdb594d96f7083c8aabe9012dd6f77f579aaf593d43ae9fb66b72770880db"
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.374708 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86cfdb594d96f7083c8aabe9012dd6f77f579aaf593d43ae9fb66b72770880db"} err="failed to get container status \"86cfdb594d96f7083c8aabe9012dd6f77f579aaf593d43ae9fb66b72770880db\": rpc error: code = NotFound desc = could not find container \"86cfdb594d96f7083c8aabe9012dd6f77f579aaf593d43ae9fb66b72770880db\": container with ID starting with 86cfdb594d96f7083c8aabe9012dd6f77f579aaf593d43ae9fb66b72770880db not found: ID does not exist"
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.403623 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-logs\") pod \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\" (UID: \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\") "
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.403717 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76hpv\" (UniqueName: \"kubernetes.io/projected/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-kube-api-access-76hpv\") pod \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\" (UID: \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\") "
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.403843 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-config-data\") pod \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\" (UID: \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\") "
Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.404053 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-combined-ca-bundle\") pod \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\" (UID: \"8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee\") " Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.404235 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-logs" (OuterVolumeSpecName: "logs") pod "8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee" (UID: "8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.404827 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.408099 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-kube-api-access-76hpv" (OuterVolumeSpecName: "kube-api-access-76hpv") pod "8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee" (UID: "8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee"). InnerVolumeSpecName "kube-api-access-76hpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.429144 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-config-data" (OuterVolumeSpecName: "config-data") pod "8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee" (UID: "8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.429527 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee" (UID: "8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.506571 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.506614 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76hpv\" (UniqueName: \"kubernetes.io/projected/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-kube-api-access-76hpv\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.506631 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.679915 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.689779 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.779173 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:10 crc kubenswrapper[4843]: E0318 12:35:10.779665 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee" containerName="nova-api-log" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 
12:35:10.779687 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee" containerName="nova-api-log" Mar 18 12:35:10 crc kubenswrapper[4843]: E0318 12:35:10.779709 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee" containerName="nova-api-api" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.779716 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee" containerName="nova-api-api" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.779943 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee" containerName="nova-api-log" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.779971 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee" containerName="nova-api-api" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.781076 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.785146 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.793355 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.868480 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd86a17-2883-4f89-b3a6-5b31461fca9d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4bd86a17-2883-4f89-b3a6-5b31461fca9d\") " pod="openstack/nova-api-0" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.868880 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bd86a17-2883-4f89-b3a6-5b31461fca9d-logs\") pod \"nova-api-0\" (UID: \"4bd86a17-2883-4f89-b3a6-5b31461fca9d\") " pod="openstack/nova-api-0" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.869097 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnqwf\" (UniqueName: \"kubernetes.io/projected/4bd86a17-2883-4f89-b3a6-5b31461fca9d-kube-api-access-wnqwf\") pod \"nova-api-0\" (UID: \"4bd86a17-2883-4f89-b3a6-5b31461fca9d\") " pod="openstack/nova-api-0" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.869268 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd86a17-2883-4f89-b3a6-5b31461fca9d-config-data\") pod \"nova-api-0\" (UID: \"4bd86a17-2883-4f89-b3a6-5b31461fca9d\") " pod="openstack/nova-api-0" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.970630 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd86a17-2883-4f89-b3a6-5b31461fca9d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4bd86a17-2883-4f89-b3a6-5b31461fca9d\") " pod="openstack/nova-api-0" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.970855 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bd86a17-2883-4f89-b3a6-5b31461fca9d-logs\") pod \"nova-api-0\" (UID: \"4bd86a17-2883-4f89-b3a6-5b31461fca9d\") " pod="openstack/nova-api-0" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.970936 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnqwf\" (UniqueName: \"kubernetes.io/projected/4bd86a17-2883-4f89-b3a6-5b31461fca9d-kube-api-access-wnqwf\") pod \"nova-api-0\" (UID: \"4bd86a17-2883-4f89-b3a6-5b31461fca9d\") " pod="openstack/nova-api-0" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.970985 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd86a17-2883-4f89-b3a6-5b31461fca9d-config-data\") pod \"nova-api-0\" (UID: \"4bd86a17-2883-4f89-b3a6-5b31461fca9d\") " pod="openstack/nova-api-0" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.971415 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bd86a17-2883-4f89-b3a6-5b31461fca9d-logs\") pod \"nova-api-0\" (UID: \"4bd86a17-2883-4f89-b3a6-5b31461fca9d\") " pod="openstack/nova-api-0" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.976370 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd86a17-2883-4f89-b3a6-5b31461fca9d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4bd86a17-2883-4f89-b3a6-5b31461fca9d\") " pod="openstack/nova-api-0" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.987008 
4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnqwf\" (UniqueName: \"kubernetes.io/projected/4bd86a17-2883-4f89-b3a6-5b31461fca9d-kube-api-access-wnqwf\") pod \"nova-api-0\" (UID: \"4bd86a17-2883-4f89-b3a6-5b31461fca9d\") " pod="openstack/nova-api-0" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.994867 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd86a17-2883-4f89-b3a6-5b31461fca9d-config-data\") pod \"nova-api-0\" (UID: \"4bd86a17-2883-4f89-b3a6-5b31461fca9d\") " pod="openstack/nova-api-0" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.995668 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30ce1050-b96d-4fc4-8280-2e6234cf9c9d" path="/var/lib/kubelet/pods/30ce1050-b96d-4fc4-8280-2e6234cf9c9d/volumes" Mar 18 12:35:10 crc kubenswrapper[4843]: I0318 12:35:10.996462 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee" path="/var/lib/kubelet/pods/8959a9a3-a7b4-4a59-88e9-cf90cf7ef6ee/volumes" Mar 18 12:35:11 crc kubenswrapper[4843]: I0318 12:35:11.108257 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:35:11 crc kubenswrapper[4843]: I0318 12:35:11.331147 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"95d1f2b2-faa6-4614-8ad7-023e2285fefc","Type":"ContainerStarted","Data":"59d4aac0b4a0325d1dcedff9dcc41a05e8d9449a95458b2d7496241c6f532139"} Mar 18 12:35:11 crc kubenswrapper[4843]: I0318 12:35:11.369190 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.369166567 podStartE2EDuration="2.369166567s" podCreationTimestamp="2026-03-18 12:35:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:11.361058277 +0000 UTC m=+1545.076883811" watchObservedRunningTime="2026-03-18 12:35:11.369166567 +0000 UTC m=+1545.084992101" Mar 18 12:35:11 crc kubenswrapper[4843]: I0318 12:35:11.655484 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:11 crc kubenswrapper[4843]: W0318 12:35:11.664336 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bd86a17_2883_4f89_b3a6_5b31461fca9d.slice/crio-18f80865f3cb2859872c9ec939afd56ffd7bdd418cd4b33c5a6f0d2d615f4a23 WatchSource:0}: Error finding container 18f80865f3cb2859872c9ec939afd56ffd7bdd418cd4b33c5a6f0d2d615f4a23: Status 404 returned error can't find the container with id 18f80865f3cb2859872c9ec939afd56ffd7bdd418cd4b33c5a6f0d2d615f4a23 Mar 18 12:35:12 crc kubenswrapper[4843]: I0318 12:35:12.343737 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4bd86a17-2883-4f89-b3a6-5b31461fca9d","Type":"ContainerStarted","Data":"340b8af3929bb75a6b1f6b623a8973de3c620f453f98d9d32d7bf6b8c06d93f7"} Mar 18 12:35:12 crc kubenswrapper[4843]: I0318 12:35:12.344099 4843 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-0" event={"ID":"4bd86a17-2883-4f89-b3a6-5b31461fca9d","Type":"ContainerStarted","Data":"b954ff01335a0320b58c93700008cda691967f54fdef2d51509b04d47b9e44b5"} Mar 18 12:35:12 crc kubenswrapper[4843]: I0318 12:35:12.344133 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4bd86a17-2883-4f89-b3a6-5b31461fca9d","Type":"ContainerStarted","Data":"18f80865f3cb2859872c9ec939afd56ffd7bdd418cd4b33c5a6f0d2d615f4a23"} Mar 18 12:35:12 crc kubenswrapper[4843]: I0318 12:35:12.485853 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.485826446 podStartE2EDuration="2.485826446s" podCreationTimestamp="2026-03-18 12:35:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:12.481309658 +0000 UTC m=+1546.197135182" watchObservedRunningTime="2026-03-18 12:35:12.485826446 +0000 UTC m=+1546.201651980" Mar 18 12:35:14 crc kubenswrapper[4843]: I0318 12:35:14.679935 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 12:35:15 crc kubenswrapper[4843]: I0318 12:35:15.612325 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 18 12:35:19 crc kubenswrapper[4843]: I0318 12:35:19.680952 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 12:35:19 crc kubenswrapper[4843]: I0318 12:35:19.717332 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 12:35:20 crc kubenswrapper[4843]: I0318 12:35:20.496427 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 12:35:21 crc kubenswrapper[4843]: I0318 12:35:21.108452 4843 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:35:21 crc kubenswrapper[4843]: I0318 12:35:21.108504 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:35:22 crc kubenswrapper[4843]: I0318 12:35:22.149885 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4bd86a17-2883-4f89-b3a6-5b31461fca9d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.207:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 12:35:22 crc kubenswrapper[4843]: I0318 12:35:22.190896 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4bd86a17-2883-4f89-b3a6-5b31461fca9d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.207:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 12:35:24 crc kubenswrapper[4843]: I0318 12:35:24.573467 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.447117 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.453085 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.571352 4843 generic.go:334] "Generic (PLEG): container finished" podID="4eaf3a3f-c91a-4012-af55-268aa29869cf" containerID="ccfdca08b7285b390c986f2959ce5f38c57e9ab0692e98dbc77796d676ecb2a5" exitCode=137 Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.571518 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4eaf3a3f-c91a-4012-af55-268aa29869cf","Type":"ContainerDied","Data":"ccfdca08b7285b390c986f2959ce5f38c57e9ab0692e98dbc77796d676ecb2a5"} Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.571528 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.571546 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4eaf3a3f-c91a-4012-af55-268aa29869cf","Type":"ContainerDied","Data":"9970267ff8bf1abc142ced0b3273337f30db091771015a60b685a77fd574bd8c"} Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.571562 4843 scope.go:117] "RemoveContainer" containerID="ccfdca08b7285b390c986f2959ce5f38c57e9ab0692e98dbc77796d676ecb2a5" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.577098 4843 generic.go:334] "Generic (PLEG): container finished" podID="d6599567-b95e-4948-a56f-eba713e3e77f" containerID="2d17a0ffa42bf328f6a4f1470ea085f52ee0a3dcb434ffb5f1dc382919457f65" exitCode=137 Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.577148 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d6599567-b95e-4948-a56f-eba713e3e77f","Type":"ContainerDied","Data":"2d17a0ffa42bf328f6a4f1470ea085f52ee0a3dcb434ffb5f1dc382919457f65"} Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.577177 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"d6599567-b95e-4948-a56f-eba713e3e77f","Type":"ContainerDied","Data":"f881eac98ea0a960dad43dff765a88633502ab6a8b998f2614f04ed8e28a5bcb"} Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.577499 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.610955 4843 scope.go:117] "RemoveContainer" containerID="ccfdca08b7285b390c986f2959ce5f38c57e9ab0692e98dbc77796d676ecb2a5" Mar 18 12:35:27 crc kubenswrapper[4843]: E0318 12:35:27.611975 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccfdca08b7285b390c986f2959ce5f38c57e9ab0692e98dbc77796d676ecb2a5\": container with ID starting with ccfdca08b7285b390c986f2959ce5f38c57e9ab0692e98dbc77796d676ecb2a5 not found: ID does not exist" containerID="ccfdca08b7285b390c986f2959ce5f38c57e9ab0692e98dbc77796d676ecb2a5" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.612235 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccfdca08b7285b390c986f2959ce5f38c57e9ab0692e98dbc77796d676ecb2a5"} err="failed to get container status \"ccfdca08b7285b390c986f2959ce5f38c57e9ab0692e98dbc77796d676ecb2a5\": rpc error: code = NotFound desc = could not find container \"ccfdca08b7285b390c986f2959ce5f38c57e9ab0692e98dbc77796d676ecb2a5\": container with ID starting with ccfdca08b7285b390c986f2959ce5f38c57e9ab0692e98dbc77796d676ecb2a5 not found: ID does not exist" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.612408 4843 scope.go:117] "RemoveContainer" containerID="2d17a0ffa42bf328f6a4f1470ea085f52ee0a3dcb434ffb5f1dc382919457f65" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.628101 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4eaf3a3f-c91a-4012-af55-268aa29869cf-config-data\") pod \"4eaf3a3f-c91a-4012-af55-268aa29869cf\" (UID: \"4eaf3a3f-c91a-4012-af55-268aa29869cf\") " Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.628983 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6599567-b95e-4948-a56f-eba713e3e77f-config-data\") pod \"d6599567-b95e-4948-a56f-eba713e3e77f\" (UID: \"d6599567-b95e-4948-a56f-eba713e3e77f\") " Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.629273 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6599567-b95e-4948-a56f-eba713e3e77f-logs\") pod \"d6599567-b95e-4948-a56f-eba713e3e77f\" (UID: \"d6599567-b95e-4948-a56f-eba713e3e77f\") " Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.629545 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eaf3a3f-c91a-4012-af55-268aa29869cf-combined-ca-bundle\") pod \"4eaf3a3f-c91a-4012-af55-268aa29869cf\" (UID: \"4eaf3a3f-c91a-4012-af55-268aa29869cf\") " Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.629811 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwnhd\" (UniqueName: \"kubernetes.io/projected/4eaf3a3f-c91a-4012-af55-268aa29869cf-kube-api-access-fwnhd\") pod \"4eaf3a3f-c91a-4012-af55-268aa29869cf\" (UID: \"4eaf3a3f-c91a-4012-af55-268aa29869cf\") " Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.630005 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6599567-b95e-4948-a56f-eba713e3e77f-combined-ca-bundle\") pod \"d6599567-b95e-4948-a56f-eba713e3e77f\" (UID: \"d6599567-b95e-4948-a56f-eba713e3e77f\") " Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.630229 
4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8zvp\" (UniqueName: \"kubernetes.io/projected/d6599567-b95e-4948-a56f-eba713e3e77f-kube-api-access-c8zvp\") pod \"d6599567-b95e-4948-a56f-eba713e3e77f\" (UID: \"d6599567-b95e-4948-a56f-eba713e3e77f\") " Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.630365 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6599567-b95e-4948-a56f-eba713e3e77f-logs" (OuterVolumeSpecName: "logs") pod "d6599567-b95e-4948-a56f-eba713e3e77f" (UID: "d6599567-b95e-4948-a56f-eba713e3e77f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.631184 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6599567-b95e-4948-a56f-eba713e3e77f-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.636231 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6599567-b95e-4948-a56f-eba713e3e77f-kube-api-access-c8zvp" (OuterVolumeSpecName: "kube-api-access-c8zvp") pod "d6599567-b95e-4948-a56f-eba713e3e77f" (UID: "d6599567-b95e-4948-a56f-eba713e3e77f"). InnerVolumeSpecName "kube-api-access-c8zvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.636934 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eaf3a3f-c91a-4012-af55-268aa29869cf-kube-api-access-fwnhd" (OuterVolumeSpecName: "kube-api-access-fwnhd") pod "4eaf3a3f-c91a-4012-af55-268aa29869cf" (UID: "4eaf3a3f-c91a-4012-af55-268aa29869cf"). InnerVolumeSpecName "kube-api-access-fwnhd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.637760 4843 scope.go:117] "RemoveContainer" containerID="2cd65047acbf4c25ddaaac1d45399c406950d334f21266de3e64579537b5290e" Mar 18 12:35:27 crc kubenswrapper[4843]: E0318 12:35:27.665532 4843 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4eaf3a3f-c91a-4012-af55-268aa29869cf-combined-ca-bundle podName:4eaf3a3f-c91a-4012-af55-268aa29869cf nodeName:}" failed. No retries permitted until 2026-03-18 12:35:28.165501353 +0000 UTC m=+1561.881326877 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/4eaf3a3f-c91a-4012-af55-268aa29869cf-combined-ca-bundle") pod "4eaf3a3f-c91a-4012-af55-268aa29869cf" (UID: "4eaf3a3f-c91a-4012-af55-268aa29869cf") : error deleting /var/lib/kubelet/pods/4eaf3a3f-c91a-4012-af55-268aa29869cf/volume-subpaths: remove /var/lib/kubelet/pods/4eaf3a3f-c91a-4012-af55-268aa29869cf/volume-subpaths: no such file or directory Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.666041 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6599567-b95e-4948-a56f-eba713e3e77f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6599567-b95e-4948-a56f-eba713e3e77f" (UID: "d6599567-b95e-4948-a56f-eba713e3e77f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.667876 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6599567-b95e-4948-a56f-eba713e3e77f-config-data" (OuterVolumeSpecName: "config-data") pod "d6599567-b95e-4948-a56f-eba713e3e77f" (UID: "d6599567-b95e-4948-a56f-eba713e3e77f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.669410 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eaf3a3f-c91a-4012-af55-268aa29869cf-config-data" (OuterVolumeSpecName: "config-data") pod "4eaf3a3f-c91a-4012-af55-268aa29869cf" (UID: "4eaf3a3f-c91a-4012-af55-268aa29869cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.735466 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6599567-b95e-4948-a56f-eba713e3e77f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.735504 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwnhd\" (UniqueName: \"kubernetes.io/projected/4eaf3a3f-c91a-4012-af55-268aa29869cf-kube-api-access-fwnhd\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.735521 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8zvp\" (UniqueName: \"kubernetes.io/projected/d6599567-b95e-4948-a56f-eba713e3e77f-kube-api-access-c8zvp\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.735535 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eaf3a3f-c91a-4012-af55-268aa29869cf-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.735548 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6599567-b95e-4948-a56f-eba713e3e77f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.745027 4843 scope.go:117] "RemoveContainer" 
containerID="2d17a0ffa42bf328f6a4f1470ea085f52ee0a3dcb434ffb5f1dc382919457f65" Mar 18 12:35:27 crc kubenswrapper[4843]: E0318 12:35:27.746046 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d17a0ffa42bf328f6a4f1470ea085f52ee0a3dcb434ffb5f1dc382919457f65\": container with ID starting with 2d17a0ffa42bf328f6a4f1470ea085f52ee0a3dcb434ffb5f1dc382919457f65 not found: ID does not exist" containerID="2d17a0ffa42bf328f6a4f1470ea085f52ee0a3dcb434ffb5f1dc382919457f65" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.746095 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d17a0ffa42bf328f6a4f1470ea085f52ee0a3dcb434ffb5f1dc382919457f65"} err="failed to get container status \"2d17a0ffa42bf328f6a4f1470ea085f52ee0a3dcb434ffb5f1dc382919457f65\": rpc error: code = NotFound desc = could not find container \"2d17a0ffa42bf328f6a4f1470ea085f52ee0a3dcb434ffb5f1dc382919457f65\": container with ID starting with 2d17a0ffa42bf328f6a4f1470ea085f52ee0a3dcb434ffb5f1dc382919457f65 not found: ID does not exist" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.746123 4843 scope.go:117] "RemoveContainer" containerID="2cd65047acbf4c25ddaaac1d45399c406950d334f21266de3e64579537b5290e" Mar 18 12:35:27 crc kubenswrapper[4843]: E0318 12:35:27.746778 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cd65047acbf4c25ddaaac1d45399c406950d334f21266de3e64579537b5290e\": container with ID starting with 2cd65047acbf4c25ddaaac1d45399c406950d334f21266de3e64579537b5290e not found: ID does not exist" containerID="2cd65047acbf4c25ddaaac1d45399c406950d334f21266de3e64579537b5290e" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.746811 4843 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2cd65047acbf4c25ddaaac1d45399c406950d334f21266de3e64579537b5290e"} err="failed to get container status \"2cd65047acbf4c25ddaaac1d45399c406950d334f21266de3e64579537b5290e\": rpc error: code = NotFound desc = could not find container \"2cd65047acbf4c25ddaaac1d45399c406950d334f21266de3e64579537b5290e\": container with ID starting with 2cd65047acbf4c25ddaaac1d45399c406950d334f21266de3e64579537b5290e not found: ID does not exist" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.916299 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.927124 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.942090 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:35:27 crc kubenswrapper[4843]: E0318 12:35:27.942506 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eaf3a3f-c91a-4012-af55-268aa29869cf" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.942528 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eaf3a3f-c91a-4012-af55-268aa29869cf" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 12:35:27 crc kubenswrapper[4843]: E0318 12:35:27.942560 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6599567-b95e-4948-a56f-eba713e3e77f" containerName="nova-metadata-metadata" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.942567 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6599567-b95e-4948-a56f-eba713e3e77f" containerName="nova-metadata-metadata" Mar 18 12:35:27 crc kubenswrapper[4843]: E0318 12:35:27.942590 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6599567-b95e-4948-a56f-eba713e3e77f" containerName="nova-metadata-log" Mar 18 12:35:27 crc 
kubenswrapper[4843]: I0318 12:35:27.942595 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6599567-b95e-4948-a56f-eba713e3e77f" containerName="nova-metadata-log" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.942989 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6599567-b95e-4948-a56f-eba713e3e77f" containerName="nova-metadata-log" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.943017 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eaf3a3f-c91a-4012-af55-268aa29869cf" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.943041 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6599567-b95e-4948-a56f-eba713e3e77f" containerName="nova-metadata-metadata" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.944023 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.949103 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.951179 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 12:35:27 crc kubenswrapper[4843]: I0318 12:35:27.974321 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.041974 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " pod="openstack/nova-metadata-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.042194 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-logs\") pod \"nova-metadata-0\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " pod="openstack/nova-metadata-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.042311 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brtjx\" (UniqueName: \"kubernetes.io/projected/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-kube-api-access-brtjx\") pod \"nova-metadata-0\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " pod="openstack/nova-metadata-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.042430 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " pod="openstack/nova-metadata-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.042473 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-config-data\") pod \"nova-metadata-0\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " pod="openstack/nova-metadata-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.144995 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-logs\") pod \"nova-metadata-0\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " pod="openstack/nova-metadata-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.145126 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-logs\") pod 
\"nova-metadata-0\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " pod="openstack/nova-metadata-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.146219 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brtjx\" (UniqueName: \"kubernetes.io/projected/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-kube-api-access-brtjx\") pod \"nova-metadata-0\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " pod="openstack/nova-metadata-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.146444 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " pod="openstack/nova-metadata-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.146568 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-config-data\") pod \"nova-metadata-0\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " pod="openstack/nova-metadata-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.146779 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " pod="openstack/nova-metadata-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.152685 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-config-data\") pod \"nova-metadata-0\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " pod="openstack/nova-metadata-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.154018 4843 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " pod="openstack/nova-metadata-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.158029 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " pod="openstack/nova-metadata-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.163736 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brtjx\" (UniqueName: \"kubernetes.io/projected/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-kube-api-access-brtjx\") pod \"nova-metadata-0\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " pod="openstack/nova-metadata-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.248624 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eaf3a3f-c91a-4012-af55-268aa29869cf-combined-ca-bundle\") pod \"4eaf3a3f-c91a-4012-af55-268aa29869cf\" (UID: \"4eaf3a3f-c91a-4012-af55-268aa29869cf\") " Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.252115 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eaf3a3f-c91a-4012-af55-268aa29869cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4eaf3a3f-c91a-4012-af55-268aa29869cf" (UID: "4eaf3a3f-c91a-4012-af55-268aa29869cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.274245 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.352199 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eaf3a3f-c91a-4012-af55-268aa29869cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.518676 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.531454 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.550342 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.552990 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.556851 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.556938 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.557055 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.563223 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.659687 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/205bbfd1-7bde-4af6-a335-3bc3b8338143-nova-novncproxy-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"205bbfd1-7bde-4af6-a335-3bc3b8338143\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.659733 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205bbfd1-7bde-4af6-a335-3bc3b8338143-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"205bbfd1-7bde-4af6-a335-3bc3b8338143\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.659843 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/205bbfd1-7bde-4af6-a335-3bc3b8338143-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"205bbfd1-7bde-4af6-a335-3bc3b8338143\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.659869 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fddw8\" (UniqueName: \"kubernetes.io/projected/205bbfd1-7bde-4af6-a335-3bc3b8338143-kube-api-access-fddw8\") pod \"nova-cell1-novncproxy-0\" (UID: \"205bbfd1-7bde-4af6-a335-3bc3b8338143\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.659910 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205bbfd1-7bde-4af6-a335-3bc3b8338143-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"205bbfd1-7bde-4af6-a335-3bc3b8338143\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.730829 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.761403 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/205bbfd1-7bde-4af6-a335-3bc3b8338143-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"205bbfd1-7bde-4af6-a335-3bc3b8338143\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.761465 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fddw8\" (UniqueName: \"kubernetes.io/projected/205bbfd1-7bde-4af6-a335-3bc3b8338143-kube-api-access-fddw8\") pod \"nova-cell1-novncproxy-0\" (UID: \"205bbfd1-7bde-4af6-a335-3bc3b8338143\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.761533 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205bbfd1-7bde-4af6-a335-3bc3b8338143-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"205bbfd1-7bde-4af6-a335-3bc3b8338143\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.761597 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/205bbfd1-7bde-4af6-a335-3bc3b8338143-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"205bbfd1-7bde-4af6-a335-3bc3b8338143\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.761619 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205bbfd1-7bde-4af6-a335-3bc3b8338143-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"205bbfd1-7bde-4af6-a335-3bc3b8338143\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.767489 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/205bbfd1-7bde-4af6-a335-3bc3b8338143-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"205bbfd1-7bde-4af6-a335-3bc3b8338143\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.768088 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205bbfd1-7bde-4af6-a335-3bc3b8338143-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"205bbfd1-7bde-4af6-a335-3bc3b8338143\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.768178 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205bbfd1-7bde-4af6-a335-3bc3b8338143-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"205bbfd1-7bde-4af6-a335-3bc3b8338143\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.768231 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/205bbfd1-7bde-4af6-a335-3bc3b8338143-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"205bbfd1-7bde-4af6-a335-3bc3b8338143\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.777533 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fddw8\" (UniqueName: \"kubernetes.io/projected/205bbfd1-7bde-4af6-a335-3bc3b8338143-kube-api-access-fddw8\") pod \"nova-cell1-novncproxy-0\" (UID: \"205bbfd1-7bde-4af6-a335-3bc3b8338143\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.877154 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:28 crc kubenswrapper[4843]: I0318 12:35:28.995892 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eaf3a3f-c91a-4012-af55-268aa29869cf" path="/var/lib/kubelet/pods/4eaf3a3f-c91a-4012-af55-268aa29869cf/volumes" Mar 18 12:35:29 crc kubenswrapper[4843]: I0318 12:35:29.004035 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6599567-b95e-4948-a56f-eba713e3e77f" path="/var/lib/kubelet/pods/d6599567-b95e-4948-a56f-eba713e3e77f/volumes" Mar 18 12:35:29 crc kubenswrapper[4843]: I0318 12:35:29.109098 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 12:35:29 crc kubenswrapper[4843]: I0318 12:35:29.110295 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 12:35:29 crc kubenswrapper[4843]: W0318 12:35:29.420563 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod205bbfd1_7bde_4af6_a335_3bc3b8338143.slice/crio-5d38c0dbf7f742de8831438ee4cbf388b21dce3ffd4ff5133c46bb05d06dffa2 WatchSource:0}: Error finding container 5d38c0dbf7f742de8831438ee4cbf388b21dce3ffd4ff5133c46bb05d06dffa2: Status 404 returned error can't find the container with id 5d38c0dbf7f742de8831438ee4cbf388b21dce3ffd4ff5133c46bb05d06dffa2 Mar 18 12:35:29 crc kubenswrapper[4843]: I0318 12:35:29.420710 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:35:29 crc kubenswrapper[4843]: I0318 12:35:29.596793 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9c1f745e-fc29-43b9-b4da-67f2646fdd3f","Type":"ContainerStarted","Data":"6a5d99fefab792a67220d4240c1983a015f87bc073bfed34f45a3e7d4c4e917e"} Mar 18 12:35:29 crc kubenswrapper[4843]: I0318 12:35:29.597098 4843 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-metadata-0" event={"ID":"9c1f745e-fc29-43b9-b4da-67f2646fdd3f","Type":"ContainerStarted","Data":"fb810998036d9b269c6af23f6e143b270204a815169f2c30f096b154821941da"} Mar 18 12:35:29 crc kubenswrapper[4843]: I0318 12:35:29.597113 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9c1f745e-fc29-43b9-b4da-67f2646fdd3f","Type":"ContainerStarted","Data":"bad4452d9791869d4ab5cceb74ef17639332cd06800798aa14c51bd0a072bc35"} Mar 18 12:35:29 crc kubenswrapper[4843]: I0318 12:35:29.598024 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"205bbfd1-7bde-4af6-a335-3bc3b8338143","Type":"ContainerStarted","Data":"5d38c0dbf7f742de8831438ee4cbf388b21dce3ffd4ff5133c46bb05d06dffa2"} Mar 18 12:35:29 crc kubenswrapper[4843]: I0318 12:35:29.621397 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.621373764 podStartE2EDuration="2.621373764s" podCreationTimestamp="2026-03-18 12:35:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:29.615191169 +0000 UTC m=+1563.331016693" watchObservedRunningTime="2026-03-18 12:35:29.621373764 +0000 UTC m=+1563.337199288" Mar 18 12:35:30 crc kubenswrapper[4843]: I0318 12:35:30.615067 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"205bbfd1-7bde-4af6-a335-3bc3b8338143","Type":"ContainerStarted","Data":"8d083b80a9ae4d0917ff386999abcd4cdf5b35c3a095e590e78c80c2390251ef"} Mar 18 12:35:30 crc kubenswrapper[4843]: I0318 12:35:30.653212 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.653179995 podStartE2EDuration="2.653179995s" podCreationTimestamp="2026-03-18 12:35:28 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:30.650304573 +0000 UTC m=+1564.366130127" watchObservedRunningTime="2026-03-18 12:35:30.653179995 +0000 UTC m=+1564.369005559" Mar 18 12:35:31 crc kubenswrapper[4843]: I0318 12:35:31.113344 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 12:35:31 crc kubenswrapper[4843]: I0318 12:35:31.114594 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 12:35:31 crc kubenswrapper[4843]: I0318 12:35:31.117154 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 12:35:31 crc kubenswrapper[4843]: I0318 12:35:31.633126 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 12:35:31 crc kubenswrapper[4843]: I0318 12:35:31.876003 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-4dpbb"] Mar 18 12:35:31 crc kubenswrapper[4843]: I0318 12:35:31.880040 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:31 crc kubenswrapper[4843]: I0318 12:35:31.892616 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-4dpbb"] Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.044811 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncxk8\" (UniqueName: \"kubernetes.io/projected/0a54c475-4a90-4883-ae41-7d73e02d7c70-kube-api-access-ncxk8\") pod \"dnsmasq-dns-5c7b6c5df9-4dpbb\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.044854 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-config\") pod \"dnsmasq-dns-5c7b6c5df9-4dpbb\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.045022 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-4dpbb\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.045252 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-4dpbb\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.045329 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-4dpbb\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.045434 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-4dpbb\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.147214 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-4dpbb\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.147396 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-4dpbb\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.147465 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-4dpbb\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.147548 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-4dpbb\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.147682 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncxk8\" (UniqueName: \"kubernetes.io/projected/0a54c475-4a90-4883-ae41-7d73e02d7c70-kube-api-access-ncxk8\") pod \"dnsmasq-dns-5c7b6c5df9-4dpbb\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.147735 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-config\") pod \"dnsmasq-dns-5c7b6c5df9-4dpbb\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.148483 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-4dpbb\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.148625 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-4dpbb\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.148625 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-4dpbb\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.149197 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-config\") pod \"dnsmasq-dns-5c7b6c5df9-4dpbb\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.149214 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-4dpbb\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.175409 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncxk8\" (UniqueName: \"kubernetes.io/projected/0a54c475-4a90-4883-ae41-7d73e02d7c70-kube-api-access-ncxk8\") pod \"dnsmasq-dns-5c7b6c5df9-4dpbb\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.210725 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:32 crc kubenswrapper[4843]: I0318 12:35:32.712985 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-4dpbb"] Mar 18 12:35:32 crc kubenswrapper[4843]: W0318 12:35:32.719608 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a54c475_4a90_4883_ae41_7d73e02d7c70.slice/crio-d2f1e3f3c1a01119b87ef6347b7ee45a4e908b1f733ca9774aa586d4979af15d WatchSource:0}: Error finding container d2f1e3f3c1a01119b87ef6347b7ee45a4e908b1f733ca9774aa586d4979af15d: Status 404 returned error can't find the container with id d2f1e3f3c1a01119b87ef6347b7ee45a4e908b1f733ca9774aa586d4979af15d Mar 18 12:35:33 crc kubenswrapper[4843]: I0318 12:35:33.652621 4843 generic.go:334] "Generic (PLEG): container finished" podID="0a54c475-4a90-4883-ae41-7d73e02d7c70" containerID="79562ce47d8c43e5925b90a785f37988e6644dfd3fdb818807f36907387b4710" exitCode=0 Mar 18 12:35:33 crc kubenswrapper[4843]: I0318 12:35:33.652757 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" event={"ID":"0a54c475-4a90-4883-ae41-7d73e02d7c70","Type":"ContainerDied","Data":"79562ce47d8c43e5925b90a785f37988e6644dfd3fdb818807f36907387b4710"} Mar 18 12:35:33 crc kubenswrapper[4843]: I0318 12:35:33.653101 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" event={"ID":"0a54c475-4a90-4883-ae41-7d73e02d7c70","Type":"ContainerStarted","Data":"d2f1e3f3c1a01119b87ef6347b7ee45a4e908b1f733ca9774aa586d4979af15d"} Mar 18 12:35:33 crc kubenswrapper[4843]: I0318 12:35:33.877789 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:33 crc kubenswrapper[4843]: I0318 12:35:33.963181 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:33 crc 
kubenswrapper[4843]: I0318 12:35:33.963540 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerName="ceilometer-central-agent" containerID="cri-o://ec38f802436989d59df83382131616b2862faef36cbee7f7cbecca57b7c70b4c" gracePeriod=30 Mar 18 12:35:33 crc kubenswrapper[4843]: I0318 12:35:33.963603 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerName="proxy-httpd" containerID="cri-o://3a144c653fccb6260a3a8aa40e8ebe83aa0c58e42459c735424e882708f397fe" gracePeriod=30 Mar 18 12:35:33 crc kubenswrapper[4843]: I0318 12:35:33.963740 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerName="ceilometer-notification-agent" containerID="cri-o://89ae07b1f2958ad89dd7f8c6ff71f9b8018bc22ff6163fc4b4fe84d85b27d783" gracePeriod=30 Mar 18 12:35:33 crc kubenswrapper[4843]: I0318 12:35:33.963636 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerName="sg-core" containerID="cri-o://9377516e936f59e97b16d09a1fb7a9e745359e4e2027e8a2df21ee17dede124e" gracePeriod=30 Mar 18 12:35:34 crc kubenswrapper[4843]: I0318 12:35:34.168285 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:34 crc kubenswrapper[4843]: I0318 12:35:34.665297 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" event={"ID":"0a54c475-4a90-4883-ae41-7d73e02d7c70","Type":"ContainerStarted","Data":"b7cfd326543db7ba05149a8fd48e2ff852f76f12e322b91d6e3be15669f8edd3"} Mar 18 12:35:34 crc kubenswrapper[4843]: I0318 12:35:34.665474 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:34 crc kubenswrapper[4843]: I0318 12:35:34.668848 4843 generic.go:334] "Generic (PLEG): container finished" podID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerID="3a144c653fccb6260a3a8aa40e8ebe83aa0c58e42459c735424e882708f397fe" exitCode=0 Mar 18 12:35:34 crc kubenswrapper[4843]: I0318 12:35:34.668879 4843 generic.go:334] "Generic (PLEG): container finished" podID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerID="9377516e936f59e97b16d09a1fb7a9e745359e4e2027e8a2df21ee17dede124e" exitCode=2 Mar 18 12:35:34 crc kubenswrapper[4843]: I0318 12:35:34.668887 4843 generic.go:334] "Generic (PLEG): container finished" podID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerID="89ae07b1f2958ad89dd7f8c6ff71f9b8018bc22ff6163fc4b4fe84d85b27d783" exitCode=0 Mar 18 12:35:34 crc kubenswrapper[4843]: I0318 12:35:34.668895 4843 generic.go:334] "Generic (PLEG): container finished" podID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerID="ec38f802436989d59df83382131616b2862faef36cbee7f7cbecca57b7c70b4c" exitCode=0 Mar 18 12:35:34 crc kubenswrapper[4843]: I0318 12:35:34.669047 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fde046e-d3a2-4c52-840b-b8b5f764b80d","Type":"ContainerDied","Data":"3a144c653fccb6260a3a8aa40e8ebe83aa0c58e42459c735424e882708f397fe"} Mar 18 12:35:34 crc kubenswrapper[4843]: I0318 12:35:34.669097 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fde046e-d3a2-4c52-840b-b8b5f764b80d","Type":"ContainerDied","Data":"9377516e936f59e97b16d09a1fb7a9e745359e4e2027e8a2df21ee17dede124e"} Mar 18 12:35:34 crc kubenswrapper[4843]: I0318 12:35:34.669103 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4bd86a17-2883-4f89-b3a6-5b31461fca9d" containerName="nova-api-log" containerID="cri-o://b954ff01335a0320b58c93700008cda691967f54fdef2d51509b04d47b9e44b5" 
gracePeriod=30 Mar 18 12:35:34 crc kubenswrapper[4843]: I0318 12:35:34.669203 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4bd86a17-2883-4f89-b3a6-5b31461fca9d" containerName="nova-api-api" containerID="cri-o://340b8af3929bb75a6b1f6b623a8973de3c620f453f98d9d32d7bf6b8c06d93f7" gracePeriod=30 Mar 18 12:35:34 crc kubenswrapper[4843]: I0318 12:35:34.669115 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fde046e-d3a2-4c52-840b-b8b5f764b80d","Type":"ContainerDied","Data":"89ae07b1f2958ad89dd7f8c6ff71f9b8018bc22ff6163fc4b4fe84d85b27d783"} Mar 18 12:35:34 crc kubenswrapper[4843]: I0318 12:35:34.669346 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fde046e-d3a2-4c52-840b-b8b5f764b80d","Type":"ContainerDied","Data":"ec38f802436989d59df83382131616b2862faef36cbee7f7cbecca57b7c70b4c"} Mar 18 12:35:34 crc kubenswrapper[4843]: I0318 12:35:34.698365 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" podStartSLOduration=3.698341705 podStartE2EDuration="3.698341705s" podCreationTimestamp="2026-03-18 12:35:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:34.6900697 +0000 UTC m=+1568.405895224" watchObservedRunningTime="2026-03-18 12:35:34.698341705 +0000 UTC m=+1568.414167249" Mar 18 12:35:34 crc kubenswrapper[4843]: I0318 12:35:34.917731 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.008374 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fde046e-d3a2-4c52-840b-b8b5f764b80d-log-httpd\") pod \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.008536 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-ceilometer-tls-certs\") pod \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.008565 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bswj8\" (UniqueName: \"kubernetes.io/projected/7fde046e-d3a2-4c52-840b-b8b5f764b80d-kube-api-access-bswj8\") pod \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.008597 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-config-data\") pod \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.008702 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-combined-ca-bundle\") pod \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.008726 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-scripts\") pod \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.008863 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fde046e-d3a2-4c52-840b-b8b5f764b80d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7fde046e-d3a2-4c52-840b-b8b5f764b80d" (UID: "7fde046e-d3a2-4c52-840b-b8b5f764b80d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.009337 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-sg-core-conf-yaml\") pod \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.009615 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fde046e-d3a2-4c52-840b-b8b5f764b80d-run-httpd\") pod \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\" (UID: \"7fde046e-d3a2-4c52-840b-b8b5f764b80d\") " Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.010128 4843 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fde046e-d3a2-4c52-840b-b8b5f764b80d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.010320 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fde046e-d3a2-4c52-840b-b8b5f764b80d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7fde046e-d3a2-4c52-840b-b8b5f764b80d" (UID: "7fde046e-d3a2-4c52-840b-b8b5f764b80d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.014936 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fde046e-d3a2-4c52-840b-b8b5f764b80d-kube-api-access-bswj8" (OuterVolumeSpecName: "kube-api-access-bswj8") pod "7fde046e-d3a2-4c52-840b-b8b5f764b80d" (UID: "7fde046e-d3a2-4c52-840b-b8b5f764b80d"). InnerVolumeSpecName "kube-api-access-bswj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.022823 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-scripts" (OuterVolumeSpecName: "scripts") pod "7fde046e-d3a2-4c52-840b-b8b5f764b80d" (UID: "7fde046e-d3a2-4c52-840b-b8b5f764b80d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.049627 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7fde046e-d3a2-4c52-840b-b8b5f764b80d" (UID: "7fde046e-d3a2-4c52-840b-b8b5f764b80d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.067600 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7fde046e-d3a2-4c52-840b-b8b5f764b80d" (UID: "7fde046e-d3a2-4c52-840b-b8b5f764b80d"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.113224 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.113270 4843 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.113286 4843 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fde046e-d3a2-4c52-840b-b8b5f764b80d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.113298 4843 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.113312 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bswj8\" (UniqueName: \"kubernetes.io/projected/7fde046e-d3a2-4c52-840b-b8b5f764b80d-kube-api-access-bswj8\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.118627 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fde046e-d3a2-4c52-840b-b8b5f764b80d" (UID: "7fde046e-d3a2-4c52-840b-b8b5f764b80d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.125743 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-config-data" (OuterVolumeSpecName: "config-data") pod "7fde046e-d3a2-4c52-840b-b8b5f764b80d" (UID: "7fde046e-d3a2-4c52-840b-b8b5f764b80d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.214909 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.214948 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fde046e-d3a2-4c52-840b-b8b5f764b80d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.682297 4843 generic.go:334] "Generic (PLEG): container finished" podID="4bd86a17-2883-4f89-b3a6-5b31461fca9d" containerID="b954ff01335a0320b58c93700008cda691967f54fdef2d51509b04d47b9e44b5" exitCode=143 Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.682353 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4bd86a17-2883-4f89-b3a6-5b31461fca9d","Type":"ContainerDied","Data":"b954ff01335a0320b58c93700008cda691967f54fdef2d51509b04d47b9e44b5"} Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.686043 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fde046e-d3a2-4c52-840b-b8b5f764b80d","Type":"ContainerDied","Data":"c718e3295a9da16dac456a040b82b9016ce3ec9f308cb52bd5fee6d1e1a2ef0e"} Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.686088 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.686111 4843 scope.go:117] "RemoveContainer" containerID="3a144c653fccb6260a3a8aa40e8ebe83aa0c58e42459c735424e882708f397fe" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.709406 4843 scope.go:117] "RemoveContainer" containerID="9377516e936f59e97b16d09a1fb7a9e745359e4e2027e8a2df21ee17dede124e" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.723888 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.750065 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.757427 4843 scope.go:117] "RemoveContainer" containerID="89ae07b1f2958ad89dd7f8c6ff71f9b8018bc22ff6163fc4b4fe84d85b27d783" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.772690 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:35 crc kubenswrapper[4843]: E0318 12:35:35.773235 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerName="ceilometer-central-agent" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.773256 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerName="ceilometer-central-agent" Mar 18 12:35:35 crc kubenswrapper[4843]: E0318 12:35:35.773275 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerName="sg-core" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.773282 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerName="sg-core" Mar 18 12:35:35 crc kubenswrapper[4843]: E0318 12:35:35.773299 4843 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerName="ceilometer-notification-agent" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.773305 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerName="ceilometer-notification-agent" Mar 18 12:35:35 crc kubenswrapper[4843]: E0318 12:35:35.773323 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerName="proxy-httpd" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.773332 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerName="proxy-httpd" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.773569 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerName="sg-core" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.773584 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerName="ceilometer-notification-agent" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.773610 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerName="proxy-httpd" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.773619 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" containerName="ceilometer-central-agent" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.775480 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.779628 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.779854 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.780918 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.782152 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.803752 4843 scope.go:117] "RemoveContainer" containerID="ec38f802436989d59df83382131616b2862faef36cbee7f7cbecca57b7c70b4c" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.903399 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:35 crc kubenswrapper[4843]: E0318 12:35:35.904207 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-hpwpz log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="89c11707-c683-413e-a6e5-40937a35dfd3" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.955471 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-scripts\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.955698 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.955819 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89c11707-c683-413e-a6e5-40937a35dfd3-run-httpd\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.956045 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-config-data\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.956172 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89c11707-c683-413e-a6e5-40937a35dfd3-log-httpd\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.956198 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpwpz\" (UniqueName: \"kubernetes.io/projected/89c11707-c683-413e-a6e5-40937a35dfd3-kube-api-access-hpwpz\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.956263 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:35 crc kubenswrapper[4843]: I0318 12:35:35.956338 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.057757 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-config-data\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.057827 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpwpz\" (UniqueName: \"kubernetes.io/projected/89c11707-c683-413e-a6e5-40937a35dfd3-kube-api-access-hpwpz\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.057848 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89c11707-c683-413e-a6e5-40937a35dfd3-log-httpd\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.057902 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.057936 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.057961 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-scripts\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.058006 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.059095 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89c11707-c683-413e-a6e5-40937a35dfd3-run-httpd\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.059405 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89c11707-c683-413e-a6e5-40937a35dfd3-log-httpd\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.059506 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89c11707-c683-413e-a6e5-40937a35dfd3-run-httpd\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 
12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.064472 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.065340 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-scripts\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.065688 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-config-data\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.068427 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.069410 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.079888 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpwpz\" (UniqueName: \"kubernetes.io/projected/89c11707-c683-413e-a6e5-40937a35dfd3-kube-api-access-hpwpz\") pod 
\"ceilometer-0\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " pod="openstack/ceilometer-0" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.696488 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.711351 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.795214 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89c11707-c683-413e-a6e5-40937a35dfd3-log-httpd\") pod \"89c11707-c683-413e-a6e5-40937a35dfd3\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.795379 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-ceilometer-tls-certs\") pod \"89c11707-c683-413e-a6e5-40937a35dfd3\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.795440 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpwpz\" (UniqueName: \"kubernetes.io/projected/89c11707-c683-413e-a6e5-40937a35dfd3-kube-api-access-hpwpz\") pod \"89c11707-c683-413e-a6e5-40937a35dfd3\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.795501 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-sg-core-conf-yaml\") pod \"89c11707-c683-413e-a6e5-40937a35dfd3\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.795552 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-config-data\") pod \"89c11707-c683-413e-a6e5-40937a35dfd3\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.795618 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-combined-ca-bundle\") pod \"89c11707-c683-413e-a6e5-40937a35dfd3\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.795676 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89c11707-c683-413e-a6e5-40937a35dfd3-run-httpd\") pod \"89c11707-c683-413e-a6e5-40937a35dfd3\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.795754 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89c11707-c683-413e-a6e5-40937a35dfd3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "89c11707-c683-413e-a6e5-40937a35dfd3" (UID: "89c11707-c683-413e-a6e5-40937a35dfd3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.795839 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-scripts\") pod \"89c11707-c683-413e-a6e5-40937a35dfd3\" (UID: \"89c11707-c683-413e-a6e5-40937a35dfd3\") " Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.796442 4843 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89c11707-c683-413e-a6e5-40937a35dfd3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.798569 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89c11707-c683-413e-a6e5-40937a35dfd3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "89c11707-c683-413e-a6e5-40937a35dfd3" (UID: "89c11707-c683-413e-a6e5-40937a35dfd3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.801956 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89c11707-c683-413e-a6e5-40937a35dfd3" (UID: "89c11707-c683-413e-a6e5-40937a35dfd3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.802016 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "89c11707-c683-413e-a6e5-40937a35dfd3" (UID: "89c11707-c683-413e-a6e5-40937a35dfd3"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.803621 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "89c11707-c683-413e-a6e5-40937a35dfd3" (UID: "89c11707-c683-413e-a6e5-40937a35dfd3"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.804004 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-scripts" (OuterVolumeSpecName: "scripts") pod "89c11707-c683-413e-a6e5-40937a35dfd3" (UID: "89c11707-c683-413e-a6e5-40937a35dfd3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.805794 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89c11707-c683-413e-a6e5-40937a35dfd3-kube-api-access-hpwpz" (OuterVolumeSpecName: "kube-api-access-hpwpz") pod "89c11707-c683-413e-a6e5-40937a35dfd3" (UID: "89c11707-c683-413e-a6e5-40937a35dfd3"). InnerVolumeSpecName "kube-api-access-hpwpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.805873 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-config-data" (OuterVolumeSpecName: "config-data") pod "89c11707-c683-413e-a6e5-40937a35dfd3" (UID: "89c11707-c683-413e-a6e5-40937a35dfd3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.898807 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.898861 4843 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.898885 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpwpz\" (UniqueName: \"kubernetes.io/projected/89c11707-c683-413e-a6e5-40937a35dfd3-kube-api-access-hpwpz\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.898901 4843 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.898919 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.898936 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c11707-c683-413e-a6e5-40937a35dfd3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.898952 4843 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89c11707-c683-413e-a6e5-40937a35dfd3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:36 crc kubenswrapper[4843]: I0318 12:35:36.998446 4843 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fde046e-d3a2-4c52-840b-b8b5f764b80d" path="/var/lib/kubelet/pods/7fde046e-d3a2-4c52-840b-b8b5f764b80d/volumes" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.707514 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.765119 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.779798 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.789078 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.791417 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.794736 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.795123 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.795618 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.820074 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/948691c5-0a85-451f-931d-2ab2108c1736-log-httpd\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.820141 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/948691c5-0a85-451f-931d-2ab2108c1736-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.820337 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/948691c5-0a85-451f-931d-2ab2108c1736-scripts\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.820569 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/948691c5-0a85-451f-931d-2ab2108c1736-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.820687 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948691c5-0a85-451f-931d-2ab2108c1736-config-data\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.820743 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/948691c5-0a85-451f-931d-2ab2108c1736-run-httpd\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.820764 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v49w2\" (UniqueName: \"kubernetes.io/projected/948691c5-0a85-451f-931d-2ab2108c1736-kube-api-access-v49w2\") pod 
\"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.820794 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/948691c5-0a85-451f-931d-2ab2108c1736-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.857033 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.934388 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/948691c5-0a85-451f-931d-2ab2108c1736-scripts\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.934551 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/948691c5-0a85-451f-931d-2ab2108c1736-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.934671 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948691c5-0a85-451f-931d-2ab2108c1736-config-data\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.934765 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/948691c5-0a85-451f-931d-2ab2108c1736-run-httpd\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " 
pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.934818 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v49w2\" (UniqueName: \"kubernetes.io/projected/948691c5-0a85-451f-931d-2ab2108c1736-kube-api-access-v49w2\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.934886 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/948691c5-0a85-451f-931d-2ab2108c1736-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.934958 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/948691c5-0a85-451f-931d-2ab2108c1736-log-httpd\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.935015 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/948691c5-0a85-451f-931d-2ab2108c1736-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.936544 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/948691c5-0a85-451f-931d-2ab2108c1736-run-httpd\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.937159 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/948691c5-0a85-451f-931d-2ab2108c1736-log-httpd\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.939492 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/948691c5-0a85-451f-931d-2ab2108c1736-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.941211 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/948691c5-0a85-451f-931d-2ab2108c1736-scripts\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.945647 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/948691c5-0a85-451f-931d-2ab2108c1736-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.945861 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948691c5-0a85-451f-931d-2ab2108c1736-config-data\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.950269 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/948691c5-0a85-451f-931d-2ab2108c1736-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:37 crc kubenswrapper[4843]: I0318 12:35:37.955844 4843 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v49w2\" (UniqueName: \"kubernetes.io/projected/948691c5-0a85-451f-931d-2ab2108c1736-kube-api-access-v49w2\") pod \"ceilometer-0\" (UID: \"948691c5-0a85-451f-931d-2ab2108c1736\") " pod="openstack/ceilometer-0" Mar 18 12:35:38 crc kubenswrapper[4843]: I0318 12:35:38.110811 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:38 crc kubenswrapper[4843]: I0318 12:35:38.275602 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 12:35:38 crc kubenswrapper[4843]: I0318 12:35:38.275687 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 12:35:38 crc kubenswrapper[4843]: I0318 12:35:38.334936 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.495783 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bd86a17-2883-4f89-b3a6-5b31461fca9d-logs\") pod \"4bd86a17-2883-4f89-b3a6-5b31461fca9d\" (UID: \"4bd86a17-2883-4f89-b3a6-5b31461fca9d\") " Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.496104 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnqwf\" (UniqueName: \"kubernetes.io/projected/4bd86a17-2883-4f89-b3a6-5b31461fca9d-kube-api-access-wnqwf\") pod \"4bd86a17-2883-4f89-b3a6-5b31461fca9d\" (UID: \"4bd86a17-2883-4f89-b3a6-5b31461fca9d\") " Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.496141 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd86a17-2883-4f89-b3a6-5b31461fca9d-config-data\") pod \"4bd86a17-2883-4f89-b3a6-5b31461fca9d\" (UID: 
\"4bd86a17-2883-4f89-b3a6-5b31461fca9d\") " Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.496208 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd86a17-2883-4f89-b3a6-5b31461fca9d-combined-ca-bundle\") pod \"4bd86a17-2883-4f89-b3a6-5b31461fca9d\" (UID: \"4bd86a17-2883-4f89-b3a6-5b31461fca9d\") " Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.496343 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bd86a17-2883-4f89-b3a6-5b31461fca9d-logs" (OuterVolumeSpecName: "logs") pod "4bd86a17-2883-4f89-b3a6-5b31461fca9d" (UID: "4bd86a17-2883-4f89-b3a6-5b31461fca9d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.496770 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bd86a17-2883-4f89-b3a6-5b31461fca9d-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.503086 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bd86a17-2883-4f89-b3a6-5b31461fca9d-kube-api-access-wnqwf" (OuterVolumeSpecName: "kube-api-access-wnqwf") pod "4bd86a17-2883-4f89-b3a6-5b31461fca9d" (UID: "4bd86a17-2883-4f89-b3a6-5b31461fca9d"). InnerVolumeSpecName "kube-api-access-wnqwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.541917 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd86a17-2883-4f89-b3a6-5b31461fca9d-config-data" (OuterVolumeSpecName: "config-data") pod "4bd86a17-2883-4f89-b3a6-5b31461fca9d" (UID: "4bd86a17-2883-4f89-b3a6-5b31461fca9d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.544797 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd86a17-2883-4f89-b3a6-5b31461fca9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bd86a17-2883-4f89-b3a6-5b31461fca9d" (UID: "4bd86a17-2883-4f89-b3a6-5b31461fca9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.600511 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnqwf\" (UniqueName: \"kubernetes.io/projected/4bd86a17-2883-4f89-b3a6-5b31461fca9d-kube-api-access-wnqwf\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.600541 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd86a17-2883-4f89-b3a6-5b31461fca9d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.600550 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd86a17-2883-4f89-b3a6-5b31461fca9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.627970 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:39 crc kubenswrapper[4843]: W0318 12:35:38.640506 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod948691c5_0a85_451f_931d_2ab2108c1736.slice/crio-850ac09d1a46dd92785efdccd63657a6518cf9120eab4bdf91af67b074f85215 WatchSource:0}: Error finding container 850ac09d1a46dd92785efdccd63657a6518cf9120eab4bdf91af67b074f85215: Status 404 returned error can't find the container with id 
850ac09d1a46dd92785efdccd63657a6518cf9120eab4bdf91af67b074f85215 Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.649955 4843 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.727981 4843 generic.go:334] "Generic (PLEG): container finished" podID="4bd86a17-2883-4f89-b3a6-5b31461fca9d" containerID="340b8af3929bb75a6b1f6b623a8973de3c620f453f98d9d32d7bf6b8c06d93f7" exitCode=0 Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.728062 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4bd86a17-2883-4f89-b3a6-5b31461fca9d","Type":"ContainerDied","Data":"340b8af3929bb75a6b1f6b623a8973de3c620f453f98d9d32d7bf6b8c06d93f7"} Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.728092 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4bd86a17-2883-4f89-b3a6-5b31461fca9d","Type":"ContainerDied","Data":"18f80865f3cb2859872c9ec939afd56ffd7bdd418cd4b33c5a6f0d2d615f4a23"} Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.728112 4843 scope.go:117] "RemoveContainer" containerID="340b8af3929bb75a6b1f6b623a8973de3c620f453f98d9d32d7bf6b8c06d93f7" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.728297 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.733031 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"948691c5-0a85-451f-931d-2ab2108c1736","Type":"ContainerStarted","Data":"850ac09d1a46dd92785efdccd63657a6518cf9120eab4bdf91af67b074f85215"} Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.765805 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.772106 4843 scope.go:117] "RemoveContainer" containerID="b954ff01335a0320b58c93700008cda691967f54fdef2d51509b04d47b9e44b5" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.773534 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.791613 4843 scope.go:117] "RemoveContainer" containerID="340b8af3929bb75a6b1f6b623a8973de3c620f453f98d9d32d7bf6b8c06d93f7" Mar 18 12:35:39 crc kubenswrapper[4843]: E0318 12:35:38.793694 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"340b8af3929bb75a6b1f6b623a8973de3c620f453f98d9d32d7bf6b8c06d93f7\": container with ID starting with 340b8af3929bb75a6b1f6b623a8973de3c620f453f98d9d32d7bf6b8c06d93f7 not found: ID does not exist" containerID="340b8af3929bb75a6b1f6b623a8973de3c620f453f98d9d32d7bf6b8c06d93f7" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.793722 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340b8af3929bb75a6b1f6b623a8973de3c620f453f98d9d32d7bf6b8c06d93f7"} err="failed to get container status \"340b8af3929bb75a6b1f6b623a8973de3c620f453f98d9d32d7bf6b8c06d93f7\": rpc error: code = NotFound desc = could not find container \"340b8af3929bb75a6b1f6b623a8973de3c620f453f98d9d32d7bf6b8c06d93f7\": container with ID starting with 
340b8af3929bb75a6b1f6b623a8973de3c620f453f98d9d32d7bf6b8c06d93f7 not found: ID does not exist" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.793741 4843 scope.go:117] "RemoveContainer" containerID="b954ff01335a0320b58c93700008cda691967f54fdef2d51509b04d47b9e44b5" Mar 18 12:35:39 crc kubenswrapper[4843]: E0318 12:35:38.794170 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b954ff01335a0320b58c93700008cda691967f54fdef2d51509b04d47b9e44b5\": container with ID starting with b954ff01335a0320b58c93700008cda691967f54fdef2d51509b04d47b9e44b5 not found: ID does not exist" containerID="b954ff01335a0320b58c93700008cda691967f54fdef2d51509b04d47b9e44b5" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.794188 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b954ff01335a0320b58c93700008cda691967f54fdef2d51509b04d47b9e44b5"} err="failed to get container status \"b954ff01335a0320b58c93700008cda691967f54fdef2d51509b04d47b9e44b5\": rpc error: code = NotFound desc = could not find container \"b954ff01335a0320b58c93700008cda691967f54fdef2d51509b04d47b9e44b5\": container with ID starting with b954ff01335a0320b58c93700008cda691967f54fdef2d51509b04d47b9e44b5 not found: ID does not exist" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.796864 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:39 crc kubenswrapper[4843]: E0318 12:35:38.797353 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd86a17-2883-4f89-b3a6-5b31461fca9d" containerName="nova-api-log" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.797364 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd86a17-2883-4f89-b3a6-5b31461fca9d" containerName="nova-api-log" Mar 18 12:35:39 crc kubenswrapper[4843]: E0318 12:35:38.797384 4843 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4bd86a17-2883-4f89-b3a6-5b31461fca9d" containerName="nova-api-api" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.797390 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd86a17-2883-4f89-b3a6-5b31461fca9d" containerName="nova-api-api" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.797555 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bd86a17-2883-4f89-b3a6-5b31461fca9d" containerName="nova-api-api" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.797569 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bd86a17-2883-4f89-b3a6-5b31461fca9d" containerName="nova-api-log" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.798680 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.801955 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.802813 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.803540 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.826386 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.884937 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.906386 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.906477 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.906525 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-internal-tls-certs\") pod \"nova-api-0\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.906787 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwbrd\" (UniqueName: \"kubernetes.io/projected/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-kube-api-access-pwbrd\") pod \"nova-api-0\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.907051 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-logs\") pod \"nova-api-0\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.907105 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-config-data\") pod \"nova-api-0\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:38.907158 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.006420 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bd86a17-2883-4f89-b3a6-5b31461fca9d" path="/var/lib/kubelet/pods/4bd86a17-2883-4f89-b3a6-5b31461fca9d/volumes" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.007224 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89c11707-c683-413e-a6e5-40937a35dfd3" path="/var/lib/kubelet/pods/89c11707-c683-413e-a6e5-40937a35dfd3/volumes" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.008784 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwbrd\" (UniqueName: \"kubernetes.io/projected/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-kube-api-access-pwbrd\") pod \"nova-api-0\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.008919 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-logs\") pod \"nova-api-0\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.008949 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-config-data\") pod \"nova-api-0\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.008977 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-public-tls-certs\") pod \"nova-api-0\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " pod="openstack/nova-api-0" Mar 18 12:35:39 crc 
kubenswrapper[4843]: I0318 12:35:39.009041 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.009069 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-internal-tls-certs\") pod \"nova-api-0\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.009479 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-logs\") pod \"nova-api-0\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.018150 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-public-tls-certs\") pod \"nova-api-0\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.019216 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-config-data\") pod \"nova-api-0\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.020450 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.034053 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-internal-tls-certs\") pod \"nova-api-0\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.035151 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwbrd\" (UniqueName: \"kubernetes.io/projected/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-kube-api-access-pwbrd\") pod \"nova-api-0\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.118420 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.289851 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9c1f745e-fc29-43b9-b4da-67f2646fdd3f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.289865 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9c1f745e-fc29-43b9-b4da-67f2646fdd3f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.638088 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:39 crc kubenswrapper[4843]: W0318 12:35:39.642283 4843 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75755a4a_0e4a_4b03_9fd7_cd08bf2df080.slice/crio-190e1b99d93b2b02e4c639567d86eb93ae91cab60fbf9fe5a3e923f73668abc8 WatchSource:0}: Error finding container 190e1b99d93b2b02e4c639567d86eb93ae91cab60fbf9fe5a3e923f73668abc8: Status 404 returned error can't find the container with id 190e1b99d93b2b02e4c639567d86eb93ae91cab60fbf9fe5a3e923f73668abc8 Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.750321 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75755a4a-0e4a-4b03-9fd7-cd08bf2df080","Type":"ContainerStarted","Data":"190e1b99d93b2b02e4c639567d86eb93ae91cab60fbf9fe5a3e923f73668abc8"} Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.754076 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"948691c5-0a85-451f-931d-2ab2108c1736","Type":"ContainerStarted","Data":"a37c1a3fe28997318c0e1dc2b3d1aaf320bf9e53bdb83ac1030db266574cb4b8"} Mar 18 12:35:39 crc kubenswrapper[4843]: I0318 12:35:39.778776 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.004714 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-pt4mg"] Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.006070 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pt4mg" Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.008681 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.010825 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.014691 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pt4mg"] Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.076080 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pqg4\" (UniqueName: \"kubernetes.io/projected/7187efa4-74d7-4162-89d7-5b9368c1924e-kube-api-access-8pqg4\") pod \"nova-cell1-cell-mapping-pt4mg\" (UID: \"7187efa4-74d7-4162-89d7-5b9368c1924e\") " pod="openstack/nova-cell1-cell-mapping-pt4mg" Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.077116 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7187efa4-74d7-4162-89d7-5b9368c1924e-scripts\") pod \"nova-cell1-cell-mapping-pt4mg\" (UID: \"7187efa4-74d7-4162-89d7-5b9368c1924e\") " pod="openstack/nova-cell1-cell-mapping-pt4mg" Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.077386 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7187efa4-74d7-4162-89d7-5b9368c1924e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pt4mg\" (UID: \"7187efa4-74d7-4162-89d7-5b9368c1924e\") " pod="openstack/nova-cell1-cell-mapping-pt4mg" Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.078015 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7187efa4-74d7-4162-89d7-5b9368c1924e-config-data\") pod \"nova-cell1-cell-mapping-pt4mg\" (UID: \"7187efa4-74d7-4162-89d7-5b9368c1924e\") " pod="openstack/nova-cell1-cell-mapping-pt4mg" Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.180301 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7187efa4-74d7-4162-89d7-5b9368c1924e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pt4mg\" (UID: \"7187efa4-74d7-4162-89d7-5b9368c1924e\") " pod="openstack/nova-cell1-cell-mapping-pt4mg" Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.180359 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7187efa4-74d7-4162-89d7-5b9368c1924e-config-data\") pod \"nova-cell1-cell-mapping-pt4mg\" (UID: \"7187efa4-74d7-4162-89d7-5b9368c1924e\") " pod="openstack/nova-cell1-cell-mapping-pt4mg" Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.180459 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pqg4\" (UniqueName: \"kubernetes.io/projected/7187efa4-74d7-4162-89d7-5b9368c1924e-kube-api-access-8pqg4\") pod \"nova-cell1-cell-mapping-pt4mg\" (UID: \"7187efa4-74d7-4162-89d7-5b9368c1924e\") " pod="openstack/nova-cell1-cell-mapping-pt4mg" Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.180509 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7187efa4-74d7-4162-89d7-5b9368c1924e-scripts\") pod \"nova-cell1-cell-mapping-pt4mg\" (UID: \"7187efa4-74d7-4162-89d7-5b9368c1924e\") " pod="openstack/nova-cell1-cell-mapping-pt4mg" Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.185876 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7187efa4-74d7-4162-89d7-5b9368c1924e-config-data\") pod \"nova-cell1-cell-mapping-pt4mg\" (UID: \"7187efa4-74d7-4162-89d7-5b9368c1924e\") " pod="openstack/nova-cell1-cell-mapping-pt4mg" Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.187175 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7187efa4-74d7-4162-89d7-5b9368c1924e-scripts\") pod \"nova-cell1-cell-mapping-pt4mg\" (UID: \"7187efa4-74d7-4162-89d7-5b9368c1924e\") " pod="openstack/nova-cell1-cell-mapping-pt4mg" Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.187328 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7187efa4-74d7-4162-89d7-5b9368c1924e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pt4mg\" (UID: \"7187efa4-74d7-4162-89d7-5b9368c1924e\") " pod="openstack/nova-cell1-cell-mapping-pt4mg" Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.200156 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pqg4\" (UniqueName: \"kubernetes.io/projected/7187efa4-74d7-4162-89d7-5b9368c1924e-kube-api-access-8pqg4\") pod \"nova-cell1-cell-mapping-pt4mg\" (UID: \"7187efa4-74d7-4162-89d7-5b9368c1924e\") " pod="openstack/nova-cell1-cell-mapping-pt4mg" Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.449751 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pt4mg" Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.774427 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75755a4a-0e4a-4b03-9fd7-cd08bf2df080","Type":"ContainerStarted","Data":"40efe9c4a411d085cb0d3cb1a29f1989d59384b70d7eec6ec5aa290472e71bf4"} Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.774717 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75755a4a-0e4a-4b03-9fd7-cd08bf2df080","Type":"ContainerStarted","Data":"12e273fb0d689fe649f2e3bd3a282018083b212a609a7030238913d79b10e66f"} Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.778425 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"948691c5-0a85-451f-931d-2ab2108c1736","Type":"ContainerStarted","Data":"cb22e9db0243c8f5a7c52542879f156d2eb225e1098e0f5e63888410d9705fa2"} Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.802812 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.802789753 podStartE2EDuration="2.802789753s" podCreationTimestamp="2026-03-18 12:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:40.796941427 +0000 UTC m=+1574.512767031" watchObservedRunningTime="2026-03-18 12:35:40.802789753 +0000 UTC m=+1574.518615277" Mar 18 12:35:40 crc kubenswrapper[4843]: W0318 12:35:40.870166 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7187efa4_74d7_4162_89d7_5b9368c1924e.slice/crio-b565bef52f99854ed795365b257a5bd8ab8d9799678bd5d77c6b1b4a830e10c6 WatchSource:0}: Error finding container b565bef52f99854ed795365b257a5bd8ab8d9799678bd5d77c6b1b4a830e10c6: Status 404 returned error can't find the container with id 
b565bef52f99854ed795365b257a5bd8ab8d9799678bd5d77c6b1b4a830e10c6 Mar 18 12:35:40 crc kubenswrapper[4843]: I0318 12:35:40.871111 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pt4mg"] Mar 18 12:35:41 crc kubenswrapper[4843]: I0318 12:35:41.790955 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"948691c5-0a85-451f-931d-2ab2108c1736","Type":"ContainerStarted","Data":"d8cf12a2e876deb760a1006836ec4a4b7a1c7ca8c78622e6d275cae98c41dcda"} Mar 18 12:35:41 crc kubenswrapper[4843]: I0318 12:35:41.793064 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pt4mg" event={"ID":"7187efa4-74d7-4162-89d7-5b9368c1924e","Type":"ContainerStarted","Data":"52a1fe188bd855ee8cb9efe1e288e0c092fae75c0c11b7bc53e2cb23a689c157"} Mar 18 12:35:41 crc kubenswrapper[4843]: I0318 12:35:41.793126 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pt4mg" event={"ID":"7187efa4-74d7-4162-89d7-5b9368c1924e","Type":"ContainerStarted","Data":"b565bef52f99854ed795365b257a5bd8ab8d9799678bd5d77c6b1b4a830e10c6"} Mar 18 12:35:41 crc kubenswrapper[4843]: I0318 12:35:41.808883 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-pt4mg" podStartSLOduration=2.808860773 podStartE2EDuration="2.808860773s" podCreationTimestamp="2026-03-18 12:35:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:41.807633268 +0000 UTC m=+1575.523458792" watchObservedRunningTime="2026-03-18 12:35:41.808860773 +0000 UTC m=+1575.524686297" Mar 18 12:35:42 crc kubenswrapper[4843]: I0318 12:35:42.212830 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:35:42 crc kubenswrapper[4843]: I0318 12:35:42.305155 4843 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-tldf6"] Mar 18 12:35:42 crc kubenswrapper[4843]: I0318 12:35:42.305393 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-tldf6" podUID="6dd87f99-b9bf-481e-87f4-219e09ca9998" containerName="dnsmasq-dns" containerID="cri-o://5379aa099ec89e4fd5152c0fbe5ad04b4ed7e88fd9118c862f55f1c0883007a3" gracePeriod=10 Mar 18 12:35:42 crc kubenswrapper[4843]: I0318 12:35:42.897007 4843 generic.go:334] "Generic (PLEG): container finished" podID="6dd87f99-b9bf-481e-87f4-219e09ca9998" containerID="5379aa099ec89e4fd5152c0fbe5ad04b4ed7e88fd9118c862f55f1c0883007a3" exitCode=0 Mar 18 12:35:42 crc kubenswrapper[4843]: I0318 12:35:42.898089 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-tldf6" event={"ID":"6dd87f99-b9bf-481e-87f4-219e09ca9998","Type":"ContainerDied","Data":"5379aa099ec89e4fd5152c0fbe5ad04b4ed7e88fd9118c862f55f1c0883007a3"} Mar 18 12:35:42 crc kubenswrapper[4843]: I0318 12:35:42.898117 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-tldf6" event={"ID":"6dd87f99-b9bf-481e-87f4-219e09ca9998","Type":"ContainerDied","Data":"db6d1b4e0d09efa9814cf3f5145e9ecc6677cf1421f099d45bbe92574af49d4b"} Mar 18 12:35:42 crc kubenswrapper[4843]: I0318 12:35:42.898127 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db6d1b4e0d09efa9814cf3f5145e9ecc6677cf1421f099d45bbe92574af49d4b" Mar 18 12:35:42 crc kubenswrapper[4843]: I0318 12:35:42.967819 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-tldf6" Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.080169 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cvj7\" (UniqueName: \"kubernetes.io/projected/6dd87f99-b9bf-481e-87f4-219e09ca9998-kube-api-access-4cvj7\") pod \"6dd87f99-b9bf-481e-87f4-219e09ca9998\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.080313 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-ovsdbserver-sb\") pod \"6dd87f99-b9bf-481e-87f4-219e09ca9998\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.080364 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-dns-svc\") pod \"6dd87f99-b9bf-481e-87f4-219e09ca9998\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.080427 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-ovsdbserver-nb\") pod \"6dd87f99-b9bf-481e-87f4-219e09ca9998\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.080496 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-dns-swift-storage-0\") pod \"6dd87f99-b9bf-481e-87f4-219e09ca9998\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.080521 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-config\") pod \"6dd87f99-b9bf-481e-87f4-219e09ca9998\" (UID: \"6dd87f99-b9bf-481e-87f4-219e09ca9998\") " Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.103991 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dd87f99-b9bf-481e-87f4-219e09ca9998-kube-api-access-4cvj7" (OuterVolumeSpecName: "kube-api-access-4cvj7") pod "6dd87f99-b9bf-481e-87f4-219e09ca9998" (UID: "6dd87f99-b9bf-481e-87f4-219e09ca9998"). InnerVolumeSpecName "kube-api-access-4cvj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.136607 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6dd87f99-b9bf-481e-87f4-219e09ca9998" (UID: "6dd87f99-b9bf-481e-87f4-219e09ca9998"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.138456 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6dd87f99-b9bf-481e-87f4-219e09ca9998" (UID: "6dd87f99-b9bf-481e-87f4-219e09ca9998"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.151724 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-config" (OuterVolumeSpecName: "config") pod "6dd87f99-b9bf-481e-87f4-219e09ca9998" (UID: "6dd87f99-b9bf-481e-87f4-219e09ca9998"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.167268 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6dd87f99-b9bf-481e-87f4-219e09ca9998" (UID: "6dd87f99-b9bf-481e-87f4-219e09ca9998"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.167486 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6dd87f99-b9bf-481e-87f4-219e09ca9998" (UID: "6dd87f99-b9bf-481e-87f4-219e09ca9998"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.183329 4843 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.183361 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.183371 4843 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.183385 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-config\") on node \"crc\" 
DevicePath \"\"" Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.183395 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cvj7\" (UniqueName: \"kubernetes.io/projected/6dd87f99-b9bf-481e-87f4-219e09ca9998-kube-api-access-4cvj7\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.183403 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dd87f99-b9bf-481e-87f4-219e09ca9998-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.913724 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"948691c5-0a85-451f-931d-2ab2108c1736","Type":"ContainerStarted","Data":"ce3dfac69b7eee2ed8e0c14e02a6ef4eaa4da87aaf81c5049edd9b5ebbeaf2da"} Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.914060 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.915592 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-tldf6" Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.947890 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.308890276 podStartE2EDuration="6.947865632s" podCreationTimestamp="2026-03-18 12:35:37 +0000 UTC" firstStartedPulling="2026-03-18 12:35:38.649428086 +0000 UTC m=+1572.365253620" lastFinishedPulling="2026-03-18 12:35:43.288403452 +0000 UTC m=+1577.004228976" observedRunningTime="2026-03-18 12:35:43.940457592 +0000 UTC m=+1577.656283116" watchObservedRunningTime="2026-03-18 12:35:43.947865632 +0000 UTC m=+1577.663691156" Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.978727 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-tldf6"] Mar 18 12:35:43 crc kubenswrapper[4843]: I0318 12:35:43.991870 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-tldf6"] Mar 18 12:35:44 crc kubenswrapper[4843]: I0318 12:35:44.993731 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dd87f99-b9bf-481e-87f4-219e09ca9998" path="/var/lib/kubelet/pods/6dd87f99-b9bf-481e-87f4-219e09ca9998/volumes" Mar 18 12:35:46 crc kubenswrapper[4843]: I0318 12:35:46.274387 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 12:35:46 crc kubenswrapper[4843]: I0318 12:35:46.274850 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 12:35:46 crc kubenswrapper[4843]: I0318 12:35:46.944968 4843 generic.go:334] "Generic (PLEG): container finished" podID="7187efa4-74d7-4162-89d7-5b9368c1924e" containerID="52a1fe188bd855ee8cb9efe1e288e0c092fae75c0c11b7bc53e2cb23a689c157" exitCode=0 Mar 18 12:35:46 crc kubenswrapper[4843]: I0318 12:35:46.945047 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-cell-mapping-pt4mg" event={"ID":"7187efa4-74d7-4162-89d7-5b9368c1924e","Type":"ContainerDied","Data":"52a1fe188bd855ee8cb9efe1e288e0c092fae75c0c11b7bc53e2cb23a689c157"} Mar 18 12:35:48 crc kubenswrapper[4843]: I0318 12:35:48.280403 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 12:35:48 crc kubenswrapper[4843]: I0318 12:35:48.281359 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 12:35:48 crc kubenswrapper[4843]: I0318 12:35:48.286253 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 12:35:48 crc kubenswrapper[4843]: I0318 12:35:48.425454 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pt4mg" Mar 18 12:35:48 crc kubenswrapper[4843]: I0318 12:35:48.451554 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7187efa4-74d7-4162-89d7-5b9368c1924e-scripts\") pod \"7187efa4-74d7-4162-89d7-5b9368c1924e\" (UID: \"7187efa4-74d7-4162-89d7-5b9368c1924e\") " Mar 18 12:35:48 crc kubenswrapper[4843]: I0318 12:35:48.451761 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7187efa4-74d7-4162-89d7-5b9368c1924e-config-data\") pod \"7187efa4-74d7-4162-89d7-5b9368c1924e\" (UID: \"7187efa4-74d7-4162-89d7-5b9368c1924e\") " Mar 18 12:35:48 crc kubenswrapper[4843]: I0318 12:35:48.451841 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pqg4\" (UniqueName: \"kubernetes.io/projected/7187efa4-74d7-4162-89d7-5b9368c1924e-kube-api-access-8pqg4\") pod \"7187efa4-74d7-4162-89d7-5b9368c1924e\" (UID: \"7187efa4-74d7-4162-89d7-5b9368c1924e\") " Mar 18 12:35:48 crc kubenswrapper[4843]: I0318 
12:35:48.451936 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7187efa4-74d7-4162-89d7-5b9368c1924e-combined-ca-bundle\") pod \"7187efa4-74d7-4162-89d7-5b9368c1924e\" (UID: \"7187efa4-74d7-4162-89d7-5b9368c1924e\") " Mar 18 12:35:48 crc kubenswrapper[4843]: I0318 12:35:48.459949 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7187efa4-74d7-4162-89d7-5b9368c1924e-kube-api-access-8pqg4" (OuterVolumeSpecName: "kube-api-access-8pqg4") pod "7187efa4-74d7-4162-89d7-5b9368c1924e" (UID: "7187efa4-74d7-4162-89d7-5b9368c1924e"). InnerVolumeSpecName "kube-api-access-8pqg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:48 crc kubenswrapper[4843]: I0318 12:35:48.463902 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7187efa4-74d7-4162-89d7-5b9368c1924e-scripts" (OuterVolumeSpecName: "scripts") pod "7187efa4-74d7-4162-89d7-5b9368c1924e" (UID: "7187efa4-74d7-4162-89d7-5b9368c1924e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:48 crc kubenswrapper[4843]: I0318 12:35:48.488956 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7187efa4-74d7-4162-89d7-5b9368c1924e-config-data" (OuterVolumeSpecName: "config-data") pod "7187efa4-74d7-4162-89d7-5b9368c1924e" (UID: "7187efa4-74d7-4162-89d7-5b9368c1924e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:48 crc kubenswrapper[4843]: I0318 12:35:48.498233 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7187efa4-74d7-4162-89d7-5b9368c1924e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7187efa4-74d7-4162-89d7-5b9368c1924e" (UID: "7187efa4-74d7-4162-89d7-5b9368c1924e"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:48 crc kubenswrapper[4843]: I0318 12:35:48.555215 4843 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7187efa4-74d7-4162-89d7-5b9368c1924e-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:48 crc kubenswrapper[4843]: I0318 12:35:48.555248 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7187efa4-74d7-4162-89d7-5b9368c1924e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:48 crc kubenswrapper[4843]: I0318 12:35:48.555260 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pqg4\" (UniqueName: \"kubernetes.io/projected/7187efa4-74d7-4162-89d7-5b9368c1924e-kube-api-access-8pqg4\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:48 crc kubenswrapper[4843]: I0318 12:35:48.555271 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7187efa4-74d7-4162-89d7-5b9368c1924e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:48 crc kubenswrapper[4843]: I0318 12:35:48.973506 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pt4mg" event={"ID":"7187efa4-74d7-4162-89d7-5b9368c1924e","Type":"ContainerDied","Data":"b565bef52f99854ed795365b257a5bd8ab8d9799678bd5d77c6b1b4a830e10c6"} Mar 18 12:35:48 crc kubenswrapper[4843]: I0318 12:35:48.973581 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b565bef52f99854ed795365b257a5bd8ab8d9799678bd5d77c6b1b4a830e10c6" Mar 18 12:35:48 crc kubenswrapper[4843]: I0318 12:35:48.973598 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pt4mg" Mar 18 12:35:49 crc kubenswrapper[4843]: I0318 12:35:49.018168 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 12:35:49 crc kubenswrapper[4843]: I0318 12:35:49.120533 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:35:49 crc kubenswrapper[4843]: I0318 12:35:49.120603 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:35:49 crc kubenswrapper[4843]: I0318 12:35:49.240702 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:35:49 crc kubenswrapper[4843]: I0318 12:35:49.240979 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="95d1f2b2-faa6-4614-8ad7-023e2285fefc" containerName="nova-scheduler-scheduler" containerID="cri-o://59d4aac0b4a0325d1dcedff9dcc41a05e8d9449a95458b2d7496241c6f532139" gracePeriod=30 Mar 18 12:35:49 crc kubenswrapper[4843]: I0318 12:35:49.268087 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:49 crc kubenswrapper[4843]: I0318 12:35:49.348962 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:35:49 crc kubenswrapper[4843]: E0318 12:35:49.682950 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59d4aac0b4a0325d1dcedff9dcc41a05e8d9449a95458b2d7496241c6f532139" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 12:35:49 crc kubenswrapper[4843]: E0318 12:35:49.684923 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" containerID="59d4aac0b4a0325d1dcedff9dcc41a05e8d9449a95458b2d7496241c6f532139" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 12:35:49 crc kubenswrapper[4843]: E0318 12:35:49.686734 4843 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59d4aac0b4a0325d1dcedff9dcc41a05e8d9449a95458b2d7496241c6f532139" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 12:35:49 crc kubenswrapper[4843]: E0318 12:35:49.686788 4843 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="95d1f2b2-faa6-4614-8ad7-023e2285fefc" containerName="nova-scheduler-scheduler" Mar 18 12:35:49 crc kubenswrapper[4843]: I0318 12:35:49.996023 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="75755a4a-0e4a-4b03-9fd7-cd08bf2df080" containerName="nova-api-log" containerID="cri-o://12e273fb0d689fe649f2e3bd3a282018083b212a609a7030238913d79b10e66f" gracePeriod=30 Mar 18 12:35:49 crc kubenswrapper[4843]: I0318 12:35:49.996309 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="75755a4a-0e4a-4b03-9fd7-cd08bf2df080" containerName="nova-api-api" containerID="cri-o://40efe9c4a411d085cb0d3cb1a29f1989d59384b70d7eec6ec5aa290472e71bf4" gracePeriod=30 Mar 18 12:35:50 crc kubenswrapper[4843]: I0318 12:35:50.002074 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="75755a4a-0e4a-4b03-9fd7-cd08bf2df080" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": EOF" Mar 18 12:35:50 crc kubenswrapper[4843]: I0318 12:35:50.010737 4843 prober.go:107] 
"Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="75755a4a-0e4a-4b03-9fd7-cd08bf2df080" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": EOF" Mar 18 12:35:51 crc kubenswrapper[4843]: I0318 12:35:51.006526 4843 generic.go:334] "Generic (PLEG): container finished" podID="75755a4a-0e4a-4b03-9fd7-cd08bf2df080" containerID="12e273fb0d689fe649f2e3bd3a282018083b212a609a7030238913d79b10e66f" exitCode=143 Mar 18 12:35:51 crc kubenswrapper[4843]: I0318 12:35:51.006678 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75755a4a-0e4a-4b03-9fd7-cd08bf2df080","Type":"ContainerDied","Data":"12e273fb0d689fe649f2e3bd3a282018083b212a609a7030238913d79b10e66f"} Mar 18 12:35:51 crc kubenswrapper[4843]: I0318 12:35:51.007003 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9c1f745e-fc29-43b9-b4da-67f2646fdd3f" containerName="nova-metadata-log" containerID="cri-o://fb810998036d9b269c6af23f6e143b270204a815169f2c30f096b154821941da" gracePeriod=30 Mar 18 12:35:51 crc kubenswrapper[4843]: I0318 12:35:51.007669 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9c1f745e-fc29-43b9-b4da-67f2646fdd3f" containerName="nova-metadata-metadata" containerID="cri-o://6a5d99fefab792a67220d4240c1983a015f87bc073bfed34f45a3e7d4c4e917e" gracePeriod=30 Mar 18 12:35:52 crc kubenswrapper[4843]: I0318 12:35:52.021530 4843 generic.go:334] "Generic (PLEG): container finished" podID="9c1f745e-fc29-43b9-b4da-67f2646fdd3f" containerID="fb810998036d9b269c6af23f6e143b270204a815169f2c30f096b154821941da" exitCode=143 Mar 18 12:35:52 crc kubenswrapper[4843]: I0318 12:35:52.022544 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"9c1f745e-fc29-43b9-b4da-67f2646fdd3f","Type":"ContainerDied","Data":"fb810998036d9b269c6af23f6e143b270204a815169f2c30f096b154821941da"} Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.046253 4843 generic.go:334] "Generic (PLEG): container finished" podID="95d1f2b2-faa6-4614-8ad7-023e2285fefc" containerID="59d4aac0b4a0325d1dcedff9dcc41a05e8d9449a95458b2d7496241c6f532139" exitCode=0 Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.046441 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"95d1f2b2-faa6-4614-8ad7-023e2285fefc","Type":"ContainerDied","Data":"59d4aac0b4a0325d1dcedff9dcc41a05e8d9449a95458b2d7496241c6f532139"} Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.046686 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"95d1f2b2-faa6-4614-8ad7-023e2285fefc","Type":"ContainerDied","Data":"c967654bb1c0a6fcc06a79555820de6aa55a05fc100a4f9c74f653545320c47a"} Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.046703 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c967654bb1c0a6fcc06a79555820de6aa55a05fc100a4f9c74f653545320c47a" Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.056395 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.198913 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d1f2b2-faa6-4614-8ad7-023e2285fefc-config-data\") pod \"95d1f2b2-faa6-4614-8ad7-023e2285fefc\" (UID: \"95d1f2b2-faa6-4614-8ad7-023e2285fefc\") " Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.198995 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-756p4\" (UniqueName: \"kubernetes.io/projected/95d1f2b2-faa6-4614-8ad7-023e2285fefc-kube-api-access-756p4\") pod \"95d1f2b2-faa6-4614-8ad7-023e2285fefc\" (UID: \"95d1f2b2-faa6-4614-8ad7-023e2285fefc\") " Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.199154 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d1f2b2-faa6-4614-8ad7-023e2285fefc-combined-ca-bundle\") pod \"95d1f2b2-faa6-4614-8ad7-023e2285fefc\" (UID: \"95d1f2b2-faa6-4614-8ad7-023e2285fefc\") " Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.206057 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95d1f2b2-faa6-4614-8ad7-023e2285fefc-kube-api-access-756p4" (OuterVolumeSpecName: "kube-api-access-756p4") pod "95d1f2b2-faa6-4614-8ad7-023e2285fefc" (UID: "95d1f2b2-faa6-4614-8ad7-023e2285fefc"). InnerVolumeSpecName "kube-api-access-756p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.230998 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d1f2b2-faa6-4614-8ad7-023e2285fefc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95d1f2b2-faa6-4614-8ad7-023e2285fefc" (UID: "95d1f2b2-faa6-4614-8ad7-023e2285fefc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.245504 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d1f2b2-faa6-4614-8ad7-023e2285fefc-config-data" (OuterVolumeSpecName: "config-data") pod "95d1f2b2-faa6-4614-8ad7-023e2285fefc" (UID: "95d1f2b2-faa6-4614-8ad7-023e2285fefc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.301419 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d1f2b2-faa6-4614-8ad7-023e2285fefc-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.301472 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-756p4\" (UniqueName: \"kubernetes.io/projected/95d1f2b2-faa6-4614-8ad7-023e2285fefc-kube-api-access-756p4\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.301488 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d1f2b2-faa6-4614-8ad7-023e2285fefc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.545754 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.789675 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brtjx\" (UniqueName: \"kubernetes.io/projected/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-kube-api-access-brtjx\") pod \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.789770 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-config-data\") pod \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.789803 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-nova-metadata-tls-certs\") pod \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.789821 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-combined-ca-bundle\") pod \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.789902 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-logs\") pod \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\" (UID: \"9c1f745e-fc29-43b9-b4da-67f2646fdd3f\") " Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.790820 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-logs" (OuterVolumeSpecName: "logs") pod "9c1f745e-fc29-43b9-b4da-67f2646fdd3f" (UID: "9c1f745e-fc29-43b9-b4da-67f2646fdd3f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.796525 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-kube-api-access-brtjx" (OuterVolumeSpecName: "kube-api-access-brtjx") pod "9c1f745e-fc29-43b9-b4da-67f2646fdd3f" (UID: "9c1f745e-fc29-43b9-b4da-67f2646fdd3f"). InnerVolumeSpecName "kube-api-access-brtjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.819383 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c1f745e-fc29-43b9-b4da-67f2646fdd3f" (UID: "9c1f745e-fc29-43b9-b4da-67f2646fdd3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.831909 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-config-data" (OuterVolumeSpecName: "config-data") pod "9c1f745e-fc29-43b9-b4da-67f2646fdd3f" (UID: "9c1f745e-fc29-43b9-b4da-67f2646fdd3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.837418 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9c1f745e-fc29-43b9-b4da-67f2646fdd3f" (UID: "9c1f745e-fc29-43b9-b4da-67f2646fdd3f"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.892642 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brtjx\" (UniqueName: \"kubernetes.io/projected/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-kube-api-access-brtjx\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.892686 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.892697 4843 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.892706 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:54 crc kubenswrapper[4843]: I0318 12:35:54.892714 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c1f745e-fc29-43b9-b4da-67f2646fdd3f-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.060362 4843 generic.go:334] "Generic (PLEG): container finished" podID="9c1f745e-fc29-43b9-b4da-67f2646fdd3f" containerID="6a5d99fefab792a67220d4240c1983a015f87bc073bfed34f45a3e7d4c4e917e" exitCode=0 Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.060487 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.060515 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9c1f745e-fc29-43b9-b4da-67f2646fdd3f","Type":"ContainerDied","Data":"6a5d99fefab792a67220d4240c1983a015f87bc073bfed34f45a3e7d4c4e917e"} Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.060598 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9c1f745e-fc29-43b9-b4da-67f2646fdd3f","Type":"ContainerDied","Data":"bad4452d9791869d4ab5cceb74ef17639332cd06800798aa14c51bd0a072bc35"} Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.060631 4843 scope.go:117] "RemoveContainer" containerID="6a5d99fefab792a67220d4240c1983a015f87bc073bfed34f45a3e7d4c4e917e" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.061550 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.100253 4843 scope.go:117] "RemoveContainer" containerID="fb810998036d9b269c6af23f6e143b270204a815169f2c30f096b154821941da" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.114452 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.141316 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.156028 4843 scope.go:117] "RemoveContainer" containerID="6a5d99fefab792a67220d4240c1983a015f87bc073bfed34f45a3e7d4c4e917e" Mar 18 12:35:55 crc kubenswrapper[4843]: E0318 12:35:55.156793 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a5d99fefab792a67220d4240c1983a015f87bc073bfed34f45a3e7d4c4e917e\": container with ID starting with 
6a5d99fefab792a67220d4240c1983a015f87bc073bfed34f45a3e7d4c4e917e not found: ID does not exist" containerID="6a5d99fefab792a67220d4240c1983a015f87bc073bfed34f45a3e7d4c4e917e" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.156862 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a5d99fefab792a67220d4240c1983a015f87bc073bfed34f45a3e7d4c4e917e"} err="failed to get container status \"6a5d99fefab792a67220d4240c1983a015f87bc073bfed34f45a3e7d4c4e917e\": rpc error: code = NotFound desc = could not find container \"6a5d99fefab792a67220d4240c1983a015f87bc073bfed34f45a3e7d4c4e917e\": container with ID starting with 6a5d99fefab792a67220d4240c1983a015f87bc073bfed34f45a3e7d4c4e917e not found: ID does not exist" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.156900 4843 scope.go:117] "RemoveContainer" containerID="fb810998036d9b269c6af23f6e143b270204a815169f2c30f096b154821941da" Mar 18 12:35:55 crc kubenswrapper[4843]: E0318 12:35:55.157458 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb810998036d9b269c6af23f6e143b270204a815169f2c30f096b154821941da\": container with ID starting with fb810998036d9b269c6af23f6e143b270204a815169f2c30f096b154821941da not found: ID does not exist" containerID="fb810998036d9b269c6af23f6e143b270204a815169f2c30f096b154821941da" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.157504 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb810998036d9b269c6af23f6e143b270204a815169f2c30f096b154821941da"} err="failed to get container status \"fb810998036d9b269c6af23f6e143b270204a815169f2c30f096b154821941da\": rpc error: code = NotFound desc = could not find container \"fb810998036d9b269c6af23f6e143b270204a815169f2c30f096b154821941da\": container with ID starting with fb810998036d9b269c6af23f6e143b270204a815169f2c30f096b154821941da not found: ID does not 
exist" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.161495 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.195822 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.216166 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:35:55 crc kubenswrapper[4843]: E0318 12:35:55.216796 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7187efa4-74d7-4162-89d7-5b9368c1924e" containerName="nova-manage" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.216817 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7187efa4-74d7-4162-89d7-5b9368c1924e" containerName="nova-manage" Mar 18 12:35:55 crc kubenswrapper[4843]: E0318 12:35:55.216835 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd87f99-b9bf-481e-87f4-219e09ca9998" containerName="init" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.216842 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd87f99-b9bf-481e-87f4-219e09ca9998" containerName="init" Mar 18 12:35:55 crc kubenswrapper[4843]: E0318 12:35:55.216869 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd87f99-b9bf-481e-87f4-219e09ca9998" containerName="dnsmasq-dns" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.216877 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd87f99-b9bf-481e-87f4-219e09ca9998" containerName="dnsmasq-dns" Mar 18 12:35:55 crc kubenswrapper[4843]: E0318 12:35:55.216900 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d1f2b2-faa6-4614-8ad7-023e2285fefc" containerName="nova-scheduler-scheduler" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.216920 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d1f2b2-faa6-4614-8ad7-023e2285fefc" 
containerName="nova-scheduler-scheduler" Mar 18 12:35:55 crc kubenswrapper[4843]: E0318 12:35:55.216932 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c1f745e-fc29-43b9-b4da-67f2646fdd3f" containerName="nova-metadata-log" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.216939 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c1f745e-fc29-43b9-b4da-67f2646fdd3f" containerName="nova-metadata-log" Mar 18 12:35:55 crc kubenswrapper[4843]: E0318 12:35:55.216950 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c1f745e-fc29-43b9-b4da-67f2646fdd3f" containerName="nova-metadata-metadata" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.216958 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c1f745e-fc29-43b9-b4da-67f2646fdd3f" containerName="nova-metadata-metadata" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.217233 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c1f745e-fc29-43b9-b4da-67f2646fdd3f" containerName="nova-metadata-metadata" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.217263 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c1f745e-fc29-43b9-b4da-67f2646fdd3f" containerName="nova-metadata-log" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.217284 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd87f99-b9bf-481e-87f4-219e09ca9998" containerName="dnsmasq-dns" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.217298 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="7187efa4-74d7-4162-89d7-5b9368c1924e" containerName="nova-manage" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.217309 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d1f2b2-faa6-4614-8ad7-023e2285fefc" containerName="nova-scheduler-scheduler" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.218199 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.221510 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.232145 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.248779 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.250639 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.255293 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.255260 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.259211 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.404002 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9465d55f-c883-49a5-b007-68821f953f6a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9465d55f-c883-49a5-b007-68821f953f6a\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.404368 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5zw6\" (UniqueName: \"kubernetes.io/projected/9465d55f-c883-49a5-b007-68821f953f6a-kube-api-access-k5zw6\") pod \"nova-metadata-0\" (UID: \"9465d55f-c883-49a5-b007-68821f953f6a\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc 
kubenswrapper[4843]: I0318 12:35:55.404442 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9465d55f-c883-49a5-b007-68821f953f6a-config-data\") pod \"nova-metadata-0\" (UID: \"9465d55f-c883-49a5-b007-68821f953f6a\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.404483 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8b8t\" (UniqueName: \"kubernetes.io/projected/ca3b8254-f0ce-4a83-9c8b-616800d7565f-kube-api-access-w8b8t\") pod \"nova-scheduler-0\" (UID: \"ca3b8254-f0ce-4a83-9c8b-616800d7565f\") " pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.405177 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9465d55f-c883-49a5-b007-68821f953f6a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9465d55f-c883-49a5-b007-68821f953f6a\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.405398 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3b8254-f0ce-4a83-9c8b-616800d7565f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ca3b8254-f0ce-4a83-9c8b-616800d7565f\") " pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.405501 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9465d55f-c883-49a5-b007-68821f953f6a-logs\") pod \"nova-metadata-0\" (UID: \"9465d55f-c883-49a5-b007-68821f953f6a\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.405599 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3b8254-f0ce-4a83-9c8b-616800d7565f-config-data\") pod \"nova-scheduler-0\" (UID: \"ca3b8254-f0ce-4a83-9c8b-616800d7565f\") " pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.507760 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5zw6\" (UniqueName: \"kubernetes.io/projected/9465d55f-c883-49a5-b007-68821f953f6a-kube-api-access-k5zw6\") pod \"nova-metadata-0\" (UID: \"9465d55f-c883-49a5-b007-68821f953f6a\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.507851 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9465d55f-c883-49a5-b007-68821f953f6a-config-data\") pod \"nova-metadata-0\" (UID: \"9465d55f-c883-49a5-b007-68821f953f6a\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.507891 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8b8t\" (UniqueName: \"kubernetes.io/projected/ca3b8254-f0ce-4a83-9c8b-616800d7565f-kube-api-access-w8b8t\") pod \"nova-scheduler-0\" (UID: \"ca3b8254-f0ce-4a83-9c8b-616800d7565f\") " pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.507925 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9465d55f-c883-49a5-b007-68821f953f6a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9465d55f-c883-49a5-b007-68821f953f6a\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.508016 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ca3b8254-f0ce-4a83-9c8b-616800d7565f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ca3b8254-f0ce-4a83-9c8b-616800d7565f\") " pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.508096 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9465d55f-c883-49a5-b007-68821f953f6a-logs\") pod \"nova-metadata-0\" (UID: \"9465d55f-c883-49a5-b007-68821f953f6a\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.508135 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3b8254-f0ce-4a83-9c8b-616800d7565f-config-data\") pod \"nova-scheduler-0\" (UID: \"ca3b8254-f0ce-4a83-9c8b-616800d7565f\") " pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.508193 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9465d55f-c883-49a5-b007-68821f953f6a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9465d55f-c883-49a5-b007-68821f953f6a\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.509815 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9465d55f-c883-49a5-b007-68821f953f6a-logs\") pod \"nova-metadata-0\" (UID: \"9465d55f-c883-49a5-b007-68821f953f6a\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.513139 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3b8254-f0ce-4a83-9c8b-616800d7565f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ca3b8254-f0ce-4a83-9c8b-616800d7565f\") " pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4843]: 
I0318 12:35:55.515928 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9465d55f-c883-49a5-b007-68821f953f6a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9465d55f-c883-49a5-b007-68821f953f6a\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.516120 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3b8254-f0ce-4a83-9c8b-616800d7565f-config-data\") pod \"nova-scheduler-0\" (UID: \"ca3b8254-f0ce-4a83-9c8b-616800d7565f\") " pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.516410 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9465d55f-c883-49a5-b007-68821f953f6a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9465d55f-c883-49a5-b007-68821f953f6a\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.526419 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9465d55f-c883-49a5-b007-68821f953f6a-config-data\") pod \"nova-metadata-0\" (UID: \"9465d55f-c883-49a5-b007-68821f953f6a\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.527093 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5zw6\" (UniqueName: \"kubernetes.io/projected/9465d55f-c883-49a5-b007-68821f953f6a-kube-api-access-k5zw6\") pod \"nova-metadata-0\" (UID: \"9465d55f-c883-49a5-b007-68821f953f6a\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.531893 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8b8t\" (UniqueName: 
\"kubernetes.io/projected/ca3b8254-f0ce-4a83-9c8b-616800d7565f-kube-api-access-w8b8t\") pod \"nova-scheduler-0\" (UID: \"ca3b8254-f0ce-4a83-9c8b-616800d7565f\") " pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.545220 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.573023 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4843]: I0318 12:35:55.924102 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: W0318 12:35:56.064610 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca3b8254_f0ce_4a83_9c8b_616800d7565f.slice/crio-eaeda4119ec8f8a80c1dd7b539f7378e3c5c494b5a6df7ead5d3781d4af06b48 WatchSource:0}: Error finding container eaeda4119ec8f8a80c1dd7b539f7378e3c5c494b5a6df7ead5d3781d4af06b48: Status 404 returned error can't find the container with id eaeda4119ec8f8a80c1dd7b539f7378e3c5c494b5a6df7ead5d3781d4af06b48 Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.065596 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.074379 4843 generic.go:334] "Generic (PLEG): container finished" podID="75755a4a-0e4a-4b03-9fd7-cd08bf2df080" containerID="40efe9c4a411d085cb0d3cb1a29f1989d59384b70d7eec6ec5aa290472e71bf4" exitCode=0 Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.074442 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.074440 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75755a4a-0e4a-4b03-9fd7-cd08bf2df080","Type":"ContainerDied","Data":"40efe9c4a411d085cb0d3cb1a29f1989d59384b70d7eec6ec5aa290472e71bf4"} Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.074596 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75755a4a-0e4a-4b03-9fd7-cd08bf2df080","Type":"ContainerDied","Data":"190e1b99d93b2b02e4c639567d86eb93ae91cab60fbf9fe5a3e923f73668abc8"} Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.074635 4843 scope.go:117] "RemoveContainer" containerID="40efe9c4a411d085cb0d3cb1a29f1989d59384b70d7eec6ec5aa290472e71bf4" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.121815 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-internal-tls-certs\") pod \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.121913 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-logs\") pod \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.122004 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwbrd\" (UniqueName: \"kubernetes.io/projected/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-kube-api-access-pwbrd\") pod \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.122046 4843 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-public-tls-certs\") pod \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.122072 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-combined-ca-bundle\") pod \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.122118 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-config-data\") pod \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\" (UID: \"75755a4a-0e4a-4b03-9fd7-cd08bf2df080\") " Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.124156 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-logs" (OuterVolumeSpecName: "logs") pod "75755a4a-0e4a-4b03-9fd7-cd08bf2df080" (UID: "75755a4a-0e4a-4b03-9fd7-cd08bf2df080"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.126216 4843 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.133051 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-kube-api-access-pwbrd" (OuterVolumeSpecName: "kube-api-access-pwbrd") pod "75755a4a-0e4a-4b03-9fd7-cd08bf2df080" (UID: "75755a4a-0e4a-4b03-9fd7-cd08bf2df080"). 
InnerVolumeSpecName "kube-api-access-pwbrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.161474 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-config-data" (OuterVolumeSpecName: "config-data") pod "75755a4a-0e4a-4b03-9fd7-cd08bf2df080" (UID: "75755a4a-0e4a-4b03-9fd7-cd08bf2df080"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.172044 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75755a4a-0e4a-4b03-9fd7-cd08bf2df080" (UID: "75755a4a-0e4a-4b03-9fd7-cd08bf2df080"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.174410 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.178345 4843 scope.go:117] "RemoveContainer" containerID="12e273fb0d689fe649f2e3bd3a282018083b212a609a7030238913d79b10e66f" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.201526 4843 scope.go:117] "RemoveContainer" containerID="40efe9c4a411d085cb0d3cb1a29f1989d59384b70d7eec6ec5aa290472e71bf4" Mar 18 12:35:56 crc kubenswrapper[4843]: E0318 12:35:56.202060 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40efe9c4a411d085cb0d3cb1a29f1989d59384b70d7eec6ec5aa290472e71bf4\": container with ID starting with 40efe9c4a411d085cb0d3cb1a29f1989d59384b70d7eec6ec5aa290472e71bf4 not found: ID does not exist" containerID="40efe9c4a411d085cb0d3cb1a29f1989d59384b70d7eec6ec5aa290472e71bf4" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 
12:35:56.202114 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40efe9c4a411d085cb0d3cb1a29f1989d59384b70d7eec6ec5aa290472e71bf4"} err="failed to get container status \"40efe9c4a411d085cb0d3cb1a29f1989d59384b70d7eec6ec5aa290472e71bf4\": rpc error: code = NotFound desc = could not find container \"40efe9c4a411d085cb0d3cb1a29f1989d59384b70d7eec6ec5aa290472e71bf4\": container with ID starting with 40efe9c4a411d085cb0d3cb1a29f1989d59384b70d7eec6ec5aa290472e71bf4 not found: ID does not exist" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.202143 4843 scope.go:117] "RemoveContainer" containerID="12e273fb0d689fe649f2e3bd3a282018083b212a609a7030238913d79b10e66f" Mar 18 12:35:56 crc kubenswrapper[4843]: E0318 12:35:56.202702 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12e273fb0d689fe649f2e3bd3a282018083b212a609a7030238913d79b10e66f\": container with ID starting with 12e273fb0d689fe649f2e3bd3a282018083b212a609a7030238913d79b10e66f not found: ID does not exist" containerID="12e273fb0d689fe649f2e3bd3a282018083b212a609a7030238913d79b10e66f" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.202770 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e273fb0d689fe649f2e3bd3a282018083b212a609a7030238913d79b10e66f"} err="failed to get container status \"12e273fb0d689fe649f2e3bd3a282018083b212a609a7030238913d79b10e66f\": rpc error: code = NotFound desc = could not find container \"12e273fb0d689fe649f2e3bd3a282018083b212a609a7030238913d79b10e66f\": container with ID starting with 12e273fb0d689fe649f2e3bd3a282018083b212a609a7030238913d79b10e66f not found: ID does not exist" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.204216 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-public-tls-certs" 
(OuterVolumeSpecName: "public-tls-certs") pod "75755a4a-0e4a-4b03-9fd7-cd08bf2df080" (UID: "75755a4a-0e4a-4b03-9fd7-cd08bf2df080"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.220691 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "75755a4a-0e4a-4b03-9fd7-cd08bf2df080" (UID: "75755a4a-0e4a-4b03-9fd7-cd08bf2df080"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.227408 4843 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.227565 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.227663 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.227742 4843 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.227855 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwbrd\" (UniqueName: \"kubernetes.io/projected/75755a4a-0e4a-4b03-9fd7-cd08bf2df080-kube-api-access-pwbrd\") on 
node \"crc\" DevicePath \"\"" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.414182 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.433946 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.501240 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:56 crc kubenswrapper[4843]: E0318 12:35:56.501822 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75755a4a-0e4a-4b03-9fd7-cd08bf2df080" containerName="nova-api-log" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.501844 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="75755a4a-0e4a-4b03-9fd7-cd08bf2df080" containerName="nova-api-log" Mar 18 12:35:56 crc kubenswrapper[4843]: E0318 12:35:56.501896 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75755a4a-0e4a-4b03-9fd7-cd08bf2df080" containerName="nova-api-api" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.501904 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="75755a4a-0e4a-4b03-9fd7-cd08bf2df080" containerName="nova-api-api" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.502153 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="75755a4a-0e4a-4b03-9fd7-cd08bf2df080" containerName="nova-api-api" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.502180 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="75755a4a-0e4a-4b03-9fd7-cd08bf2df080" containerName="nova-api-log" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.503550 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.508389 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.508847 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.509840 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.517240 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.533243 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1869df-ee8c-406d-99e1-2c63b2d2c7f3-public-tls-certs\") pod \"nova-api-0\" (UID: \"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3\") " pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.533310 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92hcv\" (UniqueName: \"kubernetes.io/projected/8d1869df-ee8c-406d-99e1-2c63b2d2c7f3-kube-api-access-92hcv\") pod \"nova-api-0\" (UID: \"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3\") " pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.533343 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1869df-ee8c-406d-99e1-2c63b2d2c7f3-config-data\") pod \"nova-api-0\" (UID: \"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3\") " pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.533385 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1869df-ee8c-406d-99e1-2c63b2d2c7f3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3\") " pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.533431 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d1869df-ee8c-406d-99e1-2c63b2d2c7f3-logs\") pod \"nova-api-0\" (UID: \"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3\") " pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.533446 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1869df-ee8c-406d-99e1-2c63b2d2c7f3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3\") " pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.634987 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92hcv\" (UniqueName: \"kubernetes.io/projected/8d1869df-ee8c-406d-99e1-2c63b2d2c7f3-kube-api-access-92hcv\") pod \"nova-api-0\" (UID: \"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3\") " pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.635066 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1869df-ee8c-406d-99e1-2c63b2d2c7f3-config-data\") pod \"nova-api-0\" (UID: \"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3\") " pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.635137 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1869df-ee8c-406d-99e1-2c63b2d2c7f3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3\") " 
pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.635219 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d1869df-ee8c-406d-99e1-2c63b2d2c7f3-logs\") pod \"nova-api-0\" (UID: \"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3\") " pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.635239 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1869df-ee8c-406d-99e1-2c63b2d2c7f3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3\") " pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.635322 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1869df-ee8c-406d-99e1-2c63b2d2c7f3-public-tls-certs\") pod \"nova-api-0\" (UID: \"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3\") " pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.636180 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d1869df-ee8c-406d-99e1-2c63b2d2c7f3-logs\") pod \"nova-api-0\" (UID: \"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3\") " pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.642146 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1869df-ee8c-406d-99e1-2c63b2d2c7f3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3\") " pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.642267 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8d1869df-ee8c-406d-99e1-2c63b2d2c7f3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3\") " pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.648831 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1869df-ee8c-406d-99e1-2c63b2d2c7f3-config-data\") pod \"nova-api-0\" (UID: \"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3\") " pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.648951 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1869df-ee8c-406d-99e1-2c63b2d2c7f3-public-tls-certs\") pod \"nova-api-0\" (UID: \"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3\") " pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.668125 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92hcv\" (UniqueName: \"kubernetes.io/projected/8d1869df-ee8c-406d-99e1-2c63b2d2c7f3-kube-api-access-92hcv\") pod \"nova-api-0\" (UID: \"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3\") " pod="openstack/nova-api-0" Mar 18 12:35:56 crc kubenswrapper[4843]: I0318 12:35:56.824003 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:35:57 crc kubenswrapper[4843]: I0318 12:35:57.011244 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75755a4a-0e4a-4b03-9fd7-cd08bf2df080" path="/var/lib/kubelet/pods/75755a4a-0e4a-4b03-9fd7-cd08bf2df080/volumes" Mar 18 12:35:57 crc kubenswrapper[4843]: I0318 12:35:57.012878 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95d1f2b2-faa6-4614-8ad7-023e2285fefc" path="/var/lib/kubelet/pods/95d1f2b2-faa6-4614-8ad7-023e2285fefc/volumes" Mar 18 12:35:57 crc kubenswrapper[4843]: I0318 12:35:57.013686 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c1f745e-fc29-43b9-b4da-67f2646fdd3f" path="/var/lib/kubelet/pods/9c1f745e-fc29-43b9-b4da-67f2646fdd3f/volumes" Mar 18 12:35:57 crc kubenswrapper[4843]: I0318 12:35:57.101711 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ca3b8254-f0ce-4a83-9c8b-616800d7565f","Type":"ContainerStarted","Data":"0d095796e9f7ebd1002f2eaf0fedf26e0a82c7d2851a0468ecec5cf320f13ccb"} Mar 18 12:35:57 crc kubenswrapper[4843]: I0318 12:35:57.101758 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ca3b8254-f0ce-4a83-9c8b-616800d7565f","Type":"ContainerStarted","Data":"eaeda4119ec8f8a80c1dd7b539f7378e3c5c494b5a6df7ead5d3781d4af06b48"} Mar 18 12:35:57 crc kubenswrapper[4843]: I0318 12:35:57.105663 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9465d55f-c883-49a5-b007-68821f953f6a","Type":"ContainerStarted","Data":"d5f7ed2a4e0e13aba2abbdf0c45988ebc7c5744eeb27f4fc82bf2b835d571fa1"} Mar 18 12:35:57 crc kubenswrapper[4843]: I0318 12:35:57.105694 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"9465d55f-c883-49a5-b007-68821f953f6a","Type":"ContainerStarted","Data":"2169807cf8a49e5d08ea91f38991385bec2af750484e99fcbcbd46c188a5910d"} Mar 18 12:35:57 crc kubenswrapper[4843]: I0318 12:35:57.105707 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9465d55f-c883-49a5-b007-68821f953f6a","Type":"ContainerStarted","Data":"a743f1f553156897ab64b7b067fb6d4a857efd0301583a812f0130d222354f7c"} Mar 18 12:35:57 crc kubenswrapper[4843]: I0318 12:35:57.129232 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.129212253 podStartE2EDuration="2.129212253s" podCreationTimestamp="2026-03-18 12:35:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:57.118676694 +0000 UTC m=+1590.834502218" watchObservedRunningTime="2026-03-18 12:35:57.129212253 +0000 UTC m=+1590.845037777" Mar 18 12:35:57 crc kubenswrapper[4843]: I0318 12:35:57.142779 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.142756537 podStartE2EDuration="2.142756537s" podCreationTimestamp="2026-03-18 12:35:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:57.142565652 +0000 UTC m=+1590.858391206" watchObservedRunningTime="2026-03-18 12:35:57.142756537 +0000 UTC m=+1590.858582061" Mar 18 12:35:57 crc kubenswrapper[4843]: I0318 12:35:57.321573 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:58 crc kubenswrapper[4843]: I0318 12:35:58.131273 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3","Type":"ContainerStarted","Data":"9410b1a482575c6d19f233be0c10028a49c93a3f4b845e934e97329069fb207a"} Mar 18 12:35:58 crc kubenswrapper[4843]: I0318 12:35:58.131311 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3","Type":"ContainerStarted","Data":"040da6057a094b7df553c0292346bd7673d96b31c9a39f48bfff9f5bf0771615"} Mar 18 12:35:58 crc kubenswrapper[4843]: I0318 12:35:58.131323 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d1869df-ee8c-406d-99e1-2c63b2d2c7f3","Type":"ContainerStarted","Data":"1d692abf1d9766521a0a4c5fe86aeb1cc9b1787c531ab26ea92f7b123fff0497"} Mar 18 12:35:58 crc kubenswrapper[4843]: I0318 12:35:58.166327 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.166309872 podStartE2EDuration="2.166309872s" podCreationTimestamp="2026-03-18 12:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:58.164697537 +0000 UTC m=+1591.880523061" watchObservedRunningTime="2026-03-18 12:35:58.166309872 +0000 UTC m=+1591.882135396" Mar 18 12:36:00 crc kubenswrapper[4843]: I0318 12:36:00.160301 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563956-sk8bs"] Mar 18 12:36:00 crc kubenswrapper[4843]: I0318 12:36:00.162141 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563956-sk8bs" Mar 18 12:36:00 crc kubenswrapper[4843]: I0318 12:36:00.165481 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:36:00 crc kubenswrapper[4843]: I0318 12:36:00.165686 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 12:36:00 crc kubenswrapper[4843]: I0318 12:36:00.165986 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:36:00 crc kubenswrapper[4843]: I0318 12:36:00.186842 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563956-sk8bs"] Mar 18 12:36:00 crc kubenswrapper[4843]: I0318 12:36:00.339147 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lj2h\" (UniqueName: \"kubernetes.io/projected/388f1788-0b98-40b9-8bbb-5c617ca97a3e-kube-api-access-2lj2h\") pod \"auto-csr-approver-29563956-sk8bs\" (UID: \"388f1788-0b98-40b9-8bbb-5c617ca97a3e\") " pod="openshift-infra/auto-csr-approver-29563956-sk8bs" Mar 18 12:36:00 crc kubenswrapper[4843]: I0318 12:36:00.441974 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lj2h\" (UniqueName: \"kubernetes.io/projected/388f1788-0b98-40b9-8bbb-5c617ca97a3e-kube-api-access-2lj2h\") pod \"auto-csr-approver-29563956-sk8bs\" (UID: \"388f1788-0b98-40b9-8bbb-5c617ca97a3e\") " pod="openshift-infra/auto-csr-approver-29563956-sk8bs" Mar 18 12:36:00 crc kubenswrapper[4843]: I0318 12:36:00.464272 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lj2h\" (UniqueName: \"kubernetes.io/projected/388f1788-0b98-40b9-8bbb-5c617ca97a3e-kube-api-access-2lj2h\") pod \"auto-csr-approver-29563956-sk8bs\" (UID: \"388f1788-0b98-40b9-8bbb-5c617ca97a3e\") " 
pod="openshift-infra/auto-csr-approver-29563956-sk8bs" Mar 18 12:36:00 crc kubenswrapper[4843]: I0318 12:36:00.489693 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563956-sk8bs" Mar 18 12:36:00 crc kubenswrapper[4843]: I0318 12:36:00.545963 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 12:36:00 crc kubenswrapper[4843]: I0318 12:36:00.967339 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563956-sk8bs"] Mar 18 12:36:00 crc kubenswrapper[4843]: W0318 12:36:00.979166 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod388f1788_0b98_40b9_8bbb_5c617ca97a3e.slice/crio-4fdc35a252c0ac89bbfc2deddf11cf6388a0197dae9d98155d9b461e42d5b62f WatchSource:0}: Error finding container 4fdc35a252c0ac89bbfc2deddf11cf6388a0197dae9d98155d9b461e42d5b62f: Status 404 returned error can't find the container with id 4fdc35a252c0ac89bbfc2deddf11cf6388a0197dae9d98155d9b461e42d5b62f Mar 18 12:36:01 crc kubenswrapper[4843]: I0318 12:36:01.183414 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563956-sk8bs" event={"ID":"388f1788-0b98-40b9-8bbb-5c617ca97a3e","Type":"ContainerStarted","Data":"4fdc35a252c0ac89bbfc2deddf11cf6388a0197dae9d98155d9b461e42d5b62f"} Mar 18 12:36:03 crc kubenswrapper[4843]: I0318 12:36:03.207426 4843 generic.go:334] "Generic (PLEG): container finished" podID="388f1788-0b98-40b9-8bbb-5c617ca97a3e" containerID="b7b445d9096539685fc2460f496f3352460c743a668f8c59b1e932c144bde140" exitCode=0 Mar 18 12:36:03 crc kubenswrapper[4843]: I0318 12:36:03.207526 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563956-sk8bs" 
event={"ID":"388f1788-0b98-40b9-8bbb-5c617ca97a3e","Type":"ContainerDied","Data":"b7b445d9096539685fc2460f496f3352460c743a668f8c59b1e932c144bde140"} Mar 18 12:36:04 crc kubenswrapper[4843]: I0318 12:36:04.662296 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563956-sk8bs" Mar 18 12:36:04 crc kubenswrapper[4843]: I0318 12:36:04.737387 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lj2h\" (UniqueName: \"kubernetes.io/projected/388f1788-0b98-40b9-8bbb-5c617ca97a3e-kube-api-access-2lj2h\") pod \"388f1788-0b98-40b9-8bbb-5c617ca97a3e\" (UID: \"388f1788-0b98-40b9-8bbb-5c617ca97a3e\") " Mar 18 12:36:04 crc kubenswrapper[4843]: I0318 12:36:04.760830 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388f1788-0b98-40b9-8bbb-5c617ca97a3e-kube-api-access-2lj2h" (OuterVolumeSpecName: "kube-api-access-2lj2h") pod "388f1788-0b98-40b9-8bbb-5c617ca97a3e" (UID: "388f1788-0b98-40b9-8bbb-5c617ca97a3e"). InnerVolumeSpecName "kube-api-access-2lj2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:04 crc kubenswrapper[4843]: I0318 12:36:04.840278 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lj2h\" (UniqueName: \"kubernetes.io/projected/388f1788-0b98-40b9-8bbb-5c617ca97a3e-kube-api-access-2lj2h\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:05 crc kubenswrapper[4843]: I0318 12:36:05.242986 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563956-sk8bs" event={"ID":"388f1788-0b98-40b9-8bbb-5c617ca97a3e","Type":"ContainerDied","Data":"4fdc35a252c0ac89bbfc2deddf11cf6388a0197dae9d98155d9b461e42d5b62f"} Mar 18 12:36:05 crc kubenswrapper[4843]: I0318 12:36:05.243027 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fdc35a252c0ac89bbfc2deddf11cf6388a0197dae9d98155d9b461e42d5b62f" Mar 18 12:36:05 crc kubenswrapper[4843]: I0318 12:36:05.243092 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563956-sk8bs" Mar 18 12:36:05 crc kubenswrapper[4843]: I0318 12:36:05.546253 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 12:36:05 crc kubenswrapper[4843]: I0318 12:36:05.574265 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 12:36:05 crc kubenswrapper[4843]: I0318 12:36:05.574336 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 12:36:05 crc kubenswrapper[4843]: I0318 12:36:05.597517 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 12:36:05 crc kubenswrapper[4843]: I0318 12:36:05.861931 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563950-8lgnx"] Mar 18 12:36:05 crc kubenswrapper[4843]: I0318 12:36:05.870853 
4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563950-8lgnx"] Mar 18 12:36:06 crc kubenswrapper[4843]: I0318 12:36:06.281489 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 12:36:06 crc kubenswrapper[4843]: I0318 12:36:06.585878 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9465d55f-c883-49a5-b007-68821f953f6a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.216:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 12:36:06 crc kubenswrapper[4843]: I0318 12:36:06.585907 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9465d55f-c883-49a5-b007-68821f953f6a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.216:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 12:36:06 crc kubenswrapper[4843]: I0318 12:36:06.824908 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:36:06 crc kubenswrapper[4843]: I0318 12:36:06.824981 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:36:06 crc kubenswrapper[4843]: I0318 12:36:06.997052 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="514b942b-9005-40b3-8671-8a765b1844d0" path="/var/lib/kubelet/pods/514b942b-9005-40b3-8671-8a765b1844d0/volumes" Mar 18 12:36:07 crc kubenswrapper[4843]: I0318 12:36:07.837882 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8d1869df-ee8c-406d-99e1-2c63b2d2c7f3" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.217:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 12:36:07 crc 
kubenswrapper[4843]: I0318 12:36:07.837882 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8d1869df-ee8c-406d-99e1-2c63b2d2c7f3" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.217:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 12:36:08 crc kubenswrapper[4843]: I0318 12:36:08.182552 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 18 12:36:13 crc kubenswrapper[4843]: I0318 12:36:13.573414 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 18 12:36:13 crc kubenswrapper[4843]: I0318 12:36:13.573973 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 18 12:36:14 crc kubenswrapper[4843]: I0318 12:36:14.825144 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 18 12:36:14 crc kubenswrapper[4843]: I0318 12:36:14.825459 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 18 12:36:15 crc kubenswrapper[4843]: I0318 12:36:15.757488 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 18 12:36:15 crc kubenswrapper[4843]: I0318 12:36:15.761086 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 18 12:36:15 crc kubenswrapper[4843]: I0318 12:36:15.766296 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 18 12:36:16 crc kubenswrapper[4843]: I0318 12:36:16.422123 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 18 12:36:16 crc kubenswrapper[4843]: I0318 12:36:16.835144 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 18 12:36:16 crc kubenswrapper[4843]: I0318 12:36:16.839850 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 18 12:36:16 crc kubenswrapper[4843]: I0318 12:36:16.847718 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 18 12:36:17 crc kubenswrapper[4843]: I0318 12:36:17.435444 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 18 12:36:20 crc kubenswrapper[4843]: I0318 12:36:20.035637 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:36:20 crc kubenswrapper[4843]: I0318 12:36:20.036501 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:36:26 crc kubenswrapper[4843]: I0318 12:36:26.500796 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 18 12:36:28 crc kubenswrapper[4843]: I0318 12:36:28.212623 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 18 12:36:33 crc kubenswrapper[4843]: I0318 12:36:33.246940 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="1c41f082-cf59-42b4-8314-64aace288dd1" containerName="rabbitmq" containerID="cri-o://482d98003089d6422095943abcc3acd42ac94a7de53b7495f1d7c5c9663fb80f" gracePeriod=604794
Mar 18 12:36:34 crc kubenswrapper[4843]: I0318 12:36:34.740216 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="257240a5-cc42-4354-9079-66e6de070b34" containerName="rabbitmq" containerID="cri-o://560a2e30ff8c9ca0f672bb737f8b8e51458216cac9b16f5096d1bb050b58d5dc" gracePeriod=604794
Mar 18 12:36:36 crc kubenswrapper[4843]: I0318 12:36:36.696901 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="257240a5-cc42-4354-9079-66e6de070b34" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused"
Mar 18 12:36:36 crc kubenswrapper[4843]: I0318 12:36:36.964671 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="1c41f082-cf59-42b4-8314-64aace288dd1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused"
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.593205 4843 generic.go:334] "Generic (PLEG): container finished" podID="1c41f082-cf59-42b4-8314-64aace288dd1" containerID="482d98003089d6422095943abcc3acd42ac94a7de53b7495f1d7c5c9663fb80f" exitCode=0
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.593480 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1c41f082-cf59-42b4-8314-64aace288dd1","Type":"ContainerDied","Data":"482d98003089d6422095943abcc3acd42ac94a7de53b7495f1d7c5c9663fb80f"}
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.874889 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.899806 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-confd\") pod \"1c41f082-cf59-42b4-8314-64aace288dd1\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") "
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.899867 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-erlang-cookie\") pod \"1c41f082-cf59-42b4-8314-64aace288dd1\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") "
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.899920 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-tls\") pod \"1c41f082-cf59-42b4-8314-64aace288dd1\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") "
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.899940 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"1c41f082-cf59-42b4-8314-64aace288dd1\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") "
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.900035 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c41f082-cf59-42b4-8314-64aace288dd1-pod-info\") pod \"1c41f082-cf59-42b4-8314-64aace288dd1\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") "
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.900103 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c41f082-cf59-42b4-8314-64aace288dd1-config-data\") pod \"1c41f082-cf59-42b4-8314-64aace288dd1\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") "
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.900135 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c41f082-cf59-42b4-8314-64aace288dd1-erlang-cookie-secret\") pod \"1c41f082-cf59-42b4-8314-64aace288dd1\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") "
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.900172 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppb4q\" (UniqueName: \"kubernetes.io/projected/1c41f082-cf59-42b4-8314-64aace288dd1-kube-api-access-ppb4q\") pod \"1c41f082-cf59-42b4-8314-64aace288dd1\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") "
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.900203 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c41f082-cf59-42b4-8314-64aace288dd1-server-conf\") pod \"1c41f082-cf59-42b4-8314-64aace288dd1\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") "
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.900278 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-plugins\") pod \"1c41f082-cf59-42b4-8314-64aace288dd1\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") "
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.900306 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c41f082-cf59-42b4-8314-64aace288dd1-plugins-conf\") pod \"1c41f082-cf59-42b4-8314-64aace288dd1\" (UID: \"1c41f082-cf59-42b4-8314-64aace288dd1\") "
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.901319 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c41f082-cf59-42b4-8314-64aace288dd1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1c41f082-cf59-42b4-8314-64aace288dd1" (UID: "1c41f082-cf59-42b4-8314-64aace288dd1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.902495 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1c41f082-cf59-42b4-8314-64aace288dd1" (UID: "1c41f082-cf59-42b4-8314-64aace288dd1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.907030 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1c41f082-cf59-42b4-8314-64aace288dd1-pod-info" (OuterVolumeSpecName: "pod-info") pod "1c41f082-cf59-42b4-8314-64aace288dd1" (UID: "1c41f082-cf59-42b4-8314-64aace288dd1"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.911113 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c41f082-cf59-42b4-8314-64aace288dd1-kube-api-access-ppb4q" (OuterVolumeSpecName: "kube-api-access-ppb4q") pod "1c41f082-cf59-42b4-8314-64aace288dd1" (UID: "1c41f082-cf59-42b4-8314-64aace288dd1"). InnerVolumeSpecName "kube-api-access-ppb4q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.911566 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1c41f082-cf59-42b4-8314-64aace288dd1" (UID: "1c41f082-cf59-42b4-8314-64aace288dd1"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.911625 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1c41f082-cf59-42b4-8314-64aace288dd1" (UID: "1c41f082-cf59-42b4-8314-64aace288dd1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.914858 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c41f082-cf59-42b4-8314-64aace288dd1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1c41f082-cf59-42b4-8314-64aace288dd1" (UID: "1c41f082-cf59-42b4-8314-64aace288dd1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.922753 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "1c41f082-cf59-42b4-8314-64aace288dd1" (UID: "1c41f082-cf59-42b4-8314-64aace288dd1"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 18 12:36:39 crc kubenswrapper[4843]: I0318 12:36:39.987662 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c41f082-cf59-42b4-8314-64aace288dd1-config-data" (OuterVolumeSpecName: "config-data") pod "1c41f082-cf59-42b4-8314-64aace288dd1" (UID: "1c41f082-cf59-42b4-8314-64aace288dd1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.003910 4843 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.003941 4843 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.003982 4843 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" "
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.003991 4843 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c41f082-cf59-42b4-8314-64aace288dd1-pod-info\") on node \"crc\" DevicePath \"\""
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.004000 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c41f082-cf59-42b4-8314-64aace288dd1-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.004008 4843 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c41f082-cf59-42b4-8314-64aace288dd1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.004017 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppb4q\" (UniqueName: \"kubernetes.io/projected/1c41f082-cf59-42b4-8314-64aace288dd1-kube-api-access-ppb4q\") on node \"crc\" DevicePath \"\""
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.004025 4843 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.004033 4843 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c41f082-cf59-42b4-8314-64aace288dd1-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.026596 4843 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.049243 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c41f082-cf59-42b4-8314-64aace288dd1-server-conf" (OuterVolumeSpecName: "server-conf") pod "1c41f082-cf59-42b4-8314-64aace288dd1" (UID: "1c41f082-cf59-42b4-8314-64aace288dd1"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.107910 4843 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\""
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.107983 4843 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c41f082-cf59-42b4-8314-64aace288dd1-server-conf\") on node \"crc\" DevicePath \"\""
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.582963 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1c41f082-cf59-42b4-8314-64aace288dd1" (UID: "1c41f082-cf59-42b4-8314-64aace288dd1"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.611287 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1c41f082-cf59-42b4-8314-64aace288dd1","Type":"ContainerDied","Data":"96f25e3bc4bf12b1cfb6fdf64233518801e344d3e8eca473ac443d6e8d832d6f"}
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.611705 4843 scope.go:117] "RemoveContainer" containerID="482d98003089d6422095943abcc3acd42ac94a7de53b7495f1d7c5c9663fb80f"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.611879 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.647323 4843 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c41f082-cf59-42b4-8314-64aace288dd1-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.677491 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.679000 4843 scope.go:117] "RemoveContainer" containerID="b2949077702fceb64cea3279ee20a1822ebb720ae15697187e10f706bad4d9b4"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.693121 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.706991 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 18 12:36:40 crc kubenswrapper[4843]: E0318 12:36:40.707552 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c41f082-cf59-42b4-8314-64aace288dd1" containerName="rabbitmq"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.707574 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c41f082-cf59-42b4-8314-64aace288dd1" containerName="rabbitmq"
Mar 18 12:36:40 crc kubenswrapper[4843]: E0318 12:36:40.707618 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388f1788-0b98-40b9-8bbb-5c617ca97a3e" containerName="oc"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.707625 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="388f1788-0b98-40b9-8bbb-5c617ca97a3e" containerName="oc"
Mar 18 12:36:40 crc kubenswrapper[4843]: E0318 12:36:40.707645 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c41f082-cf59-42b4-8314-64aace288dd1" containerName="setup-container"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.707667 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c41f082-cf59-42b4-8314-64aace288dd1" containerName="setup-container"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.707916 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="388f1788-0b98-40b9-8bbb-5c617ca97a3e" containerName="oc"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.707947 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c41f082-cf59-42b4-8314-64aace288dd1" containerName="rabbitmq"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.709252 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.713806 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.713992 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.714108 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ndvrz"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.714868 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.721235 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.722183 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.722335 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.728919 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.854347 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cvjw\" (UniqueName: \"kubernetes.io/projected/63d21391-4df5-4d15-a12d-7ac03c66194c-kube-api-access-9cvjw\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.854407 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/63d21391-4df5-4d15-a12d-7ac03c66194c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.854480 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/63d21391-4df5-4d15-a12d-7ac03c66194c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.854531 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63d21391-4df5-4d15-a12d-7ac03c66194c-config-data\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.854570 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63d21391-4df5-4d15-a12d-7ac03c66194c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.854626 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/63d21391-4df5-4d15-a12d-7ac03c66194c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.854746 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/63d21391-4df5-4d15-a12d-7ac03c66194c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.854780 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63d21391-4df5-4d15-a12d-7ac03c66194c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.854830 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.854868 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/63d21391-4df5-4d15-a12d-7ac03c66194c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.854897 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/63d21391-4df5-4d15-a12d-7ac03c66194c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.957593 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/63d21391-4df5-4d15-a12d-7ac03c66194c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.957879 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63d21391-4df5-4d15-a12d-7ac03c66194c-config-data\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.957979 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63d21391-4df5-4d15-a12d-7ac03c66194c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.958132 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/63d21391-4df5-4d15-a12d-7ac03c66194c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.958210 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/63d21391-4df5-4d15-a12d-7ac03c66194c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.958259 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63d21391-4df5-4d15-a12d-7ac03c66194c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.958398 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.958496 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/63d21391-4df5-4d15-a12d-7ac03c66194c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.958539 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/63d21391-4df5-4d15-a12d-7ac03c66194c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.958782 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cvjw\" (UniqueName: \"kubernetes.io/projected/63d21391-4df5-4d15-a12d-7ac03c66194c-kube-api-access-9cvjw\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.958822 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/63d21391-4df5-4d15-a12d-7ac03c66194c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.958836 4843 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.959260 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63d21391-4df5-4d15-a12d-7ac03c66194c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.960301 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/63d21391-4df5-4d15-a12d-7ac03c66194c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.960422 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/63d21391-4df5-4d15-a12d-7ac03c66194c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.961184 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63d21391-4df5-4d15-a12d-7ac03c66194c-config-data\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.961796 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/63d21391-4df5-4d15-a12d-7ac03c66194c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.964770 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/63d21391-4df5-4d15-a12d-7ac03c66194c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.971047 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/63d21391-4df5-4d15-a12d-7ac03c66194c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.985482 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cvjw\" (UniqueName: \"kubernetes.io/projected/63d21391-4df5-4d15-a12d-7ac03c66194c-kube-api-access-9cvjw\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.988356 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63d21391-4df5-4d15-a12d-7ac03c66194c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:40 crc kubenswrapper[4843]: I0318 12:36:40.991915 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/63d21391-4df5-4d15-a12d-7ac03c66194c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.011951 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c41f082-cf59-42b4-8314-64aace288dd1" path="/var/lib/kubelet/pods/1c41f082-cf59-42b4-8314-64aace288dd1/volumes"
Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.053530 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"63d21391-4df5-4d15-a12d-7ac03c66194c\") " pod="openstack/rabbitmq-server-0"
Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.058949 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.219778 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.367363 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2gkk\" (UniqueName: \"kubernetes.io/projected/257240a5-cc42-4354-9079-66e6de070b34-kube-api-access-r2gkk\") pod \"257240a5-cc42-4354-9079-66e6de070b34\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") "
Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.367425 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"257240a5-cc42-4354-9079-66e6de070b34\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") "
Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.367475 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-erlang-cookie\") pod \"257240a5-cc42-4354-9079-66e6de070b34\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") "
Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.367526 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/257240a5-cc42-4354-9079-66e6de070b34-config-data\") pod \"257240a5-cc42-4354-9079-66e6de070b34\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") "
Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.367564 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/257240a5-cc42-4354-9079-66e6de070b34-plugins-conf\") pod \"257240a5-cc42-4354-9079-66e6de070b34\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") "
Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.367610 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-confd\") pod \"257240a5-cc42-4354-9079-66e6de070b34\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") "
Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.367644 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/257240a5-cc42-4354-9079-66e6de070b34-server-conf\") pod \"257240a5-cc42-4354-9079-66e6de070b34\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") "
Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.367707 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-plugins\") pod \"257240a5-cc42-4354-9079-66e6de070b34\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") "
Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.367740 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/257240a5-cc42-4354-9079-66e6de070b34-erlang-cookie-secret\") pod \"257240a5-cc42-4354-9079-66e6de070b34\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") "
Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.367771 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-tls\") pod \"257240a5-cc42-4354-9079-66e6de070b34\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") "
Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.367835 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/257240a5-cc42-4354-9079-66e6de070b34-pod-info\") pod \"257240a5-cc42-4354-9079-66e6de070b34\" (UID: \"257240a5-cc42-4354-9079-66e6de070b34\") "
Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 
12:36:41.368344 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/257240a5-cc42-4354-9079-66e6de070b34-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "257240a5-cc42-4354-9079-66e6de070b34" (UID: "257240a5-cc42-4354-9079-66e6de070b34"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.368369 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "257240a5-cc42-4354-9079-66e6de070b34" (UID: "257240a5-cc42-4354-9079-66e6de070b34"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.368395 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "257240a5-cc42-4354-9079-66e6de070b34" (UID: "257240a5-cc42-4354-9079-66e6de070b34"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.375133 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "257240a5-cc42-4354-9079-66e6de070b34" (UID: "257240a5-cc42-4354-9079-66e6de070b34"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.375806 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/257240a5-cc42-4354-9079-66e6de070b34-pod-info" (OuterVolumeSpecName: "pod-info") pod "257240a5-cc42-4354-9079-66e6de070b34" (UID: "257240a5-cc42-4354-9079-66e6de070b34"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.375906 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/257240a5-cc42-4354-9079-66e6de070b34-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "257240a5-cc42-4354-9079-66e6de070b34" (UID: "257240a5-cc42-4354-9079-66e6de070b34"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.376028 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/257240a5-cc42-4354-9079-66e6de070b34-kube-api-access-r2gkk" (OuterVolumeSpecName: "kube-api-access-r2gkk") pod "257240a5-cc42-4354-9079-66e6de070b34" (UID: "257240a5-cc42-4354-9079-66e6de070b34"). InnerVolumeSpecName "kube-api-access-r2gkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.377088 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "257240a5-cc42-4354-9079-66e6de070b34" (UID: "257240a5-cc42-4354-9079-66e6de070b34"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.391753 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/257240a5-cc42-4354-9079-66e6de070b34-config-data" (OuterVolumeSpecName: "config-data") pod "257240a5-cc42-4354-9079-66e6de070b34" (UID: "257240a5-cc42-4354-9079-66e6de070b34"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.424583 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/257240a5-cc42-4354-9079-66e6de070b34-server-conf" (OuterVolumeSpecName: "server-conf") pod "257240a5-cc42-4354-9079-66e6de070b34" (UID: "257240a5-cc42-4354-9079-66e6de070b34"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.470499 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2gkk\" (UniqueName: \"kubernetes.io/projected/257240a5-cc42-4354-9079-66e6de070b34-kube-api-access-r2gkk\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.470557 4843 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.470570 4843 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.470581 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/257240a5-cc42-4354-9079-66e6de070b34-config-data\") on node \"crc\" DevicePath 
\"\"" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.470592 4843 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/257240a5-cc42-4354-9079-66e6de070b34-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.470600 4843 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/257240a5-cc42-4354-9079-66e6de070b34-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.470609 4843 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.470617 4843 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/257240a5-cc42-4354-9079-66e6de070b34-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.470627 4843 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.470637 4843 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/257240a5-cc42-4354-9079-66e6de070b34-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.476240 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "257240a5-cc42-4354-9079-66e6de070b34" (UID: "257240a5-cc42-4354-9079-66e6de070b34"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.491667 4843 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.524926 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.572565 4843 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/257240a5-cc42-4354-9079-66e6de070b34-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.572609 4843 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.631546 4843 generic.go:334] "Generic (PLEG): container finished" podID="257240a5-cc42-4354-9079-66e6de070b34" containerID="560a2e30ff8c9ca0f672bb737f8b8e51458216cac9b16f5096d1bb050b58d5dc" exitCode=0 Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.631629 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.631669 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"257240a5-cc42-4354-9079-66e6de070b34","Type":"ContainerDied","Data":"560a2e30ff8c9ca0f672bb737f8b8e51458216cac9b16f5096d1bb050b58d5dc"} Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.631746 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"257240a5-cc42-4354-9079-66e6de070b34","Type":"ContainerDied","Data":"f11026eeedc534f2918535264f5f5070a00c005eb929a7b4be8e57789d2283f3"} Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.631771 4843 scope.go:117] "RemoveContainer" containerID="560a2e30ff8c9ca0f672bb737f8b8e51458216cac9b16f5096d1bb050b58d5dc" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.637823 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63d21391-4df5-4d15-a12d-7ac03c66194c","Type":"ContainerStarted","Data":"d6ffb6847fac3652f0b4cdd57c73a842d50e15835e71f04e82bd0beb8be47d35"} Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.673621 4843 scope.go:117] "RemoveContainer" containerID="9e69e813190bc3b6041d2a54beae48ef9ecf7969785d0b66c5880f22bcd65291" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.702506 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.708671 4843 scope.go:117] "RemoveContainer" containerID="560a2e30ff8c9ca0f672bb737f8b8e51458216cac9b16f5096d1bb050b58d5dc" Mar 18 12:36:41 crc kubenswrapper[4843]: E0318 12:36:41.709267 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"560a2e30ff8c9ca0f672bb737f8b8e51458216cac9b16f5096d1bb050b58d5dc\": container with ID starting with 
560a2e30ff8c9ca0f672bb737f8b8e51458216cac9b16f5096d1bb050b58d5dc not found: ID does not exist" containerID="560a2e30ff8c9ca0f672bb737f8b8e51458216cac9b16f5096d1bb050b58d5dc" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.709301 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560a2e30ff8c9ca0f672bb737f8b8e51458216cac9b16f5096d1bb050b58d5dc"} err="failed to get container status \"560a2e30ff8c9ca0f672bb737f8b8e51458216cac9b16f5096d1bb050b58d5dc\": rpc error: code = NotFound desc = could not find container \"560a2e30ff8c9ca0f672bb737f8b8e51458216cac9b16f5096d1bb050b58d5dc\": container with ID starting with 560a2e30ff8c9ca0f672bb737f8b8e51458216cac9b16f5096d1bb050b58d5dc not found: ID does not exist" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.709330 4843 scope.go:117] "RemoveContainer" containerID="9e69e813190bc3b6041d2a54beae48ef9ecf7969785d0b66c5880f22bcd65291" Mar 18 12:36:41 crc kubenswrapper[4843]: E0318 12:36:41.709927 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e69e813190bc3b6041d2a54beae48ef9ecf7969785d0b66c5880f22bcd65291\": container with ID starting with 9e69e813190bc3b6041d2a54beae48ef9ecf7969785d0b66c5880f22bcd65291 not found: ID does not exist" containerID="9e69e813190bc3b6041d2a54beae48ef9ecf7969785d0b66c5880f22bcd65291" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.709973 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e69e813190bc3b6041d2a54beae48ef9ecf7969785d0b66c5880f22bcd65291"} err="failed to get container status \"9e69e813190bc3b6041d2a54beae48ef9ecf7969785d0b66c5880f22bcd65291\": rpc error: code = NotFound desc = could not find container \"9e69e813190bc3b6041d2a54beae48ef9ecf7969785d0b66c5880f22bcd65291\": container with ID starting with 9e69e813190bc3b6041d2a54beae48ef9ecf7969785d0b66c5880f22bcd65291 not found: ID does not 
exist" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.714064 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.741974 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:36:41 crc kubenswrapper[4843]: E0318 12:36:41.744362 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="257240a5-cc42-4354-9079-66e6de070b34" containerName="rabbitmq" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.744391 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="257240a5-cc42-4354-9079-66e6de070b34" containerName="rabbitmq" Mar 18 12:36:41 crc kubenswrapper[4843]: E0318 12:36:41.744843 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="257240a5-cc42-4354-9079-66e6de070b34" containerName="setup-container" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.744860 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="257240a5-cc42-4354-9079-66e6de070b34" containerName="setup-container" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.745127 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="257240a5-cc42-4354-9079-66e6de070b34" containerName="rabbitmq" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.746485 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.748944 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.757849 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.762987 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.763083 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.763405 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.763685 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rtkzr" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.763796 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.766830 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.889772 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3aaa37b6-550a-4bd8-a166-af337e05defd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.890058 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3aaa37b6-550a-4bd8-a166-af337e05defd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.890090 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3aaa37b6-550a-4bd8-a166-af337e05defd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.890123 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3aaa37b6-550a-4bd8-a166-af337e05defd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.890143 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3aaa37b6-550a-4bd8-a166-af337e05defd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.890166 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3aaa37b6-550a-4bd8-a166-af337e05defd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.890214 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-49p44\" (UniqueName: \"kubernetes.io/projected/3aaa37b6-550a-4bd8-a166-af337e05defd-kube-api-access-49p44\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.890250 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3aaa37b6-550a-4bd8-a166-af337e05defd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.890300 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3aaa37b6-550a-4bd8-a166-af337e05defd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.890536 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3aaa37b6-550a-4bd8-a166-af337e05defd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.890590 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.992199 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/3aaa37b6-550a-4bd8-a166-af337e05defd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.992279 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3aaa37b6-550a-4bd8-a166-af337e05defd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.992333 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3aaa37b6-550a-4bd8-a166-af337e05defd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.992365 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.992506 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3aaa37b6-550a-4bd8-a166-af337e05defd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.992543 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3aaa37b6-550a-4bd8-a166-af337e05defd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.992599 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3aaa37b6-550a-4bd8-a166-af337e05defd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.992627 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3aaa37b6-550a-4bd8-a166-af337e05defd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.992671 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3aaa37b6-550a-4bd8-a166-af337e05defd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.992709 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3aaa37b6-550a-4bd8-a166-af337e05defd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.992743 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49p44\" (UniqueName: \"kubernetes.io/projected/3aaa37b6-550a-4bd8-a166-af337e05defd-kube-api-access-49p44\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc 
kubenswrapper[4843]: I0318 12:36:41.993740 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3aaa37b6-550a-4bd8-a166-af337e05defd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.993865 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3aaa37b6-550a-4bd8-a166-af337e05defd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.993949 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3aaa37b6-550a-4bd8-a166-af337e05defd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.994480 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3aaa37b6-550a-4bd8-a166-af337e05defd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:41 crc kubenswrapper[4843]: I0318 12:36:41.997022 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3aaa37b6-550a-4bd8-a166-af337e05defd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.003162 4843 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.003681 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3aaa37b6-550a-4bd8-a166-af337e05defd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.010425 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3aaa37b6-550a-4bd8-a166-af337e05defd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.011939 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49p44\" (UniqueName: \"kubernetes.io/projected/3aaa37b6-550a-4bd8-a166-af337e05defd-kube-api-access-49p44\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.014235 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3aaa37b6-550a-4bd8-a166-af337e05defd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.021826 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3aaa37b6-550a-4bd8-a166-af337e05defd-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.045980 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3aaa37b6-550a-4bd8-a166-af337e05defd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.070729 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.144107 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-276w8"] Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.152969 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.156525 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.157358 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-276w8"] Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.298537 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-dns-svc\") pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.298593 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-openstack-edpm-ipam\") 
pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.298620 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.298950 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.299079 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-config\") pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.299278 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.299380 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwcst\" (UniqueName: 
\"kubernetes.io/projected/f2b1db3a-fd48-46dd-bb46-92b0d197b113-kube-api-access-qwcst\") pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.401345 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-dns-svc\") pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.401521 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.401550 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.401617 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.401674 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-config\") pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.401715 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.401743 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwcst\" (UniqueName: \"kubernetes.io/projected/f2b1db3a-fd48-46dd-bb46-92b0d197b113-kube-api-access-qwcst\") pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.402358 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-dns-svc\") pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.402694 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.403277 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-config\") pod 
\"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.403319 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.403394 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.403608 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.429993 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwcst\" (UniqueName: \"kubernetes.io/projected/f2b1db3a-fd48-46dd-bb46-92b0d197b113-kube-api-access-qwcst\") pod \"dnsmasq-dns-5576978c7c-276w8\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.478002 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.632280 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:36:42 crc kubenswrapper[4843]: W0318 12:36:42.670490 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aaa37b6_550a_4bd8_a166_af337e05defd.slice/crio-45347bb7a036e950cbc149d8a4ff6934ec8f8ff0ba17ca1a45a4bc797d43d009 WatchSource:0}: Error finding container 45347bb7a036e950cbc149d8a4ff6934ec8f8ff0ba17ca1a45a4bc797d43d009: Status 404 returned error can't find the container with id 45347bb7a036e950cbc149d8a4ff6934ec8f8ff0ba17ca1a45a4bc797d43d009 Mar 18 12:36:42 crc kubenswrapper[4843]: I0318 12:36:42.994704 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="257240a5-cc42-4354-9079-66e6de070b34" path="/var/lib/kubelet/pods/257240a5-cc42-4354-9079-66e6de070b34/volumes" Mar 18 12:36:43 crc kubenswrapper[4843]: I0318 12:36:43.046177 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-276w8"] Mar 18 12:36:43 crc kubenswrapper[4843]: W0318 12:36:43.046741 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2b1db3a_fd48_46dd_bb46_92b0d197b113.slice/crio-ba30a948689543eeeaf0f43bfb2d2f80a050b9fbca14c8359c692476404c66bd WatchSource:0}: Error finding container ba30a948689543eeeaf0f43bfb2d2f80a050b9fbca14c8359c692476404c66bd: Status 404 returned error can't find the container with id ba30a948689543eeeaf0f43bfb2d2f80a050b9fbca14c8359c692476404c66bd Mar 18 12:36:43 crc kubenswrapper[4843]: I0318 12:36:43.679513 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"3aaa37b6-550a-4bd8-a166-af337e05defd","Type":"ContainerStarted","Data":"45347bb7a036e950cbc149d8a4ff6934ec8f8ff0ba17ca1a45a4bc797d43d009"} Mar 18 12:36:43 crc kubenswrapper[4843]: I0318 12:36:43.683378 4843 generic.go:334] "Generic (PLEG): container finished" podID="f2b1db3a-fd48-46dd-bb46-92b0d197b113" containerID="39157a09c9ee04f3dbf0d168106346952a349abd1d2dc1dbccd572f0567b7d77" exitCode=0 Mar 18 12:36:43 crc kubenswrapper[4843]: I0318 12:36:43.683681 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-276w8" event={"ID":"f2b1db3a-fd48-46dd-bb46-92b0d197b113","Type":"ContainerDied","Data":"39157a09c9ee04f3dbf0d168106346952a349abd1d2dc1dbccd572f0567b7d77"} Mar 18 12:36:43 crc kubenswrapper[4843]: I0318 12:36:43.683729 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-276w8" event={"ID":"f2b1db3a-fd48-46dd-bb46-92b0d197b113","Type":"ContainerStarted","Data":"ba30a948689543eeeaf0f43bfb2d2f80a050b9fbca14c8359c692476404c66bd"} Mar 18 12:36:43 crc kubenswrapper[4843]: I0318 12:36:43.687061 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63d21391-4df5-4d15-a12d-7ac03c66194c","Type":"ContainerStarted","Data":"1ca134091c373c5d3bd597827cd447b60a073fec6e8afd7b34d28b5f4294bfa4"} Mar 18 12:36:44 crc kubenswrapper[4843]: I0318 12:36:44.701320 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-276w8" event={"ID":"f2b1db3a-fd48-46dd-bb46-92b0d197b113","Type":"ContainerStarted","Data":"e86539fbd8046d6442e687346f64bdb07a7e99490aa3e6cdb508d7eb1b178f60"} Mar 18 12:36:44 crc kubenswrapper[4843]: I0318 12:36:44.701961 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:44 crc kubenswrapper[4843]: I0318 12:36:44.703445 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"3aaa37b6-550a-4bd8-a166-af337e05defd","Type":"ContainerStarted","Data":"f9cef334853145f53eccd2242f28d4164cdad993003a24e5b04e69a70355f09a"} Mar 18 12:36:44 crc kubenswrapper[4843]: I0318 12:36:44.733190 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5576978c7c-276w8" podStartSLOduration=2.733135838 podStartE2EDuration="2.733135838s" podCreationTimestamp="2026-03-18 12:36:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:36:44.729126624 +0000 UTC m=+1638.444952168" watchObservedRunningTime="2026-03-18 12:36:44.733135838 +0000 UTC m=+1638.448961382" Mar 18 12:36:50 crc kubenswrapper[4843]: I0318 12:36:50.034737 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:36:50 crc kubenswrapper[4843]: I0318 12:36:50.035550 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.479462 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.553356 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-4dpbb"] Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.553652 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" 
podUID="0a54c475-4a90-4883-ae41-7d73e02d7c70" containerName="dnsmasq-dns" containerID="cri-o://b7cfd326543db7ba05149a8fd48e2ff852f76f12e322b91d6e3be15669f8edd3" gracePeriod=10 Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.809032 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-br7qm"] Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.813422 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.826392 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-br7qm"] Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.854335 4843 generic.go:334] "Generic (PLEG): container finished" podID="0a54c475-4a90-4883-ae41-7d73e02d7c70" containerID="b7cfd326543db7ba05149a8fd48e2ff852f76f12e322b91d6e3be15669f8edd3" exitCode=0 Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.854384 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" event={"ID":"0a54c475-4a90-4883-ae41-7d73e02d7c70","Type":"ContainerDied","Data":"b7cfd326543db7ba05149a8fd48e2ff852f76f12e322b91d6e3be15669f8edd3"} Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.857000 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/275e8234-1b33-40c3-ade6-1c75519ca5c2-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: \"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.857058 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wknvf\" (UniqueName: \"kubernetes.io/projected/275e8234-1b33-40c3-ade6-1c75519ca5c2-kube-api-access-wknvf\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: 
\"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.857289 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/275e8234-1b33-40c3-ade6-1c75519ca5c2-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: \"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.857421 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/275e8234-1b33-40c3-ade6-1c75519ca5c2-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: \"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.857488 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/275e8234-1b33-40c3-ade6-1c75519ca5c2-config\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: \"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.857547 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/275e8234-1b33-40c3-ade6-1c75519ca5c2-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: \"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.857609 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/275e8234-1b33-40c3-ade6-1c75519ca5c2-ovsdbserver-sb\") pod 
\"dnsmasq-dns-8c6f6df99-br7qm\" (UID: \"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.960003 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/275e8234-1b33-40c3-ade6-1c75519ca5c2-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: \"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.960106 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/275e8234-1b33-40c3-ade6-1c75519ca5c2-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: \"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.960155 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/275e8234-1b33-40c3-ade6-1c75519ca5c2-config\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: \"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.960200 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/275e8234-1b33-40c3-ade6-1c75519ca5c2-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: \"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.960240 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/275e8234-1b33-40c3-ade6-1c75519ca5c2-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: 
\"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.960352 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/275e8234-1b33-40c3-ade6-1c75519ca5c2-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: \"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.960380 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wknvf\" (UniqueName: \"kubernetes.io/projected/275e8234-1b33-40c3-ade6-1c75519ca5c2-kube-api-access-wknvf\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: \"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.962216 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/275e8234-1b33-40c3-ade6-1c75519ca5c2-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: \"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.964259 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/275e8234-1b33-40c3-ade6-1c75519ca5c2-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: \"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.965041 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/275e8234-1b33-40c3-ade6-1c75519ca5c2-config\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: \"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 
12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.966076 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/275e8234-1b33-40c3-ade6-1c75519ca5c2-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: \"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.966828 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/275e8234-1b33-40c3-ade6-1c75519ca5c2-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: \"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:52 crc kubenswrapper[4843]: I0318 12:36:52.967671 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/275e8234-1b33-40c3-ade6-1c75519ca5c2-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: \"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.017910 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wknvf\" (UniqueName: \"kubernetes.io/projected/275e8234-1b33-40c3-ade6-1c75519ca5c2-kube-api-access-wknvf\") pod \"dnsmasq-dns-8c6f6df99-br7qm\" (UID: \"275e8234-1b33-40c3-ade6-1c75519ca5c2\") " pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.105099 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.134334 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.163446 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-ovsdbserver-nb\") pod \"0a54c475-4a90-4883-ae41-7d73e02d7c70\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.163593 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-dns-swift-storage-0\") pod \"0a54c475-4a90-4883-ae41-7d73e02d7c70\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.163657 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-dns-svc\") pod \"0a54c475-4a90-4883-ae41-7d73e02d7c70\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.163772 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-ovsdbserver-sb\") pod \"0a54c475-4a90-4883-ae41-7d73e02d7c70\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.163800 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-config\") pod \"0a54c475-4a90-4883-ae41-7d73e02d7c70\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.163823 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncxk8\" 
(UniqueName: \"kubernetes.io/projected/0a54c475-4a90-4883-ae41-7d73e02d7c70-kube-api-access-ncxk8\") pod \"0a54c475-4a90-4883-ae41-7d73e02d7c70\" (UID: \"0a54c475-4a90-4883-ae41-7d73e02d7c70\") " Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.177733 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a54c475-4a90-4883-ae41-7d73e02d7c70-kube-api-access-ncxk8" (OuterVolumeSpecName: "kube-api-access-ncxk8") pod "0a54c475-4a90-4883-ae41-7d73e02d7c70" (UID: "0a54c475-4a90-4883-ae41-7d73e02d7c70"). InnerVolumeSpecName "kube-api-access-ncxk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.224043 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-config" (OuterVolumeSpecName: "config") pod "0a54c475-4a90-4883-ae41-7d73e02d7c70" (UID: "0a54c475-4a90-4883-ae41-7d73e02d7c70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.228032 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0a54c475-4a90-4883-ae41-7d73e02d7c70" (UID: "0a54c475-4a90-4883-ae41-7d73e02d7c70"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.245453 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0a54c475-4a90-4883-ae41-7d73e02d7c70" (UID: "0a54c475-4a90-4883-ae41-7d73e02d7c70"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.257556 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0a54c475-4a90-4883-ae41-7d73e02d7c70" (UID: "0a54c475-4a90-4883-ae41-7d73e02d7c70"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.259550 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0a54c475-4a90-4883-ae41-7d73e02d7c70" (UID: "0a54c475-4a90-4883-ae41-7d73e02d7c70"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.267068 4843 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.267144 4843 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.267158 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.267170 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-config\") on node \"crc\" DevicePath \"\"" Mar 18 
12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.267211 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncxk8\" (UniqueName: \"kubernetes.io/projected/0a54c475-4a90-4883-ae41-7d73e02d7c70-kube-api-access-ncxk8\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.267225 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a54c475-4a90-4883-ae41-7d73e02d7c70-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.620350 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-br7qm"] Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.865355 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" event={"ID":"275e8234-1b33-40c3-ade6-1c75519ca5c2","Type":"ContainerStarted","Data":"f3182d2e5000935564f542384474a456f8f22264454d7b34b87577a8e2d77a64"} Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.867594 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" event={"ID":"0a54c475-4a90-4883-ae41-7d73e02d7c70","Type":"ContainerDied","Data":"d2f1e3f3c1a01119b87ef6347b7ee45a4e908b1f733ca9774aa586d4979af15d"} Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.867628 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-4dpbb" Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.867655 4843 scope.go:117] "RemoveContainer" containerID="b7cfd326543db7ba05149a8fd48e2ff852f76f12e322b91d6e3be15669f8edd3" Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.892399 4843 scope.go:117] "RemoveContainer" containerID="79562ce47d8c43e5925b90a785f37988e6644dfd3fdb818807f36907387b4710" Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.913393 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-4dpbb"] Mar 18 12:36:53 crc kubenswrapper[4843]: I0318 12:36:53.924263 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-4dpbb"] Mar 18 12:36:54 crc kubenswrapper[4843]: I0318 12:36:54.885936 4843 generic.go:334] "Generic (PLEG): container finished" podID="275e8234-1b33-40c3-ade6-1c75519ca5c2" containerID="cd0df7c71061ba76b01af6212826f565a55fca937ee4fd2282494d0ab00792d8" exitCode=0 Mar 18 12:36:54 crc kubenswrapper[4843]: I0318 12:36:54.886010 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" event={"ID":"275e8234-1b33-40c3-ade6-1c75519ca5c2","Type":"ContainerDied","Data":"cd0df7c71061ba76b01af6212826f565a55fca937ee4fd2282494d0ab00792d8"} Mar 18 12:36:54 crc kubenswrapper[4843]: I0318 12:36:54.999047 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a54c475-4a90-4883-ae41-7d73e02d7c70" path="/var/lib/kubelet/pods/0a54c475-4a90-4883-ae41-7d73e02d7c70/volumes" Mar 18 12:36:55 crc kubenswrapper[4843]: I0318 12:36:55.906059 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" event={"ID":"275e8234-1b33-40c3-ade6-1c75519ca5c2","Type":"ContainerStarted","Data":"c08aefe0a43d8f7bb1172272b523b88153486aa3a7a0fa003eaca74cacd4a278"} Mar 18 12:36:55 crc kubenswrapper[4843]: I0318 12:36:55.906732 4843 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:36:55 crc kubenswrapper[4843]: I0318 12:36:55.948138 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" podStartSLOduration=3.948111229 podStartE2EDuration="3.948111229s" podCreationTimestamp="2026-03-18 12:36:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:36:55.937313032 +0000 UTC m=+1649.653138596" watchObservedRunningTime="2026-03-18 12:36:55.948111229 +0000 UTC m=+1649.663936773" Mar 18 12:37:00 crc kubenswrapper[4843]: I0318 12:37:00.819275 4843 scope.go:117] "RemoveContainer" containerID="165a6da61c0dd31b49a64173d4036ea27179b8d3410a706091c805452b80b33d" Mar 18 12:37:03 crc kubenswrapper[4843]: I0318 12:37:03.135920 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8c6f6df99-br7qm" Mar 18 12:37:03 crc kubenswrapper[4843]: I0318 12:37:03.233093 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-276w8"] Mar 18 12:37:03 crc kubenswrapper[4843]: I0318 12:37:03.233374 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5576978c7c-276w8" podUID="f2b1db3a-fd48-46dd-bb46-92b0d197b113" containerName="dnsmasq-dns" containerID="cri-o://e86539fbd8046d6442e687346f64bdb07a7e99490aa3e6cdb508d7eb1b178f60" gracePeriod=10 Mar 18 12:37:03 crc kubenswrapper[4843]: I0318 12:37:03.803363 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:37:03 crc kubenswrapper[4843]: I0318 12:37:03.898719 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-dns-svc\") pod \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " Mar 18 12:37:03 crc kubenswrapper[4843]: I0318 12:37:03.898780 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwcst\" (UniqueName: \"kubernetes.io/projected/f2b1db3a-fd48-46dd-bb46-92b0d197b113-kube-api-access-qwcst\") pod \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " Mar 18 12:37:03 crc kubenswrapper[4843]: I0318 12:37:03.898827 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-openstack-edpm-ipam\") pod \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " Mar 18 12:37:03 crc kubenswrapper[4843]: I0318 12:37:03.898862 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-dns-swift-storage-0\") pod \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " Mar 18 12:37:03 crc kubenswrapper[4843]: I0318 12:37:03.899386 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-config\") pod \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " Mar 18 12:37:03 crc kubenswrapper[4843]: I0318 12:37:03.899537 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-ovsdbserver-sb\") pod \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " Mar 18 12:37:03 crc kubenswrapper[4843]: I0318 12:37:03.899586 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-ovsdbserver-nb\") pod \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\" (UID: \"f2b1db3a-fd48-46dd-bb46-92b0d197b113\") " Mar 18 12:37:03 crc kubenswrapper[4843]: I0318 12:37:03.922059 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2b1db3a-fd48-46dd-bb46-92b0d197b113-kube-api-access-qwcst" (OuterVolumeSpecName: "kube-api-access-qwcst") pod "f2b1db3a-fd48-46dd-bb46-92b0d197b113" (UID: "f2b1db3a-fd48-46dd-bb46-92b0d197b113"). InnerVolumeSpecName "kube-api-access-qwcst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:37:03 crc kubenswrapper[4843]: I0318 12:37:03.948366 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f2b1db3a-fd48-46dd-bb46-92b0d197b113" (UID: "f2b1db3a-fd48-46dd-bb46-92b0d197b113"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:37:03 crc kubenswrapper[4843]: I0318 12:37:03.950990 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "f2b1db3a-fd48-46dd-bb46-92b0d197b113" (UID: "f2b1db3a-fd48-46dd-bb46-92b0d197b113"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:37:03 crc kubenswrapper[4843]: I0318 12:37:03.957764 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f2b1db3a-fd48-46dd-bb46-92b0d197b113" (UID: "f2b1db3a-fd48-46dd-bb46-92b0d197b113"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:37:03 crc kubenswrapper[4843]: I0318 12:37:03.958968 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-config" (OuterVolumeSpecName: "config") pod "f2b1db3a-fd48-46dd-bb46-92b0d197b113" (UID: "f2b1db3a-fd48-46dd-bb46-92b0d197b113"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:37:03 crc kubenswrapper[4843]: I0318 12:37:03.960874 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f2b1db3a-fd48-46dd-bb46-92b0d197b113" (UID: "f2b1db3a-fd48-46dd-bb46-92b0d197b113"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:37:03 crc kubenswrapper[4843]: I0318 12:37:03.976780 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f2b1db3a-fd48-46dd-bb46-92b0d197b113" (UID: "f2b1db3a-fd48-46dd-bb46-92b0d197b113"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.001896 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.001961 4843 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.001975 4843 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.002016 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwcst\" (UniqueName: \"kubernetes.io/projected/f2b1db3a-fd48-46dd-bb46-92b0d197b113-kube-api-access-qwcst\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.002036 4843 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.002048 4843 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.002060 4843 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2b1db3a-fd48-46dd-bb46-92b0d197b113-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.010973 
4843 generic.go:334] "Generic (PLEG): container finished" podID="f2b1db3a-fd48-46dd-bb46-92b0d197b113" containerID="e86539fbd8046d6442e687346f64bdb07a7e99490aa3e6cdb508d7eb1b178f60" exitCode=0 Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.011051 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-276w8" event={"ID":"f2b1db3a-fd48-46dd-bb46-92b0d197b113","Type":"ContainerDied","Data":"e86539fbd8046d6442e687346f64bdb07a7e99490aa3e6cdb508d7eb1b178f60"} Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.011079 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-276w8" event={"ID":"f2b1db3a-fd48-46dd-bb46-92b0d197b113","Type":"ContainerDied","Data":"ba30a948689543eeeaf0f43bfb2d2f80a050b9fbca14c8359c692476404c66bd"} Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.011095 4843 scope.go:117] "RemoveContainer" containerID="e86539fbd8046d6442e687346f64bdb07a7e99490aa3e6cdb508d7eb1b178f60" Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.011218 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-276w8" Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.043740 4843 scope.go:117] "RemoveContainer" containerID="39157a09c9ee04f3dbf0d168106346952a349abd1d2dc1dbccd572f0567b7d77" Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.081955 4843 scope.go:117] "RemoveContainer" containerID="e86539fbd8046d6442e687346f64bdb07a7e99490aa3e6cdb508d7eb1b178f60" Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.082090 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-276w8"] Mar 18 12:37:04 crc kubenswrapper[4843]: E0318 12:37:04.085378 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e86539fbd8046d6442e687346f64bdb07a7e99490aa3e6cdb508d7eb1b178f60\": container with ID starting with e86539fbd8046d6442e687346f64bdb07a7e99490aa3e6cdb508d7eb1b178f60 not found: ID does not exist" containerID="e86539fbd8046d6442e687346f64bdb07a7e99490aa3e6cdb508d7eb1b178f60" Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.085443 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e86539fbd8046d6442e687346f64bdb07a7e99490aa3e6cdb508d7eb1b178f60"} err="failed to get container status \"e86539fbd8046d6442e687346f64bdb07a7e99490aa3e6cdb508d7eb1b178f60\": rpc error: code = NotFound desc = could not find container \"e86539fbd8046d6442e687346f64bdb07a7e99490aa3e6cdb508d7eb1b178f60\": container with ID starting with e86539fbd8046d6442e687346f64bdb07a7e99490aa3e6cdb508d7eb1b178f60 not found: ID does not exist" Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.085476 4843 scope.go:117] "RemoveContainer" containerID="39157a09c9ee04f3dbf0d168106346952a349abd1d2dc1dbccd572f0567b7d77" Mar 18 12:37:04 crc kubenswrapper[4843]: E0318 12:37:04.086247 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"39157a09c9ee04f3dbf0d168106346952a349abd1d2dc1dbccd572f0567b7d77\": container with ID starting with 39157a09c9ee04f3dbf0d168106346952a349abd1d2dc1dbccd572f0567b7d77 not found: ID does not exist" containerID="39157a09c9ee04f3dbf0d168106346952a349abd1d2dc1dbccd572f0567b7d77" Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.086335 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39157a09c9ee04f3dbf0d168106346952a349abd1d2dc1dbccd572f0567b7d77"} err="failed to get container status \"39157a09c9ee04f3dbf0d168106346952a349abd1d2dc1dbccd572f0567b7d77\": rpc error: code = NotFound desc = could not find container \"39157a09c9ee04f3dbf0d168106346952a349abd1d2dc1dbccd572f0567b7d77\": container with ID starting with 39157a09c9ee04f3dbf0d168106346952a349abd1d2dc1dbccd572f0567b7d77 not found: ID does not exist" Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.095623 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-276w8"] Mar 18 12:37:04 crc kubenswrapper[4843]: I0318 12:37:04.994541 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2b1db3a-fd48-46dd-bb46-92b0d197b113" path="/var/lib/kubelet/pods/f2b1db3a-fd48-46dd-bb46-92b0d197b113/volumes" Mar 18 12:37:15 crc kubenswrapper[4843]: I0318 12:37:15.768246 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x"] Mar 18 12:37:15 crc kubenswrapper[4843]: E0318 12:37:15.769036 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2b1db3a-fd48-46dd-bb46-92b0d197b113" containerName="init" Mar 18 12:37:15 crc kubenswrapper[4843]: I0318 12:37:15.769051 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2b1db3a-fd48-46dd-bb46-92b0d197b113" containerName="init" Mar 18 12:37:15 crc kubenswrapper[4843]: E0318 12:37:15.769073 4843 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0a54c475-4a90-4883-ae41-7d73e02d7c70" containerName="dnsmasq-dns" Mar 18 12:37:15 crc kubenswrapper[4843]: I0318 12:37:15.769082 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a54c475-4a90-4883-ae41-7d73e02d7c70" containerName="dnsmasq-dns" Mar 18 12:37:15 crc kubenswrapper[4843]: E0318 12:37:15.769102 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a54c475-4a90-4883-ae41-7d73e02d7c70" containerName="init" Mar 18 12:37:15 crc kubenswrapper[4843]: I0318 12:37:15.769110 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a54c475-4a90-4883-ae41-7d73e02d7c70" containerName="init" Mar 18 12:37:15 crc kubenswrapper[4843]: E0318 12:37:15.769137 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2b1db3a-fd48-46dd-bb46-92b0d197b113" containerName="dnsmasq-dns" Mar 18 12:37:15 crc kubenswrapper[4843]: I0318 12:37:15.769145 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2b1db3a-fd48-46dd-bb46-92b0d197b113" containerName="dnsmasq-dns" Mar 18 12:37:15 crc kubenswrapper[4843]: I0318 12:37:15.769378 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2b1db3a-fd48-46dd-bb46-92b0d197b113" containerName="dnsmasq-dns" Mar 18 12:37:15 crc kubenswrapper[4843]: I0318 12:37:15.769410 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a54c475-4a90-4883-ae41-7d73e02d7c70" containerName="dnsmasq-dns" Mar 18 12:37:15 crc kubenswrapper[4843]: I0318 12:37:15.770135 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" Mar 18 12:37:15 crc kubenswrapper[4843]: I0318 12:37:15.772424 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:37:15 crc kubenswrapper[4843]: I0318 12:37:15.772501 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:37:15 crc kubenswrapper[4843]: I0318 12:37:15.772544 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:37:15 crc kubenswrapper[4843]: I0318 12:37:15.772798 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2" Mar 18 12:37:15 crc kubenswrapper[4843]: I0318 12:37:15.781043 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x"] Mar 18 12:37:15 crc kubenswrapper[4843]: I0318 12:37:15.916244 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/76ab23d5-a490-4e61-b3ae-27e991303a9c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x\" (UID: \"76ab23d5-a490-4e61-b3ae-27e991303a9c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" Mar 18 12:37:15 crc kubenswrapper[4843]: I0318 12:37:15.916824 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76ab23d5-a490-4e61-b3ae-27e991303a9c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x\" (UID: \"76ab23d5-a490-4e61-b3ae-27e991303a9c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" Mar 18 12:37:15 crc 
kubenswrapper[4843]: I0318 12:37:15.916901 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k749v\" (UniqueName: \"kubernetes.io/projected/76ab23d5-a490-4e61-b3ae-27e991303a9c-kube-api-access-k749v\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x\" (UID: \"76ab23d5-a490-4e61-b3ae-27e991303a9c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" Mar 18 12:37:15 crc kubenswrapper[4843]: I0318 12:37:15.916962 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76ab23d5-a490-4e61-b3ae-27e991303a9c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x\" (UID: \"76ab23d5-a490-4e61-b3ae-27e991303a9c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" Mar 18 12:37:16 crc kubenswrapper[4843]: I0318 12:37:16.020233 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/76ab23d5-a490-4e61-b3ae-27e991303a9c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x\" (UID: \"76ab23d5-a490-4e61-b3ae-27e991303a9c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" Mar 18 12:37:16 crc kubenswrapper[4843]: I0318 12:37:16.020327 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76ab23d5-a490-4e61-b3ae-27e991303a9c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x\" (UID: \"76ab23d5-a490-4e61-b3ae-27e991303a9c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" Mar 18 12:37:16 crc kubenswrapper[4843]: I0318 12:37:16.020426 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k749v\" (UniqueName: 
\"kubernetes.io/projected/76ab23d5-a490-4e61-b3ae-27e991303a9c-kube-api-access-k749v\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x\" (UID: \"76ab23d5-a490-4e61-b3ae-27e991303a9c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" Mar 18 12:37:16 crc kubenswrapper[4843]: I0318 12:37:16.020534 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76ab23d5-a490-4e61-b3ae-27e991303a9c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x\" (UID: \"76ab23d5-a490-4e61-b3ae-27e991303a9c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" Mar 18 12:37:16 crc kubenswrapper[4843]: I0318 12:37:16.029276 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76ab23d5-a490-4e61-b3ae-27e991303a9c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x\" (UID: \"76ab23d5-a490-4e61-b3ae-27e991303a9c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" Mar 18 12:37:16 crc kubenswrapper[4843]: I0318 12:37:16.029493 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/76ab23d5-a490-4e61-b3ae-27e991303a9c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x\" (UID: \"76ab23d5-a490-4e61-b3ae-27e991303a9c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" Mar 18 12:37:16 crc kubenswrapper[4843]: I0318 12:37:16.040977 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76ab23d5-a490-4e61-b3ae-27e991303a9c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x\" (UID: \"76ab23d5-a490-4e61-b3ae-27e991303a9c\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" Mar 18 12:37:16 crc kubenswrapper[4843]: I0318 12:37:16.047532 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k749v\" (UniqueName: \"kubernetes.io/projected/76ab23d5-a490-4e61-b3ae-27e991303a9c-kube-api-access-k749v\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x\" (UID: \"76ab23d5-a490-4e61-b3ae-27e991303a9c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" Mar 18 12:37:16 crc kubenswrapper[4843]: I0318 12:37:16.099016 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" Mar 18 12:37:16 crc kubenswrapper[4843]: I0318 12:37:16.143986 4843 generic.go:334] "Generic (PLEG): container finished" podID="63d21391-4df5-4d15-a12d-7ac03c66194c" containerID="1ca134091c373c5d3bd597827cd447b60a073fec6e8afd7b34d28b5f4294bfa4" exitCode=0 Mar 18 12:37:16 crc kubenswrapper[4843]: I0318 12:37:16.144036 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63d21391-4df5-4d15-a12d-7ac03c66194c","Type":"ContainerDied","Data":"1ca134091c373c5d3bd597827cd447b60a073fec6e8afd7b34d28b5f4294bfa4"} Mar 18 12:37:16 crc kubenswrapper[4843]: I0318 12:37:16.737698 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x"] Mar 18 12:37:16 crc kubenswrapper[4843]: W0318 12:37:16.738806 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76ab23d5_a490_4e61_b3ae_27e991303a9c.slice/crio-b056eca68a2e0b70813077b9d733696e106c1883952a2fbc356e04839d657996 WatchSource:0}: Error finding container b056eca68a2e0b70813077b9d733696e106c1883952a2fbc356e04839d657996: Status 404 returned error can't find the container with id 
b056eca68a2e0b70813077b9d733696e106c1883952a2fbc356e04839d657996 Mar 18 12:37:17 crc kubenswrapper[4843]: I0318 12:37:17.155616 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63d21391-4df5-4d15-a12d-7ac03c66194c","Type":"ContainerStarted","Data":"ff49ea44e61fb8adde23a171f84e00652c6ea9b5d945d14240331fae68a1419d"} Mar 18 12:37:17 crc kubenswrapper[4843]: I0318 12:37:17.155908 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 12:37:17 crc kubenswrapper[4843]: I0318 12:37:17.159068 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" event={"ID":"76ab23d5-a490-4e61-b3ae-27e991303a9c","Type":"ContainerStarted","Data":"b056eca68a2e0b70813077b9d733696e106c1883952a2fbc356e04839d657996"} Mar 18 12:37:17 crc kubenswrapper[4843]: I0318 12:37:17.187675 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.187638286 podStartE2EDuration="37.187638286s" podCreationTimestamp="2026-03-18 12:36:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:37:17.174429321 +0000 UTC m=+1670.890254855" watchObservedRunningTime="2026-03-18 12:37:17.187638286 +0000 UTC m=+1670.903463810" Mar 18 12:37:17 crc kubenswrapper[4843]: I0318 12:37:17.319984 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fm6ck"] Mar 18 12:37:17 crc kubenswrapper[4843]: I0318 12:37:17.322141 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fm6ck" Mar 18 12:37:17 crc kubenswrapper[4843]: I0318 12:37:17.334914 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fm6ck"] Mar 18 12:37:17 crc kubenswrapper[4843]: I0318 12:37:17.472245 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec1e78c5-c821-4f17-b817-a8c68c065097-catalog-content\") pod \"community-operators-fm6ck\" (UID: \"ec1e78c5-c821-4f17-b817-a8c68c065097\") " pod="openshift-marketplace/community-operators-fm6ck" Mar 18 12:37:17 crc kubenswrapper[4843]: I0318 12:37:17.472532 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rjm8\" (UniqueName: \"kubernetes.io/projected/ec1e78c5-c821-4f17-b817-a8c68c065097-kube-api-access-8rjm8\") pod \"community-operators-fm6ck\" (UID: \"ec1e78c5-c821-4f17-b817-a8c68c065097\") " pod="openshift-marketplace/community-operators-fm6ck" Mar 18 12:37:17 crc kubenswrapper[4843]: I0318 12:37:17.472625 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec1e78c5-c821-4f17-b817-a8c68c065097-utilities\") pod \"community-operators-fm6ck\" (UID: \"ec1e78c5-c821-4f17-b817-a8c68c065097\") " pod="openshift-marketplace/community-operators-fm6ck" Mar 18 12:37:17 crc kubenswrapper[4843]: I0318 12:37:17.574056 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec1e78c5-c821-4f17-b817-a8c68c065097-catalog-content\") pod \"community-operators-fm6ck\" (UID: \"ec1e78c5-c821-4f17-b817-a8c68c065097\") " pod="openshift-marketplace/community-operators-fm6ck" Mar 18 12:37:17 crc kubenswrapper[4843]: I0318 12:37:17.574204 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8rjm8\" (UniqueName: \"kubernetes.io/projected/ec1e78c5-c821-4f17-b817-a8c68c065097-kube-api-access-8rjm8\") pod \"community-operators-fm6ck\" (UID: \"ec1e78c5-c821-4f17-b817-a8c68c065097\") " pod="openshift-marketplace/community-operators-fm6ck" Mar 18 12:37:17 crc kubenswrapper[4843]: I0318 12:37:17.574283 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec1e78c5-c821-4f17-b817-a8c68c065097-utilities\") pod \"community-operators-fm6ck\" (UID: \"ec1e78c5-c821-4f17-b817-a8c68c065097\") " pod="openshift-marketplace/community-operators-fm6ck" Mar 18 12:37:17 crc kubenswrapper[4843]: I0318 12:37:17.574514 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec1e78c5-c821-4f17-b817-a8c68c065097-catalog-content\") pod \"community-operators-fm6ck\" (UID: \"ec1e78c5-c821-4f17-b817-a8c68c065097\") " pod="openshift-marketplace/community-operators-fm6ck" Mar 18 12:37:17 crc kubenswrapper[4843]: I0318 12:37:17.574664 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec1e78c5-c821-4f17-b817-a8c68c065097-utilities\") pod \"community-operators-fm6ck\" (UID: \"ec1e78c5-c821-4f17-b817-a8c68c065097\") " pod="openshift-marketplace/community-operators-fm6ck" Mar 18 12:37:17 crc kubenswrapper[4843]: I0318 12:37:17.594623 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rjm8\" (UniqueName: \"kubernetes.io/projected/ec1e78c5-c821-4f17-b817-a8c68c065097-kube-api-access-8rjm8\") pod \"community-operators-fm6ck\" (UID: \"ec1e78c5-c821-4f17-b817-a8c68c065097\") " pod="openshift-marketplace/community-operators-fm6ck" Mar 18 12:37:17 crc kubenswrapper[4843]: I0318 12:37:17.683186 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fm6ck" Mar 18 12:37:18 crc kubenswrapper[4843]: I0318 12:37:18.168097 4843 generic.go:334] "Generic (PLEG): container finished" podID="3aaa37b6-550a-4bd8-a166-af337e05defd" containerID="f9cef334853145f53eccd2242f28d4164cdad993003a24e5b04e69a70355f09a" exitCode=0 Mar 18 12:37:18 crc kubenswrapper[4843]: I0318 12:37:18.169353 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3aaa37b6-550a-4bd8-a166-af337e05defd","Type":"ContainerDied","Data":"f9cef334853145f53eccd2242f28d4164cdad993003a24e5b04e69a70355f09a"} Mar 18 12:37:18 crc kubenswrapper[4843]: I0318 12:37:18.253268 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fm6ck"] Mar 18 12:37:18 crc kubenswrapper[4843]: W0318 12:37:18.264188 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec1e78c5_c821_4f17_b817_a8c68c065097.slice/crio-b46a383fbafc919efbc9d74be7745bd3ddf5f2ef15a2c10f2fe5ed318ca8a160 WatchSource:0}: Error finding container b46a383fbafc919efbc9d74be7745bd3ddf5f2ef15a2c10f2fe5ed318ca8a160: Status 404 returned error can't find the container with id b46a383fbafc919efbc9d74be7745bd3ddf5f2ef15a2c10f2fe5ed318ca8a160 Mar 18 12:37:19 crc kubenswrapper[4843]: I0318 12:37:19.179597 4843 generic.go:334] "Generic (PLEG): container finished" podID="ec1e78c5-c821-4f17-b817-a8c68c065097" containerID="580bf755841f539eed3b15511f59b4e8b562526b9f97bec75eb791f1bcb9b6f9" exitCode=0 Mar 18 12:37:19 crc kubenswrapper[4843]: I0318 12:37:19.179716 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fm6ck" event={"ID":"ec1e78c5-c821-4f17-b817-a8c68c065097","Type":"ContainerDied","Data":"580bf755841f539eed3b15511f59b4e8b562526b9f97bec75eb791f1bcb9b6f9"} Mar 18 12:37:19 crc kubenswrapper[4843]: I0318 12:37:19.179746 4843 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fm6ck" event={"ID":"ec1e78c5-c821-4f17-b817-a8c68c065097","Type":"ContainerStarted","Data":"b46a383fbafc919efbc9d74be7745bd3ddf5f2ef15a2c10f2fe5ed318ca8a160"} Mar 18 12:37:19 crc kubenswrapper[4843]: I0318 12:37:19.185079 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3aaa37b6-550a-4bd8-a166-af337e05defd","Type":"ContainerStarted","Data":"06473c2e343a217b02a7db3116239a57261418117ba33726f31be38ce2232853"} Mar 18 12:37:19 crc kubenswrapper[4843]: I0318 12:37:19.185299 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:20 crc kubenswrapper[4843]: I0318 12:37:20.035489 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:37:20 crc kubenswrapper[4843]: I0318 12:37:20.036112 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:37:20 crc kubenswrapper[4843]: I0318 12:37:20.036162 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:37:20 crc kubenswrapper[4843]: I0318 12:37:20.036749 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24"} 
pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:37:20 crc kubenswrapper[4843]: I0318 12:37:20.036812 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" containerID="cri-o://b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" gracePeriod=600 Mar 18 12:37:20 crc kubenswrapper[4843]: E0318 12:37:20.186641 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:37:20 crc kubenswrapper[4843]: I0318 12:37:20.202910 4843 generic.go:334] "Generic (PLEG): container finished" podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" exitCode=0 Mar 18 12:37:20 crc kubenswrapper[4843]: I0318 12:37:20.202989 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24"} Mar 18 12:37:20 crc kubenswrapper[4843]: I0318 12:37:20.203381 4843 scope.go:117] "RemoveContainer" containerID="6c1df8135718fa79548f8df435226cda37a6730cb41d0cf14ca133c83dba65e7" Mar 18 12:37:20 crc kubenswrapper[4843]: I0318 12:37:20.204260 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 
18 12:37:20 crc kubenswrapper[4843]: E0318 12:37:20.204701 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:37:20 crc kubenswrapper[4843]: I0318 12:37:20.206934 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fm6ck" event={"ID":"ec1e78c5-c821-4f17-b817-a8c68c065097","Type":"ContainerStarted","Data":"9f3856346b68502d462aa0b51c5cb1e6f794320354a61ae67a3a38599d9f64e8"} Mar 18 12:37:20 crc kubenswrapper[4843]: I0318 12:37:20.241727 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.241697002 podStartE2EDuration="39.241697002s" podCreationTimestamp="2026-03-18 12:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:37:19.231231739 +0000 UTC m=+1672.947057263" watchObservedRunningTime="2026-03-18 12:37:20.241697002 +0000 UTC m=+1673.957522536" Mar 18 12:37:21 crc kubenswrapper[4843]: I0318 12:37:21.221116 4843 generic.go:334] "Generic (PLEG): container finished" podID="ec1e78c5-c821-4f17-b817-a8c68c065097" containerID="9f3856346b68502d462aa0b51c5cb1e6f794320354a61ae67a3a38599d9f64e8" exitCode=0 Mar 18 12:37:21 crc kubenswrapper[4843]: I0318 12:37:21.221168 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fm6ck" event={"ID":"ec1e78c5-c821-4f17-b817-a8c68c065097","Type":"ContainerDied","Data":"9f3856346b68502d462aa0b51c5cb1e6f794320354a61ae67a3a38599d9f64e8"} Mar 18 12:37:22 crc kubenswrapper[4843]: 
I0318 12:37:22.235107 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fm6ck" event={"ID":"ec1e78c5-c821-4f17-b817-a8c68c065097","Type":"ContainerStarted","Data":"83d11f565d2b2a845aa332ea22e00062eb83ebc8295b8300838524cf9e34a1e6"} Mar 18 12:37:22 crc kubenswrapper[4843]: I0318 12:37:22.265916 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fm6ck" podStartSLOduration=2.81567625 podStartE2EDuration="5.265895524s" podCreationTimestamp="2026-03-18 12:37:17 +0000 UTC" firstStartedPulling="2026-03-18 12:37:19.182020342 +0000 UTC m=+1672.897845866" lastFinishedPulling="2026-03-18 12:37:21.632239616 +0000 UTC m=+1675.348065140" observedRunningTime="2026-03-18 12:37:22.254880711 +0000 UTC m=+1675.970706245" watchObservedRunningTime="2026-03-18 12:37:22.265895524 +0000 UTC m=+1675.981721048" Mar 18 12:37:27 crc kubenswrapper[4843]: I0318 12:37:27.683938 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fm6ck" Mar 18 12:37:27 crc kubenswrapper[4843]: I0318 12:37:27.684559 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fm6ck" Mar 18 12:37:27 crc kubenswrapper[4843]: I0318 12:37:27.758817 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fm6ck" Mar 18 12:37:28 crc kubenswrapper[4843]: I0318 12:37:28.405781 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fm6ck" Mar 18 12:37:28 crc kubenswrapper[4843]: I0318 12:37:28.467836 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fm6ck"] Mar 18 12:37:30 crc kubenswrapper[4843]: I0318 12:37:30.363411 4843 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-fm6ck" podUID="ec1e78c5-c821-4f17-b817-a8c68c065097" containerName="registry-server" containerID="cri-o://83d11f565d2b2a845aa332ea22e00062eb83ebc8295b8300838524cf9e34a1e6" gracePeriod=2 Mar 18 12:37:30 crc kubenswrapper[4843]: I0318 12:37:30.716449 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:37:30 crc kubenswrapper[4843]: I0318 12:37:30.929908 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fm6ck" Mar 18 12:37:30 crc kubenswrapper[4843]: I0318 12:37:30.986688 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec1e78c5-c821-4f17-b817-a8c68c065097-catalog-content\") pod \"ec1e78c5-c821-4f17-b817-a8c68c065097\" (UID: \"ec1e78c5-c821-4f17-b817-a8c68c065097\") " Mar 18 12:37:30 crc kubenswrapper[4843]: I0318 12:37:30.986789 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rjm8\" (UniqueName: \"kubernetes.io/projected/ec1e78c5-c821-4f17-b817-a8c68c065097-kube-api-access-8rjm8\") pod \"ec1e78c5-c821-4f17-b817-a8c68c065097\" (UID: \"ec1e78c5-c821-4f17-b817-a8c68c065097\") " Mar 18 12:37:30 crc kubenswrapper[4843]: I0318 12:37:30.987012 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec1e78c5-c821-4f17-b817-a8c68c065097-utilities\") pod \"ec1e78c5-c821-4f17-b817-a8c68c065097\" (UID: \"ec1e78c5-c821-4f17-b817-a8c68c065097\") " Mar 18 12:37:30 crc kubenswrapper[4843]: I0318 12:37:30.987759 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec1e78c5-c821-4f17-b817-a8c68c065097-utilities" (OuterVolumeSpecName: "utilities") pod "ec1e78c5-c821-4f17-b817-a8c68c065097" (UID: 
"ec1e78c5-c821-4f17-b817-a8c68c065097"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:37:30 crc kubenswrapper[4843]: I0318 12:37:30.991376 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec1e78c5-c821-4f17-b817-a8c68c065097-kube-api-access-8rjm8" (OuterVolumeSpecName: "kube-api-access-8rjm8") pod "ec1e78c5-c821-4f17-b817-a8c68c065097" (UID: "ec1e78c5-c821-4f17-b817-a8c68c065097"). InnerVolumeSpecName "kube-api-access-8rjm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.037970 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec1e78c5-c821-4f17-b817-a8c68c065097-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec1e78c5-c821-4f17-b817-a8c68c065097" (UID: "ec1e78c5-c821-4f17-b817-a8c68c065097"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.061851 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.102109 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec1e78c5-c821-4f17-b817-a8c68c065097-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.102139 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec1e78c5-c821-4f17-b817-a8c68c065097-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.102151 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rjm8\" (UniqueName: \"kubernetes.io/projected/ec1e78c5-c821-4f17-b817-a8c68c065097-kube-api-access-8rjm8\") on node 
\"crc\" DevicePath \"\"" Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.376124 4843 generic.go:334] "Generic (PLEG): container finished" podID="ec1e78c5-c821-4f17-b817-a8c68c065097" containerID="83d11f565d2b2a845aa332ea22e00062eb83ebc8295b8300838524cf9e34a1e6" exitCode=0 Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.376201 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fm6ck" event={"ID":"ec1e78c5-c821-4f17-b817-a8c68c065097","Type":"ContainerDied","Data":"83d11f565d2b2a845aa332ea22e00062eb83ebc8295b8300838524cf9e34a1e6"} Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.376253 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fm6ck" event={"ID":"ec1e78c5-c821-4f17-b817-a8c68c065097","Type":"ContainerDied","Data":"b46a383fbafc919efbc9d74be7745bd3ddf5f2ef15a2c10f2fe5ed318ca8a160"} Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.376276 4843 scope.go:117] "RemoveContainer" containerID="83d11f565d2b2a845aa332ea22e00062eb83ebc8295b8300838524cf9e34a1e6" Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.377321 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fm6ck" Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.380301 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" event={"ID":"76ab23d5-a490-4e61-b3ae-27e991303a9c","Type":"ContainerStarted","Data":"dc25992c61c94559157db84d78a809ca3812a0c44e3f5c6b7a72558b23bcfa06"} Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.403366 4843 scope.go:117] "RemoveContainer" containerID="9f3856346b68502d462aa0b51c5cb1e6f794320354a61ae67a3a38599d9f64e8" Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.416407 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" podStartSLOduration=2.446542355 podStartE2EDuration="16.41638287s" podCreationTimestamp="2026-03-18 12:37:15 +0000 UTC" firstStartedPulling="2026-03-18 12:37:16.741876692 +0000 UTC m=+1670.457702226" lastFinishedPulling="2026-03-18 12:37:30.711717207 +0000 UTC m=+1684.427542741" observedRunningTime="2026-03-18 12:37:31.399004587 +0000 UTC m=+1685.114830111" watchObservedRunningTime="2026-03-18 12:37:31.41638287 +0000 UTC m=+1685.132208394" Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.430949 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fm6ck"] Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.436944 4843 scope.go:117] "RemoveContainer" containerID="580bf755841f539eed3b15511f59b4e8b562526b9f97bec75eb791f1bcb9b6f9" Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.441379 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fm6ck"] Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.464027 4843 scope.go:117] "RemoveContainer" containerID="83d11f565d2b2a845aa332ea22e00062eb83ebc8295b8300838524cf9e34a1e6" Mar 18 12:37:31 crc kubenswrapper[4843]: E0318 
12:37:31.464585 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83d11f565d2b2a845aa332ea22e00062eb83ebc8295b8300838524cf9e34a1e6\": container with ID starting with 83d11f565d2b2a845aa332ea22e00062eb83ebc8295b8300838524cf9e34a1e6 not found: ID does not exist" containerID="83d11f565d2b2a845aa332ea22e00062eb83ebc8295b8300838524cf9e34a1e6" Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.464632 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83d11f565d2b2a845aa332ea22e00062eb83ebc8295b8300838524cf9e34a1e6"} err="failed to get container status \"83d11f565d2b2a845aa332ea22e00062eb83ebc8295b8300838524cf9e34a1e6\": rpc error: code = NotFound desc = could not find container \"83d11f565d2b2a845aa332ea22e00062eb83ebc8295b8300838524cf9e34a1e6\": container with ID starting with 83d11f565d2b2a845aa332ea22e00062eb83ebc8295b8300838524cf9e34a1e6 not found: ID does not exist" Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.464687 4843 scope.go:117] "RemoveContainer" containerID="9f3856346b68502d462aa0b51c5cb1e6f794320354a61ae67a3a38599d9f64e8" Mar 18 12:37:31 crc kubenswrapper[4843]: E0318 12:37:31.465141 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f3856346b68502d462aa0b51c5cb1e6f794320354a61ae67a3a38599d9f64e8\": container with ID starting with 9f3856346b68502d462aa0b51c5cb1e6f794320354a61ae67a3a38599d9f64e8 not found: ID does not exist" containerID="9f3856346b68502d462aa0b51c5cb1e6f794320354a61ae67a3a38599d9f64e8" Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.465184 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f3856346b68502d462aa0b51c5cb1e6f794320354a61ae67a3a38599d9f64e8"} err="failed to get container status \"9f3856346b68502d462aa0b51c5cb1e6f794320354a61ae67a3a38599d9f64e8\": rpc 
error: code = NotFound desc = could not find container \"9f3856346b68502d462aa0b51c5cb1e6f794320354a61ae67a3a38599d9f64e8\": container with ID starting with 9f3856346b68502d462aa0b51c5cb1e6f794320354a61ae67a3a38599d9f64e8 not found: ID does not exist" Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.465207 4843 scope.go:117] "RemoveContainer" containerID="580bf755841f539eed3b15511f59b4e8b562526b9f97bec75eb791f1bcb9b6f9" Mar 18 12:37:31 crc kubenswrapper[4843]: E0318 12:37:31.465631 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"580bf755841f539eed3b15511f59b4e8b562526b9f97bec75eb791f1bcb9b6f9\": container with ID starting with 580bf755841f539eed3b15511f59b4e8b562526b9f97bec75eb791f1bcb9b6f9 not found: ID does not exist" containerID="580bf755841f539eed3b15511f59b4e8b562526b9f97bec75eb791f1bcb9b6f9" Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.465696 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"580bf755841f539eed3b15511f59b4e8b562526b9f97bec75eb791f1bcb9b6f9"} err="failed to get container status \"580bf755841f539eed3b15511f59b4e8b562526b9f97bec75eb791f1bcb9b6f9\": rpc error: code = NotFound desc = could not find container \"580bf755841f539eed3b15511f59b4e8b562526b9f97bec75eb791f1bcb9b6f9\": container with ID starting with 580bf755841f539eed3b15511f59b4e8b562526b9f97bec75eb791f1bcb9b6f9 not found: ID does not exist" Mar 18 12:37:31 crc kubenswrapper[4843]: I0318 12:37:31.983861 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 18 12:37:31 crc kubenswrapper[4843]: E0318 12:37:31.984333 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:37:32 crc kubenswrapper[4843]: I0318 12:37:32.074874 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:32 crc kubenswrapper[4843]: I0318 12:37:32.994429 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec1e78c5-c821-4f17-b817-a8c68c065097" path="/var/lib/kubelet/pods/ec1e78c5-c821-4f17-b817-a8c68c065097/volumes" Mar 18 12:37:42 crc kubenswrapper[4843]: I0318 12:37:42.519202 4843 generic.go:334] "Generic (PLEG): container finished" podID="76ab23d5-a490-4e61-b3ae-27e991303a9c" containerID="dc25992c61c94559157db84d78a809ca3812a0c44e3f5c6b7a72558b23bcfa06" exitCode=0 Mar 18 12:37:42 crc kubenswrapper[4843]: I0318 12:37:42.519300 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" event={"ID":"76ab23d5-a490-4e61-b3ae-27e991303a9c","Type":"ContainerDied","Data":"dc25992c61c94559157db84d78a809ca3812a0c44e3f5c6b7a72558b23bcfa06"} Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.008665 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.086068 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76ab23d5-a490-4e61-b3ae-27e991303a9c-inventory\") pod \"76ab23d5-a490-4e61-b3ae-27e991303a9c\" (UID: \"76ab23d5-a490-4e61-b3ae-27e991303a9c\") " Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.087311 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76ab23d5-a490-4e61-b3ae-27e991303a9c-repo-setup-combined-ca-bundle\") pod \"76ab23d5-a490-4e61-b3ae-27e991303a9c\" (UID: \"76ab23d5-a490-4e61-b3ae-27e991303a9c\") " Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.087449 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k749v\" (UniqueName: \"kubernetes.io/projected/76ab23d5-a490-4e61-b3ae-27e991303a9c-kube-api-access-k749v\") pod \"76ab23d5-a490-4e61-b3ae-27e991303a9c\" (UID: \"76ab23d5-a490-4e61-b3ae-27e991303a9c\") " Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.087678 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/76ab23d5-a490-4e61-b3ae-27e991303a9c-ssh-key-openstack-edpm-ipam\") pod \"76ab23d5-a490-4e61-b3ae-27e991303a9c\" (UID: \"76ab23d5-a490-4e61-b3ae-27e991303a9c\") " Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.092140 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76ab23d5-a490-4e61-b3ae-27e991303a9c-kube-api-access-k749v" (OuterVolumeSpecName: "kube-api-access-k749v") pod "76ab23d5-a490-4e61-b3ae-27e991303a9c" (UID: "76ab23d5-a490-4e61-b3ae-27e991303a9c"). InnerVolumeSpecName "kube-api-access-k749v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.093842 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76ab23d5-a490-4e61-b3ae-27e991303a9c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "76ab23d5-a490-4e61-b3ae-27e991303a9c" (UID: "76ab23d5-a490-4e61-b3ae-27e991303a9c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.113524 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76ab23d5-a490-4e61-b3ae-27e991303a9c-inventory" (OuterVolumeSpecName: "inventory") pod "76ab23d5-a490-4e61-b3ae-27e991303a9c" (UID: "76ab23d5-a490-4e61-b3ae-27e991303a9c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.149485 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76ab23d5-a490-4e61-b3ae-27e991303a9c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "76ab23d5-a490-4e61-b3ae-27e991303a9c" (UID: "76ab23d5-a490-4e61-b3ae-27e991303a9c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.190974 4843 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76ab23d5-a490-4e61-b3ae-27e991303a9c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.191019 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k749v\" (UniqueName: \"kubernetes.io/projected/76ab23d5-a490-4e61-b3ae-27e991303a9c-kube-api-access-k749v\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.191031 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/76ab23d5-a490-4e61-b3ae-27e991303a9c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.191045 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76ab23d5-a490-4e61-b3ae-27e991303a9c-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.550780 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" event={"ID":"76ab23d5-a490-4e61-b3ae-27e991303a9c","Type":"ContainerDied","Data":"b056eca68a2e0b70813077b9d733696e106c1883952a2fbc356e04839d657996"} Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.550836 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b056eca68a2e0b70813077b9d733696e106c1883952a2fbc356e04839d657996" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.550925 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.667458 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr"] Mar 18 12:37:44 crc kubenswrapper[4843]: E0318 12:37:44.668363 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76ab23d5-a490-4e61-b3ae-27e991303a9c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.668424 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ab23d5-a490-4e61-b3ae-27e991303a9c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 12:37:44 crc kubenswrapper[4843]: E0318 12:37:44.668470 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1e78c5-c821-4f17-b817-a8c68c065097" containerName="extract-utilities" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.668487 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1e78c5-c821-4f17-b817-a8c68c065097" containerName="extract-utilities" Mar 18 12:37:44 crc kubenswrapper[4843]: E0318 12:37:44.668511 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1e78c5-c821-4f17-b817-a8c68c065097" containerName="extract-content" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.668528 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1e78c5-c821-4f17-b817-a8c68c065097" containerName="extract-content" Mar 18 12:37:44 crc kubenswrapper[4843]: E0318 12:37:44.668563 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1e78c5-c821-4f17-b817-a8c68c065097" containerName="registry-server" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.668578 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1e78c5-c821-4f17-b817-a8c68c065097" containerName="registry-server" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.669096 4843 
memory_manager.go:354] "RemoveStaleState removing state" podUID="76ab23d5-a490-4e61-b3ae-27e991303a9c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.669163 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec1e78c5-c821-4f17-b817-a8c68c065097" containerName="registry-server" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.670622 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.673857 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.674223 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.674307 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.674228 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.684156 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr"] Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.701395 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c4d5f3c0-524f-472b-b3f0-bcb8f735420f-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2shvr\" (UID: \"c4d5f3c0-524f-472b-b3f0-bcb8f735420f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr" Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 
12:37:44.701509 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4d5f3c0-524f-472b-b3f0-bcb8f735420f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2shvr\" (UID: \"c4d5f3c0-524f-472b-b3f0-bcb8f735420f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr"
Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.701606 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g286f\" (UniqueName: \"kubernetes.io/projected/c4d5f3c0-524f-472b-b3f0-bcb8f735420f-kube-api-access-g286f\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2shvr\" (UID: \"c4d5f3c0-524f-472b-b3f0-bcb8f735420f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr"
Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.802985 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c4d5f3c0-524f-472b-b3f0-bcb8f735420f-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2shvr\" (UID: \"c4d5f3c0-524f-472b-b3f0-bcb8f735420f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr"
Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.803107 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4d5f3c0-524f-472b-b3f0-bcb8f735420f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2shvr\" (UID: \"c4d5f3c0-524f-472b-b3f0-bcb8f735420f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr"
Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.803203 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g286f\" (UniqueName: \"kubernetes.io/projected/c4d5f3c0-524f-472b-b3f0-bcb8f735420f-kube-api-access-g286f\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2shvr\" (UID: \"c4d5f3c0-524f-472b-b3f0-bcb8f735420f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr"
Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.806102 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c4d5f3c0-524f-472b-b3f0-bcb8f735420f-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2shvr\" (UID: \"c4d5f3c0-524f-472b-b3f0-bcb8f735420f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr"
Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.806303 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4d5f3c0-524f-472b-b3f0-bcb8f735420f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2shvr\" (UID: \"c4d5f3c0-524f-472b-b3f0-bcb8f735420f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr"
Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.823532 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g286f\" (UniqueName: \"kubernetes.io/projected/c4d5f3c0-524f-472b-b3f0-bcb8f735420f-kube-api-access-g286f\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2shvr\" (UID: \"c4d5f3c0-524f-472b-b3f0-bcb8f735420f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr"
Mar 18 12:37:44 crc kubenswrapper[4843]: I0318 12:37:44.996605 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr"
Mar 18 12:37:45 crc kubenswrapper[4843]: I0318 12:37:45.582378 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr"]
Mar 18 12:37:46 crc kubenswrapper[4843]: I0318 12:37:46.571695 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr" event={"ID":"c4d5f3c0-524f-472b-b3f0-bcb8f735420f","Type":"ContainerStarted","Data":"9c4c16f30e0b99e45b5fdb4aef9109fe6c4ace0ce693e22bdd854911fc45bb47"}
Mar 18 12:37:46 crc kubenswrapper[4843]: I0318 12:37:46.572010 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr" event={"ID":"c4d5f3c0-524f-472b-b3f0-bcb8f735420f","Type":"ContainerStarted","Data":"4a8a5a4de42c82ef7743e9438055364405a9d2daafe20b3c54a9ec5a8e807c00"}
Mar 18 12:37:46 crc kubenswrapper[4843]: I0318 12:37:46.607966 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr" podStartSLOduration=2.422569582 podStartE2EDuration="2.607931824s" podCreationTimestamp="2026-03-18 12:37:44 +0000 UTC" firstStartedPulling="2026-03-18 12:37:45.586608812 +0000 UTC m=+1699.302434326" lastFinishedPulling="2026-03-18 12:37:45.771971044 +0000 UTC m=+1699.487796568" observedRunningTime="2026-03-18 12:37:46.590672744 +0000 UTC m=+1700.306498268" watchObservedRunningTime="2026-03-18 12:37:46.607931824 +0000 UTC m=+1700.323757388"
Mar 18 12:37:47 crc kubenswrapper[4843]: I0318 12:37:47.003775 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24"
Mar 18 12:37:47 crc kubenswrapper[4843]: E0318 12:37:47.005130 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 12:37:47 crc kubenswrapper[4843]: I0318 12:37:47.528269 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mgmxn"]
Mar 18 12:37:47 crc kubenswrapper[4843]: I0318 12:37:47.531388 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mgmxn"
Mar 18 12:37:47 crc kubenswrapper[4843]: I0318 12:37:47.541131 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mgmxn"]
Mar 18 12:37:47 crc kubenswrapper[4843]: I0318 12:37:47.664353 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c62s\" (UniqueName: \"kubernetes.io/projected/62c6448b-970f-41e5-a4c1-a4b5ddd53ee4-kube-api-access-4c62s\") pod \"certified-operators-mgmxn\" (UID: \"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4\") " pod="openshift-marketplace/certified-operators-mgmxn"
Mar 18 12:37:47 crc kubenswrapper[4843]: I0318 12:37:47.664436 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62c6448b-970f-41e5-a4c1-a4b5ddd53ee4-catalog-content\") pod \"certified-operators-mgmxn\" (UID: \"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4\") " pod="openshift-marketplace/certified-operators-mgmxn"
Mar 18 12:37:47 crc kubenswrapper[4843]: I0318 12:37:47.664460 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62c6448b-970f-41e5-a4c1-a4b5ddd53ee4-utilities\") pod \"certified-operators-mgmxn\" (UID: \"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4\") " pod="openshift-marketplace/certified-operators-mgmxn"
Mar 18 12:37:47 crc kubenswrapper[4843]: I0318 12:37:47.767088 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c62s\" (UniqueName: \"kubernetes.io/projected/62c6448b-970f-41e5-a4c1-a4b5ddd53ee4-kube-api-access-4c62s\") pod \"certified-operators-mgmxn\" (UID: \"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4\") " pod="openshift-marketplace/certified-operators-mgmxn"
Mar 18 12:37:47 crc kubenswrapper[4843]: I0318 12:37:47.767266 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62c6448b-970f-41e5-a4c1-a4b5ddd53ee4-catalog-content\") pod \"certified-operators-mgmxn\" (UID: \"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4\") " pod="openshift-marketplace/certified-operators-mgmxn"
Mar 18 12:37:47 crc kubenswrapper[4843]: I0318 12:37:47.767356 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62c6448b-970f-41e5-a4c1-a4b5ddd53ee4-utilities\") pod \"certified-operators-mgmxn\" (UID: \"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4\") " pod="openshift-marketplace/certified-operators-mgmxn"
Mar 18 12:37:47 crc kubenswrapper[4843]: I0318 12:37:47.767943 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62c6448b-970f-41e5-a4c1-a4b5ddd53ee4-utilities\") pod \"certified-operators-mgmxn\" (UID: \"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4\") " pod="openshift-marketplace/certified-operators-mgmxn"
Mar 18 12:37:47 crc kubenswrapper[4843]: I0318 12:37:47.767995 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62c6448b-970f-41e5-a4c1-a4b5ddd53ee4-catalog-content\") pod \"certified-operators-mgmxn\" (UID: \"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4\") " pod="openshift-marketplace/certified-operators-mgmxn"
Mar 18 12:37:47 crc kubenswrapper[4843]: I0318 12:37:47.786785 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c62s\" (UniqueName: \"kubernetes.io/projected/62c6448b-970f-41e5-a4c1-a4b5ddd53ee4-kube-api-access-4c62s\") pod \"certified-operators-mgmxn\" (UID: \"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4\") " pod="openshift-marketplace/certified-operators-mgmxn"
Mar 18 12:37:47 crc kubenswrapper[4843]: I0318 12:37:47.861588 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mgmxn"
Mar 18 12:37:48 crc kubenswrapper[4843]: I0318 12:37:48.408771 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mgmxn"]
Mar 18 12:37:48 crc kubenswrapper[4843]: W0318 12:37:48.420043 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62c6448b_970f_41e5_a4c1_a4b5ddd53ee4.slice/crio-3d0f6635fbc60f21373aa0c2d8cf40bc14d2319943c86a4510b57b2fef67e0c9 WatchSource:0}: Error finding container 3d0f6635fbc60f21373aa0c2d8cf40bc14d2319943c86a4510b57b2fef67e0c9: Status 404 returned error can't find the container with id 3d0f6635fbc60f21373aa0c2d8cf40bc14d2319943c86a4510b57b2fef67e0c9
Mar 18 12:37:48 crc kubenswrapper[4843]: I0318 12:37:48.599517 4843 generic.go:334] "Generic (PLEG): container finished" podID="c4d5f3c0-524f-472b-b3f0-bcb8f735420f" containerID="9c4c16f30e0b99e45b5fdb4aef9109fe6c4ace0ce693e22bdd854911fc45bb47" exitCode=0
Mar 18 12:37:48 crc kubenswrapper[4843]: I0318 12:37:48.599589 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr" event={"ID":"c4d5f3c0-524f-472b-b3f0-bcb8f735420f","Type":"ContainerDied","Data":"9c4c16f30e0b99e45b5fdb4aef9109fe6c4ace0ce693e22bdd854911fc45bb47"}
Mar 18 12:37:48 crc kubenswrapper[4843]: I0318 12:37:48.603172 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgmxn" event={"ID":"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4","Type":"ContainerStarted","Data":"3d0f6635fbc60f21373aa0c2d8cf40bc14d2319943c86a4510b57b2fef67e0c9"}
Mar 18 12:37:49 crc kubenswrapper[4843]: I0318 12:37:49.619085 4843 generic.go:334] "Generic (PLEG): container finished" podID="62c6448b-970f-41e5-a4c1-a4b5ddd53ee4" containerID="e0dac3ba0412ce159325bdb5e66ca4ac6a81d49e26ea8306424291c205c07381" exitCode=0
Mar 18 12:37:49 crc kubenswrapper[4843]: I0318 12:37:49.619240 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgmxn" event={"ID":"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4","Type":"ContainerDied","Data":"e0dac3ba0412ce159325bdb5e66ca4ac6a81d49e26ea8306424291c205c07381"}
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.106701 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.226491 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g286f\" (UniqueName: \"kubernetes.io/projected/c4d5f3c0-524f-472b-b3f0-bcb8f735420f-kube-api-access-g286f\") pod \"c4d5f3c0-524f-472b-b3f0-bcb8f735420f\" (UID: \"c4d5f3c0-524f-472b-b3f0-bcb8f735420f\") "
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.226709 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c4d5f3c0-524f-472b-b3f0-bcb8f735420f-ssh-key-openstack-edpm-ipam\") pod \"c4d5f3c0-524f-472b-b3f0-bcb8f735420f\" (UID: \"c4d5f3c0-524f-472b-b3f0-bcb8f735420f\") "
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.226753 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4d5f3c0-524f-472b-b3f0-bcb8f735420f-inventory\") pod \"c4d5f3c0-524f-472b-b3f0-bcb8f735420f\" (UID: \"c4d5f3c0-524f-472b-b3f0-bcb8f735420f\") "
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.247935 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4d5f3c0-524f-472b-b3f0-bcb8f735420f-kube-api-access-g286f" (OuterVolumeSpecName: "kube-api-access-g286f") pod "c4d5f3c0-524f-472b-b3f0-bcb8f735420f" (UID: "c4d5f3c0-524f-472b-b3f0-bcb8f735420f"). InnerVolumeSpecName "kube-api-access-g286f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.331035 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g286f\" (UniqueName: \"kubernetes.io/projected/c4d5f3c0-524f-472b-b3f0-bcb8f735420f-kube-api-access-g286f\") on node \"crc\" DevicePath \"\""
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.335684 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4d5f3c0-524f-472b-b3f0-bcb8f735420f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c4d5f3c0-524f-472b-b3f0-bcb8f735420f" (UID: "c4d5f3c0-524f-472b-b3f0-bcb8f735420f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.354845 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4d5f3c0-524f-472b-b3f0-bcb8f735420f-inventory" (OuterVolumeSpecName: "inventory") pod "c4d5f3c0-524f-472b-b3f0-bcb8f735420f" (UID: "c4d5f3c0-524f-472b-b3f0-bcb8f735420f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.433124 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c4d5f3c0-524f-472b-b3f0-bcb8f735420f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.433177 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4d5f3c0-524f-472b-b3f0-bcb8f735420f-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.628536 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr" event={"ID":"c4d5f3c0-524f-472b-b3f0-bcb8f735420f","Type":"ContainerDied","Data":"4a8a5a4de42c82ef7743e9438055364405a9d2daafe20b3c54a9ec5a8e807c00"}
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.628582 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a8a5a4de42c82ef7743e9438055364405a9d2daafe20b3c54a9ec5a8e807c00"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.628591 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2shvr"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.631702 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgmxn" event={"ID":"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4","Type":"ContainerStarted","Data":"5c55b346a99fe3ec6aaf9f1e3e0b9f6f5e16f2e0221752d38e5a6bd53131a710"}
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.727584 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7"]
Mar 18 12:37:50 crc kubenswrapper[4843]: E0318 12:37:50.728144 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d5f3c0-524f-472b-b3f0-bcb8f735420f" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.728166 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d5f3c0-524f-472b-b3f0-bcb8f735420f" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.728435 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d5f3c0-524f-472b-b3f0-bcb8f735420f" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.729313 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.731104 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.731315 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.731533 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.738616 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7"]
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.740509 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.840356 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49hkq\" (UniqueName: \"kubernetes.io/projected/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-kube-api-access-49hkq\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7\" (UID: \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.840793 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7\" (UID: \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.840886 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7\" (UID: \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.840981 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7\" (UID: \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.943759 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49hkq\" (UniqueName: \"kubernetes.io/projected/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-kube-api-access-49hkq\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7\" (UID: \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.943832 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7\" (UID: \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.943882 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7\" (UID: \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.943919 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7\" (UID: \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.950507 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7\" (UID: \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.951261 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7\" (UID: \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.951680 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7\" (UID: \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7"
Mar 18 12:37:50 crc kubenswrapper[4843]: I0318 12:37:50.965772 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49hkq\" (UniqueName: \"kubernetes.io/projected/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-kube-api-access-49hkq\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7\" (UID: \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7"
Mar 18 12:37:51 crc kubenswrapper[4843]: I0318 12:37:51.056719 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7"
Mar 18 12:37:51 crc kubenswrapper[4843]: I0318 12:37:51.613608 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7"]
Mar 18 12:37:51 crc kubenswrapper[4843]: I0318 12:37:51.649905 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgmxn" event={"ID":"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4","Type":"ContainerDied","Data":"5c55b346a99fe3ec6aaf9f1e3e0b9f6f5e16f2e0221752d38e5a6bd53131a710"}
Mar 18 12:37:51 crc kubenswrapper[4843]: I0318 12:37:51.649711 4843 generic.go:334] "Generic (PLEG): container finished" podID="62c6448b-970f-41e5-a4c1-a4b5ddd53ee4" containerID="5c55b346a99fe3ec6aaf9f1e3e0b9f6f5e16f2e0221752d38e5a6bd53131a710" exitCode=0
Mar 18 12:37:51 crc kubenswrapper[4843]: I0318 12:37:51.657033 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7" event={"ID":"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab","Type":"ContainerStarted","Data":"f3ac132f85d71e5d078131abfd303d1b46321c01486df26dc6a1ddef1745aa63"}
Mar 18 12:37:52 crc kubenswrapper[4843]: I0318 12:37:52.671249 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgmxn" event={"ID":"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4","Type":"ContainerStarted","Data":"deb9bf1173cda8b400448cf2fe787f6f6eb8185000c35a5d8a495119dff65025"}
Mar 18 12:37:52 crc kubenswrapper[4843]: I0318 12:37:52.673293 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7" event={"ID":"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab","Type":"ContainerStarted","Data":"8c78b3a68e837c6c778fd5f5174e46279979bb29f40156aed1743c78b520f522"}
Mar 18 12:37:52 crc kubenswrapper[4843]: I0318 12:37:52.720703 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mgmxn" podStartSLOduration=3.2712823269999998 podStartE2EDuration="5.720601715s" podCreationTimestamp="2026-03-18 12:37:47 +0000 UTC" firstStartedPulling="2026-03-18 12:37:49.624143086 +0000 UTC m=+1703.339968630" lastFinishedPulling="2026-03-18 12:37:52.073462494 +0000 UTC m=+1705.789288018" observedRunningTime="2026-03-18 12:37:52.692090836 +0000 UTC m=+1706.407916370" watchObservedRunningTime="2026-03-18 12:37:52.720601715 +0000 UTC m=+1706.436427259"
Mar 18 12:37:52 crc kubenswrapper[4843]: I0318 12:37:52.748470 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7" podStartSLOduration=2.569008362 podStartE2EDuration="2.748448756s" podCreationTimestamp="2026-03-18 12:37:50 +0000 UTC" firstStartedPulling="2026-03-18 12:37:51.624520541 +0000 UTC m=+1705.340346105" lastFinishedPulling="2026-03-18 12:37:51.803960975 +0000 UTC m=+1705.519786499" observedRunningTime="2026-03-18 12:37:52.723859347 +0000 UTC m=+1706.439684871" watchObservedRunningTime="2026-03-18 12:37:52.748448756 +0000 UTC m=+1706.464274280"
Mar 18 12:37:57 crc kubenswrapper[4843]: I0318 12:37:57.862146 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mgmxn"
Mar 18 12:37:57 crc kubenswrapper[4843]: I0318 12:37:57.862827 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mgmxn"
Mar 18 12:37:57 crc kubenswrapper[4843]: I0318 12:37:57.918273 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mgmxn"
Mar 18 12:37:59 crc kubenswrapper[4843]: I0318 12:37:59.003491 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24"
Mar 18 12:37:59 crc kubenswrapper[4843]: E0318 12:37:59.004018 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 12:37:59 crc kubenswrapper[4843]: I0318 12:37:59.113674 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mgmxn"
Mar 18 12:37:59 crc kubenswrapper[4843]: I0318 12:37:59.182362 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mgmxn"]
Mar 18 12:38:00 crc kubenswrapper[4843]: I0318 12:38:00.151646 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563958-rtrsj"]
Mar 18 12:38:00 crc kubenswrapper[4843]: I0318 12:38:00.153904 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563958-rtrsj"
Mar 18 12:38:00 crc kubenswrapper[4843]: I0318 12:38:00.156581 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 12:38:00 crc kubenswrapper[4843]: I0318 12:38:00.156635 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk"
Mar 18 12:38:00 crc kubenswrapper[4843]: I0318 12:38:00.157232 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 12:38:00 crc kubenswrapper[4843]: I0318 12:38:00.168080 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563958-rtrsj"]
Mar 18 12:38:00 crc kubenswrapper[4843]: I0318 12:38:00.300010 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh294\" (UniqueName: \"kubernetes.io/projected/4a0bd964-b1ca-4243-991e-2bce7b1b8c84-kube-api-access-zh294\") pod \"auto-csr-approver-29563958-rtrsj\" (UID: \"4a0bd964-b1ca-4243-991e-2bce7b1b8c84\") " pod="openshift-infra/auto-csr-approver-29563958-rtrsj"
Mar 18 12:38:00 crc kubenswrapper[4843]: I0318 12:38:00.434067 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh294\" (UniqueName: \"kubernetes.io/projected/4a0bd964-b1ca-4243-991e-2bce7b1b8c84-kube-api-access-zh294\") pod \"auto-csr-approver-29563958-rtrsj\" (UID: \"4a0bd964-b1ca-4243-991e-2bce7b1b8c84\") " pod="openshift-infra/auto-csr-approver-29563958-rtrsj"
Mar 18 12:38:00 crc kubenswrapper[4843]: I0318 12:38:00.460641 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh294\" (UniqueName: \"kubernetes.io/projected/4a0bd964-b1ca-4243-991e-2bce7b1b8c84-kube-api-access-zh294\") pod \"auto-csr-approver-29563958-rtrsj\" (UID: \"4a0bd964-b1ca-4243-991e-2bce7b1b8c84\") " pod="openshift-infra/auto-csr-approver-29563958-rtrsj"
Mar 18 12:38:00 crc kubenswrapper[4843]: I0318 12:38:00.478333 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563958-rtrsj"
Mar 18 12:38:00 crc kubenswrapper[4843]: I0318 12:38:00.936451 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563958-rtrsj"]
Mar 18 12:38:01 crc kubenswrapper[4843]: I0318 12:38:01.101636 4843 scope.go:117] "RemoveContainer" containerID="c8f84dc934e1494b8be8261626bed21da2f91fd351e535474b8bae3a1b60ade6"
Mar 18 12:38:01 crc kubenswrapper[4843]: I0318 12:38:01.102685 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mgmxn" podUID="62c6448b-970f-41e5-a4c1-a4b5ddd53ee4" containerName="registry-server" containerID="cri-o://deb9bf1173cda8b400448cf2fe787f6f6eb8185000c35a5d8a495119dff65025" gracePeriod=2
Mar 18 12:38:01 crc kubenswrapper[4843]: I0318 12:38:01.108476 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563958-rtrsj" event={"ID":"4a0bd964-b1ca-4243-991e-2bce7b1b8c84","Type":"ContainerStarted","Data":"365b7b1f3197e1cdef38cc83ca56f4ef993c7a4ac08694463625bf24dcbe465a"}
Mar 18 12:38:01 crc kubenswrapper[4843]: I0318 12:38:01.137603 4843 scope.go:117] "RemoveContainer" containerID="d18ff4054a4fa49ce68ba30bb5d9dd0ac68a79dd0409bf2a6e344e4166715504"
Mar 18 12:38:01 crc kubenswrapper[4843]: I0318 12:38:01.186932 4843 scope.go:117] "RemoveContainer" containerID="7801c60089bb685762c5df2ec2757e4ff8016d98b29cd7766b6cb3bd34b37f53"
Mar 18 12:38:01 crc kubenswrapper[4843]: I0318 12:38:01.237976 4843 scope.go:117] "RemoveContainer" containerID="d885470a42a8c4aef1e7dd2897a5cc93b1a08a58322a56f930810ccae974ef1e"
Mar 18 12:38:01 crc kubenswrapper[4843]: I0318 12:38:01.281262 4843 scope.go:117] "RemoveContainer" containerID="3a92f80ff331232ad5660fb596cf0ee8429ac8e0fcc6b12246afce94cad09de1"
Mar 18 12:38:01 crc kubenswrapper[4843]: I0318 12:38:01.449515 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mgmxn"
Mar 18 12:38:01 crc kubenswrapper[4843]: I0318 12:38:01.577629 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62c6448b-970f-41e5-a4c1-a4b5ddd53ee4-utilities\") pod \"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4\" (UID: \"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4\") "
Mar 18 12:38:01 crc kubenswrapper[4843]: I0318 12:38:01.577949 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c62s\" (UniqueName: \"kubernetes.io/projected/62c6448b-970f-41e5-a4c1-a4b5ddd53ee4-kube-api-access-4c62s\") pod \"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4\" (UID: \"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4\") "
Mar 18 12:38:01 crc kubenswrapper[4843]: I0318 12:38:01.577997 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62c6448b-970f-41e5-a4c1-a4b5ddd53ee4-catalog-content\") pod \"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4\" (UID: \"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4\") "
Mar 18 12:38:01 crc kubenswrapper[4843]: I0318 12:38:01.579119 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62c6448b-970f-41e5-a4c1-a4b5ddd53ee4-utilities" (OuterVolumeSpecName: "utilities") pod "62c6448b-970f-41e5-a4c1-a4b5ddd53ee4" (UID: "62c6448b-970f-41e5-a4c1-a4b5ddd53ee4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:38:01 crc kubenswrapper[4843]: I0318 12:38:01.585929 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62c6448b-970f-41e5-a4c1-a4b5ddd53ee4-kube-api-access-4c62s" (OuterVolumeSpecName: "kube-api-access-4c62s") pod "62c6448b-970f-41e5-a4c1-a4b5ddd53ee4" (UID: "62c6448b-970f-41e5-a4c1-a4b5ddd53ee4"). InnerVolumeSpecName "kube-api-access-4c62s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:38:01 crc kubenswrapper[4843]: I0318 12:38:01.648090 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62c6448b-970f-41e5-a4c1-a4b5ddd53ee4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62c6448b-970f-41e5-a4c1-a4b5ddd53ee4" (UID: "62c6448b-970f-41e5-a4c1-a4b5ddd53ee4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:38:01 crc kubenswrapper[4843]: I0318 12:38:01.680534 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62c6448b-970f-41e5-a4c1-a4b5ddd53ee4-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 12:38:01 crc kubenswrapper[4843]: I0318 12:38:01.680600 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c62s\" (UniqueName: \"kubernetes.io/projected/62c6448b-970f-41e5-a4c1-a4b5ddd53ee4-kube-api-access-4c62s\") on node \"crc\" DevicePath \"\""
Mar 18 12:38:01 crc kubenswrapper[4843]: I0318 12:38:01.680613 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62c6448b-970f-41e5-a4c1-a4b5ddd53ee4-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 12:38:02 crc kubenswrapper[4843]: I0318 12:38:02.114281 4843 generic.go:334] "Generic (PLEG): container finished" podID="62c6448b-970f-41e5-a4c1-a4b5ddd53ee4" containerID="deb9bf1173cda8b400448cf2fe787f6f6eb8185000c35a5d8a495119dff65025" exitCode=0
Mar 18 12:38:02 crc kubenswrapper[4843]: I0318 12:38:02.114339 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mgmxn"
Mar 18 12:38:02 crc kubenswrapper[4843]: I0318 12:38:02.114380 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgmxn" event={"ID":"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4","Type":"ContainerDied","Data":"deb9bf1173cda8b400448cf2fe787f6f6eb8185000c35a5d8a495119dff65025"}
Mar 18 12:38:02 crc kubenswrapper[4843]: I0318 12:38:02.114428 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgmxn" event={"ID":"62c6448b-970f-41e5-a4c1-a4b5ddd53ee4","Type":"ContainerDied","Data":"3d0f6635fbc60f21373aa0c2d8cf40bc14d2319943c86a4510b57b2fef67e0c9"}
Mar 18 12:38:02 crc kubenswrapper[4843]: I0318 12:38:02.114462 4843 scope.go:117] "RemoveContainer" containerID="deb9bf1173cda8b400448cf2fe787f6f6eb8185000c35a5d8a495119dff65025"
Mar 18 12:38:02 crc kubenswrapper[4843]: I0318 12:38:02.141692 4843 scope.go:117] "RemoveContainer" containerID="5c55b346a99fe3ec6aaf9f1e3e0b9f6f5e16f2e0221752d38e5a6bd53131a710"
Mar 18 12:38:02 crc kubenswrapper[4843]: I0318 12:38:02.156005 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mgmxn"]
Mar 18 12:38:02 crc kubenswrapper[4843]: I0318 12:38:02.171564 4843 scope.go:117] "RemoveContainer" containerID="e0dac3ba0412ce159325bdb5e66ca4ac6a81d49e26ea8306424291c205c07381"
Mar 18 12:38:02 crc kubenswrapper[4843]: I0318 12:38:02.180281 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mgmxn"]
Mar 18 12:38:02 crc kubenswrapper[4843]: I0318 12:38:02.235783 4843 scope.go:117] "RemoveContainer" containerID="deb9bf1173cda8b400448cf2fe787f6f6eb8185000c35a5d8a495119dff65025"
Mar 18 
12:38:02 crc kubenswrapper[4843]: E0318 12:38:02.236537 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deb9bf1173cda8b400448cf2fe787f6f6eb8185000c35a5d8a495119dff65025\": container with ID starting with deb9bf1173cda8b400448cf2fe787f6f6eb8185000c35a5d8a495119dff65025 not found: ID does not exist" containerID="deb9bf1173cda8b400448cf2fe787f6f6eb8185000c35a5d8a495119dff65025" Mar 18 12:38:02 crc kubenswrapper[4843]: I0318 12:38:02.236754 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb9bf1173cda8b400448cf2fe787f6f6eb8185000c35a5d8a495119dff65025"} err="failed to get container status \"deb9bf1173cda8b400448cf2fe787f6f6eb8185000c35a5d8a495119dff65025\": rpc error: code = NotFound desc = could not find container \"deb9bf1173cda8b400448cf2fe787f6f6eb8185000c35a5d8a495119dff65025\": container with ID starting with deb9bf1173cda8b400448cf2fe787f6f6eb8185000c35a5d8a495119dff65025 not found: ID does not exist" Mar 18 12:38:02 crc kubenswrapper[4843]: I0318 12:38:02.236791 4843 scope.go:117] "RemoveContainer" containerID="5c55b346a99fe3ec6aaf9f1e3e0b9f6f5e16f2e0221752d38e5a6bd53131a710" Mar 18 12:38:02 crc kubenswrapper[4843]: E0318 12:38:02.237262 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c55b346a99fe3ec6aaf9f1e3e0b9f6f5e16f2e0221752d38e5a6bd53131a710\": container with ID starting with 5c55b346a99fe3ec6aaf9f1e3e0b9f6f5e16f2e0221752d38e5a6bd53131a710 not found: ID does not exist" containerID="5c55b346a99fe3ec6aaf9f1e3e0b9f6f5e16f2e0221752d38e5a6bd53131a710" Mar 18 12:38:02 crc kubenswrapper[4843]: I0318 12:38:02.237290 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c55b346a99fe3ec6aaf9f1e3e0b9f6f5e16f2e0221752d38e5a6bd53131a710"} err="failed to get container status 
\"5c55b346a99fe3ec6aaf9f1e3e0b9f6f5e16f2e0221752d38e5a6bd53131a710\": rpc error: code = NotFound desc = could not find container \"5c55b346a99fe3ec6aaf9f1e3e0b9f6f5e16f2e0221752d38e5a6bd53131a710\": container with ID starting with 5c55b346a99fe3ec6aaf9f1e3e0b9f6f5e16f2e0221752d38e5a6bd53131a710 not found: ID does not exist" Mar 18 12:38:02 crc kubenswrapper[4843]: I0318 12:38:02.237308 4843 scope.go:117] "RemoveContainer" containerID="e0dac3ba0412ce159325bdb5e66ca4ac6a81d49e26ea8306424291c205c07381" Mar 18 12:38:02 crc kubenswrapper[4843]: E0318 12:38:02.237584 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0dac3ba0412ce159325bdb5e66ca4ac6a81d49e26ea8306424291c205c07381\": container with ID starting with e0dac3ba0412ce159325bdb5e66ca4ac6a81d49e26ea8306424291c205c07381 not found: ID does not exist" containerID="e0dac3ba0412ce159325bdb5e66ca4ac6a81d49e26ea8306424291c205c07381" Mar 18 12:38:02 crc kubenswrapper[4843]: I0318 12:38:02.237622 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0dac3ba0412ce159325bdb5e66ca4ac6a81d49e26ea8306424291c205c07381"} err="failed to get container status \"e0dac3ba0412ce159325bdb5e66ca4ac6a81d49e26ea8306424291c205c07381\": rpc error: code = NotFound desc = could not find container \"e0dac3ba0412ce159325bdb5e66ca4ac6a81d49e26ea8306424291c205c07381\": container with ID starting with e0dac3ba0412ce159325bdb5e66ca4ac6a81d49e26ea8306424291c205c07381 not found: ID does not exist" Mar 18 12:38:03 crc kubenswrapper[4843]: I0318 12:38:03.050081 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62c6448b-970f-41e5-a4c1-a4b5ddd53ee4" path="/var/lib/kubelet/pods/62c6448b-970f-41e5-a4c1-a4b5ddd53ee4/volumes" Mar 18 12:38:03 crc kubenswrapper[4843]: I0318 12:38:03.126400 4843 generic.go:334] "Generic (PLEG): container finished" podID="4a0bd964-b1ca-4243-991e-2bce7b1b8c84" 
containerID="c9c983ab1d9a77b9028379369f22cbaef829f1c68ec2079dc8812ee4b0840a74" exitCode=0 Mar 18 12:38:03 crc kubenswrapper[4843]: I0318 12:38:03.126447 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563958-rtrsj" event={"ID":"4a0bd964-b1ca-4243-991e-2bce7b1b8c84","Type":"ContainerDied","Data":"c9c983ab1d9a77b9028379369f22cbaef829f1c68ec2079dc8812ee4b0840a74"} Mar 18 12:38:04 crc kubenswrapper[4843]: I0318 12:38:04.699922 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563958-rtrsj" Mar 18 12:38:04 crc kubenswrapper[4843]: I0318 12:38:04.858171 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh294\" (UniqueName: \"kubernetes.io/projected/4a0bd964-b1ca-4243-991e-2bce7b1b8c84-kube-api-access-zh294\") pod \"4a0bd964-b1ca-4243-991e-2bce7b1b8c84\" (UID: \"4a0bd964-b1ca-4243-991e-2bce7b1b8c84\") " Mar 18 12:38:04 crc kubenswrapper[4843]: I0318 12:38:04.871266 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a0bd964-b1ca-4243-991e-2bce7b1b8c84-kube-api-access-zh294" (OuterVolumeSpecName: "kube-api-access-zh294") pod "4a0bd964-b1ca-4243-991e-2bce7b1b8c84" (UID: "4a0bd964-b1ca-4243-991e-2bce7b1b8c84"). InnerVolumeSpecName "kube-api-access-zh294". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:38:04 crc kubenswrapper[4843]: I0318 12:38:04.961049 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh294\" (UniqueName: \"kubernetes.io/projected/4a0bd964-b1ca-4243-991e-2bce7b1b8c84-kube-api-access-zh294\") on node \"crc\" DevicePath \"\"" Mar 18 12:38:05 crc kubenswrapper[4843]: I0318 12:38:05.395746 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563958-rtrsj" event={"ID":"4a0bd964-b1ca-4243-991e-2bce7b1b8c84","Type":"ContainerDied","Data":"365b7b1f3197e1cdef38cc83ca56f4ef993c7a4ac08694463625bf24dcbe465a"} Mar 18 12:38:05 crc kubenswrapper[4843]: I0318 12:38:05.395952 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="365b7b1f3197e1cdef38cc83ca56f4ef993c7a4ac08694463625bf24dcbe465a" Mar 18 12:38:05 crc kubenswrapper[4843]: I0318 12:38:05.395791 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563958-rtrsj" Mar 18 12:38:05 crc kubenswrapper[4843]: I0318 12:38:05.791552 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563952-gfrqt"] Mar 18 12:38:05 crc kubenswrapper[4843]: I0318 12:38:05.799414 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563952-gfrqt"] Mar 18 12:38:07 crc kubenswrapper[4843]: I0318 12:38:07.004771 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1fb0226-598c-4497-8d1f-8711a07f13a6" path="/var/lib/kubelet/pods/f1fb0226-598c-4497-8d1f-8711a07f13a6/volumes" Mar 18 12:38:12 crc kubenswrapper[4843]: I0318 12:38:12.984496 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 18 12:38:12 crc kubenswrapper[4843]: E0318 12:38:12.985516 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:38:27 crc kubenswrapper[4843]: I0318 12:38:27.984935 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 18 12:38:27 crc kubenswrapper[4843]: E0318 12:38:27.986412 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:38:42 crc kubenswrapper[4843]: I0318 12:38:42.984149 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 18 12:38:42 crc kubenswrapper[4843]: E0318 12:38:42.984941 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:38:57 crc kubenswrapper[4843]: I0318 12:38:57.983977 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 18 12:38:57 crc kubenswrapper[4843]: E0318 12:38:57.984982 4843 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:39:01 crc kubenswrapper[4843]: I0318 12:39:01.432079 4843 scope.go:117] "RemoveContainer" containerID="ebfc5b6a6da817c7161344f5173ae8dffd127485c51c681c5f7f5beac79e8c00" Mar 18 12:39:01 crc kubenswrapper[4843]: I0318 12:39:01.460826 4843 scope.go:117] "RemoveContainer" containerID="3ce515a7b798c870df4df626ac4d5f2aed5b96f7feee3f62285749a525f1b613" Mar 18 12:39:01 crc kubenswrapper[4843]: I0318 12:39:01.513049 4843 scope.go:117] "RemoveContainer" containerID="d59a8d024fb419c81992a96af82ccabddd6bd91e85d89a50af97553cf048110b" Mar 18 12:39:01 crc kubenswrapper[4843]: I0318 12:39:01.560501 4843 scope.go:117] "RemoveContainer" containerID="54367971f1ac999dc3e4856a8fb90d7293cc68bcd2ef1102827ff4a6c7481412" Mar 18 12:39:12 crc kubenswrapper[4843]: I0318 12:39:12.984218 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 18 12:39:12 crc kubenswrapper[4843]: E0318 12:39:12.985039 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:39:27 crc kubenswrapper[4843]: I0318 12:39:27.986972 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 18 12:39:27 crc kubenswrapper[4843]: E0318 
12:39:27.988076 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:39:42 crc kubenswrapper[4843]: I0318 12:39:42.985090 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 18 12:39:42 crc kubenswrapper[4843]: E0318 12:39:42.986261 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:39:55 crc kubenswrapper[4843]: I0318 12:39:55.984198 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 18 12:39:55 crc kubenswrapper[4843]: E0318 12:39:55.985064 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:40:00 crc kubenswrapper[4843]: I0318 12:40:00.209370 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563960-nwkgt"] Mar 18 12:40:00 crc 
kubenswrapper[4843]: E0318 12:40:00.210100 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c6448b-970f-41e5-a4c1-a4b5ddd53ee4" containerName="extract-utilities" Mar 18 12:40:00 crc kubenswrapper[4843]: I0318 12:40:00.210113 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c6448b-970f-41e5-a4c1-a4b5ddd53ee4" containerName="extract-utilities" Mar 18 12:40:00 crc kubenswrapper[4843]: E0318 12:40:00.210149 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a0bd964-b1ca-4243-991e-2bce7b1b8c84" containerName="oc" Mar 18 12:40:00 crc kubenswrapper[4843]: I0318 12:40:00.210156 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a0bd964-b1ca-4243-991e-2bce7b1b8c84" containerName="oc" Mar 18 12:40:00 crc kubenswrapper[4843]: E0318 12:40:00.210177 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c6448b-970f-41e5-a4c1-a4b5ddd53ee4" containerName="registry-server" Mar 18 12:40:00 crc kubenswrapper[4843]: I0318 12:40:00.210183 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c6448b-970f-41e5-a4c1-a4b5ddd53ee4" containerName="registry-server" Mar 18 12:40:00 crc kubenswrapper[4843]: E0318 12:40:00.210196 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c6448b-970f-41e5-a4c1-a4b5ddd53ee4" containerName="extract-content" Mar 18 12:40:00 crc kubenswrapper[4843]: I0318 12:40:00.210201 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c6448b-970f-41e5-a4c1-a4b5ddd53ee4" containerName="extract-content" Mar 18 12:40:00 crc kubenswrapper[4843]: I0318 12:40:00.210371 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a0bd964-b1ca-4243-991e-2bce7b1b8c84" containerName="oc" Mar 18 12:40:00 crc kubenswrapper[4843]: I0318 12:40:00.210385 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c6448b-970f-41e5-a4c1-a4b5ddd53ee4" containerName="registry-server" Mar 18 12:40:00 crc kubenswrapper[4843]: I0318 
12:40:00.211079 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563960-nwkgt" Mar 18 12:40:00 crc kubenswrapper[4843]: I0318 12:40:00.214990 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 12:40:00 crc kubenswrapper[4843]: I0318 12:40:00.215973 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:40:00 crc kubenswrapper[4843]: I0318 12:40:00.220503 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:40:00 crc kubenswrapper[4843]: I0318 12:40:00.222134 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563960-nwkgt"] Mar 18 12:40:00 crc kubenswrapper[4843]: I0318 12:40:00.326494 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76mwh\" (UniqueName: \"kubernetes.io/projected/a42fe4bf-05f6-40d6-a447-93b30093c598-kube-api-access-76mwh\") pod \"auto-csr-approver-29563960-nwkgt\" (UID: \"a42fe4bf-05f6-40d6-a447-93b30093c598\") " pod="openshift-infra/auto-csr-approver-29563960-nwkgt" Mar 18 12:40:00 crc kubenswrapper[4843]: I0318 12:40:00.428780 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76mwh\" (UniqueName: \"kubernetes.io/projected/a42fe4bf-05f6-40d6-a447-93b30093c598-kube-api-access-76mwh\") pod \"auto-csr-approver-29563960-nwkgt\" (UID: \"a42fe4bf-05f6-40d6-a447-93b30093c598\") " pod="openshift-infra/auto-csr-approver-29563960-nwkgt" Mar 18 12:40:00 crc kubenswrapper[4843]: I0318 12:40:00.449512 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76mwh\" (UniqueName: \"kubernetes.io/projected/a42fe4bf-05f6-40d6-a447-93b30093c598-kube-api-access-76mwh\") pod 
\"auto-csr-approver-29563960-nwkgt\" (UID: \"a42fe4bf-05f6-40d6-a447-93b30093c598\") " pod="openshift-infra/auto-csr-approver-29563960-nwkgt" Mar 18 12:40:00 crc kubenswrapper[4843]: I0318 12:40:00.528511 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563960-nwkgt" Mar 18 12:40:00 crc kubenswrapper[4843]: I0318 12:40:00.998983 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563960-nwkgt"] Mar 18 12:40:01 crc kubenswrapper[4843]: I0318 12:40:01.686917 4843 scope.go:117] "RemoveContainer" containerID="aebb2469b89b8930f6cfa9e90cb71b157196270c2b73c90bec57a7aeb42fdaf8" Mar 18 12:40:01 crc kubenswrapper[4843]: I0318 12:40:01.829537 4843 scope.go:117] "RemoveContainer" containerID="6edfbde974f14038b9c73570c2850cb9073586496f8ea48614074132ddb0afd7" Mar 18 12:40:01 crc kubenswrapper[4843]: I0318 12:40:01.839324 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563960-nwkgt" event={"ID":"a42fe4bf-05f6-40d6-a447-93b30093c598","Type":"ContainerStarted","Data":"3774f0c3d7e63da568d6462d796de1f35270832e5a51725893addef0de52558e"} Mar 18 12:40:02 crc kubenswrapper[4843]: I0318 12:40:02.851843 4843 generic.go:334] "Generic (PLEG): container finished" podID="a42fe4bf-05f6-40d6-a447-93b30093c598" containerID="76548f0e424c05e0b37150d6e10f8475de4b9f42aca7a126784505369ac724fe" exitCode=0 Mar 18 12:40:02 crc kubenswrapper[4843]: I0318 12:40:02.851962 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563960-nwkgt" event={"ID":"a42fe4bf-05f6-40d6-a447-93b30093c598","Type":"ContainerDied","Data":"76548f0e424c05e0b37150d6e10f8475de4b9f42aca7a126784505369ac724fe"} Mar 18 12:40:04 crc kubenswrapper[4843]: I0318 12:40:04.309152 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563960-nwkgt" Mar 18 12:40:04 crc kubenswrapper[4843]: I0318 12:40:04.475779 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76mwh\" (UniqueName: \"kubernetes.io/projected/a42fe4bf-05f6-40d6-a447-93b30093c598-kube-api-access-76mwh\") pod \"a42fe4bf-05f6-40d6-a447-93b30093c598\" (UID: \"a42fe4bf-05f6-40d6-a447-93b30093c598\") " Mar 18 12:40:04 crc kubenswrapper[4843]: I0318 12:40:04.484626 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a42fe4bf-05f6-40d6-a447-93b30093c598-kube-api-access-76mwh" (OuterVolumeSpecName: "kube-api-access-76mwh") pod "a42fe4bf-05f6-40d6-a447-93b30093c598" (UID: "a42fe4bf-05f6-40d6-a447-93b30093c598"). InnerVolumeSpecName "kube-api-access-76mwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:40:04 crc kubenswrapper[4843]: I0318 12:40:04.579822 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76mwh\" (UniqueName: \"kubernetes.io/projected/a42fe4bf-05f6-40d6-a447-93b30093c598-kube-api-access-76mwh\") on node \"crc\" DevicePath \"\"" Mar 18 12:40:04 crc kubenswrapper[4843]: I0318 12:40:04.890423 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563960-nwkgt" event={"ID":"a42fe4bf-05f6-40d6-a447-93b30093c598","Type":"ContainerDied","Data":"3774f0c3d7e63da568d6462d796de1f35270832e5a51725893addef0de52558e"} Mar 18 12:40:04 crc kubenswrapper[4843]: I0318 12:40:04.890899 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3774f0c3d7e63da568d6462d796de1f35270832e5a51725893addef0de52558e" Mar 18 12:40:04 crc kubenswrapper[4843]: I0318 12:40:04.890533 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563960-nwkgt" Mar 18 12:40:05 crc kubenswrapper[4843]: I0318 12:40:05.420634 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563954-gnjzm"] Mar 18 12:40:05 crc kubenswrapper[4843]: I0318 12:40:05.434400 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563954-gnjzm"] Mar 18 12:40:06 crc kubenswrapper[4843]: I0318 12:40:06.998576 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df17c133-1ad9-4e20-884e-99a07055c280" path="/var/lib/kubelet/pods/df17c133-1ad9-4e20-884e-99a07055c280/volumes" Mar 18 12:40:11 crc kubenswrapper[4843]: I0318 12:40:11.091656 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 18 12:40:11 crc kubenswrapper[4843]: E0318 12:40:11.092370 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:40:25 crc kubenswrapper[4843]: I0318 12:40:25.983604 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 18 12:40:25 crc kubenswrapper[4843]: E0318 12:40:25.984309 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" 
podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:40:37 crc kubenswrapper[4843]: I0318 12:40:37.984085 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 18 12:40:37 crc kubenswrapper[4843]: E0318 12:40:37.985167 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:40:50 crc kubenswrapper[4843]: I0318 12:40:50.540722 4843 generic.go:334] "Generic (PLEG): container finished" podID="7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab" containerID="8c78b3a68e837c6c778fd5f5174e46279979bb29f40156aed1743c78b520f522" exitCode=0 Mar 18 12:40:50 crc kubenswrapper[4843]: I0318 12:40:50.540817 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7" event={"ID":"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab","Type":"ContainerDied","Data":"8c78b3a68e837c6c778fd5f5174e46279979bb29f40156aed1743c78b520f522"} Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.094900 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.121527 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-bootstrap-combined-ca-bundle\") pod \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\" (UID: \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\") " Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.121575 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49hkq\" (UniqueName: \"kubernetes.io/projected/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-kube-api-access-49hkq\") pod \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\" (UID: \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\") " Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.121727 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-inventory\") pod \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\" (UID: \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\") " Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.121773 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-ssh-key-openstack-edpm-ipam\") pod \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\" (UID: \"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab\") " Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.127917 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab" (UID: "7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.139948 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-kube-api-access-49hkq" (OuterVolumeSpecName: "kube-api-access-49hkq") pod "7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab" (UID: "7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab"). InnerVolumeSpecName "kube-api-access-49hkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.151858 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab" (UID: "7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.155006 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-inventory" (OuterVolumeSpecName: "inventory") pod "7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab" (UID: "7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.223974 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.224298 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.224313 4843 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.224325 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49hkq\" (UniqueName: \"kubernetes.io/projected/7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab-kube-api-access-49hkq\") on node \"crc\" DevicePath \"\"" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.565090 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7" event={"ID":"7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab","Type":"ContainerDied","Data":"f3ac132f85d71e5d078131abfd303d1b46321c01486df26dc6a1ddef1745aa63"} Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.565139 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.565151 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3ac132f85d71e5d078131abfd303d1b46321c01486df26dc6a1ddef1745aa63" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.723330 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm"] Mar 18 12:40:52 crc kubenswrapper[4843]: E0318 12:40:52.723915 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a42fe4bf-05f6-40d6-a447-93b30093c598" containerName="oc" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.723953 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42fe4bf-05f6-40d6-a447-93b30093c598" containerName="oc" Mar 18 12:40:52 crc kubenswrapper[4843]: E0318 12:40:52.723997 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.724007 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.724195 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="a42fe4bf-05f6-40d6-a447-93b30093c598" containerName="oc" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.724217 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.724916 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.728467 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.729734 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.730574 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm"] Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.737354 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.737412 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.741082 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qklf\" (UniqueName: \"kubernetes.io/projected/433cb022-0613-4bf3-81ed-8b4239e48629-kube-api-access-8qklf\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm\" (UID: \"433cb022-0613-4bf3-81ed-8b4239e48629\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.741164 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/433cb022-0613-4bf3-81ed-8b4239e48629-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm\" (UID: \"433cb022-0613-4bf3-81ed-8b4239e48629\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 
12:40:52.741330 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/433cb022-0613-4bf3-81ed-8b4239e48629-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm\" (UID: \"433cb022-0613-4bf3-81ed-8b4239e48629\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.842901 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qklf\" (UniqueName: \"kubernetes.io/projected/433cb022-0613-4bf3-81ed-8b4239e48629-kube-api-access-8qklf\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm\" (UID: \"433cb022-0613-4bf3-81ed-8b4239e48629\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.843028 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/433cb022-0613-4bf3-81ed-8b4239e48629-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm\" (UID: \"433cb022-0613-4bf3-81ed-8b4239e48629\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.843062 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/433cb022-0613-4bf3-81ed-8b4239e48629-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm\" (UID: \"433cb022-0613-4bf3-81ed-8b4239e48629\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.847898 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/433cb022-0613-4bf3-81ed-8b4239e48629-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm\" (UID: \"433cb022-0613-4bf3-81ed-8b4239e48629\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.847977 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/433cb022-0613-4bf3-81ed-8b4239e48629-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm\" (UID: \"433cb022-0613-4bf3-81ed-8b4239e48629\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.861303 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qklf\" (UniqueName: \"kubernetes.io/projected/433cb022-0613-4bf3-81ed-8b4239e48629-kube-api-access-8qklf\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm\" (UID: \"433cb022-0613-4bf3-81ed-8b4239e48629\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm" Mar 18 12:40:52 crc kubenswrapper[4843]: I0318 12:40:52.983985 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 18 12:40:52 crc kubenswrapper[4843]: E0318 12:40:52.984249 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:40:53 crc kubenswrapper[4843]: I0318 12:40:53.054452 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm" Mar 18 12:40:53 crc kubenswrapper[4843]: I0318 12:40:53.554520 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm"] Mar 18 12:40:53 crc kubenswrapper[4843]: I0318 12:40:53.556067 4843 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:40:53 crc kubenswrapper[4843]: I0318 12:40:53.574245 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm" event={"ID":"433cb022-0613-4bf3-81ed-8b4239e48629","Type":"ContainerStarted","Data":"d0a9648715883801db1a7f790d5b88a2061bd8f609b8b2a0808d11ef277f50d8"} Mar 18 12:40:54 crc kubenswrapper[4843]: I0318 12:40:54.592588 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm" event={"ID":"433cb022-0613-4bf3-81ed-8b4239e48629","Type":"ContainerStarted","Data":"62316c8684d11fd433f38d6072135c67dbe1a6f2f7ddc4b2c2224466e0da243a"} Mar 18 12:40:54 crc kubenswrapper[4843]: I0318 12:40:54.615543 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm" podStartSLOduration=2.45648984 podStartE2EDuration="2.615520273s" podCreationTimestamp="2026-03-18 12:40:52 +0000 UTC" firstStartedPulling="2026-03-18 12:40:53.555842195 +0000 UTC m=+1887.271667719" lastFinishedPulling="2026-03-18 12:40:53.714872628 +0000 UTC m=+1887.430698152" observedRunningTime="2026-03-18 12:40:54.609343018 +0000 UTC m=+1888.325168562" watchObservedRunningTime="2026-03-18 12:40:54.615520273 +0000 UTC m=+1888.331345807" Mar 18 12:41:01 crc kubenswrapper[4843]: I0318 12:41:01.909349 4843 scope.go:117] "RemoveContainer" containerID="0d33d648e5242a45903289afc6259d2db51ccddffcb0d0c95cac93496b0218a4" Mar 18 12:41:01 crc 
kubenswrapper[4843]: I0318 12:41:01.983276 4843 scope.go:117] "RemoveContainer" containerID="5379aa099ec89e4fd5152c0fbe5ad04b4ed7e88fd9118c862f55f1c0883007a3" Mar 18 12:41:02 crc kubenswrapper[4843]: I0318 12:41:02.007233 4843 scope.go:117] "RemoveContainer" containerID="cfc0c4a5c43405f1e18bf72fe7a35a6ff27187cc2e33cbfe0b6c691f35f0d0bc" Mar 18 12:41:06 crc kubenswrapper[4843]: I0318 12:41:06.994000 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 18 12:41:06 crc kubenswrapper[4843]: E0318 12:41:06.994907 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:41:19 crc kubenswrapper[4843]: I0318 12:41:19.983739 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 18 12:41:19 crc kubenswrapper[4843]: E0318 12:41:19.984379 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:41:27 crc kubenswrapper[4843]: I0318 12:41:27.053337 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6007-account-create-update-jc8kx"] Mar 18 12:41:27 crc kubenswrapper[4843]: I0318 12:41:27.063600 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-d766-account-create-update-p7lfg"] Mar 18 12:41:27 crc kubenswrapper[4843]: I0318 12:41:27.076702 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5ee3-account-create-update-5hns2"] Mar 18 12:41:27 crc kubenswrapper[4843]: I0318 12:41:27.087358 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-686w4"] Mar 18 12:41:27 crc kubenswrapper[4843]: I0318 12:41:27.096197 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6007-account-create-update-jc8kx"] Mar 18 12:41:27 crc kubenswrapper[4843]: I0318 12:41:27.105261 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-d766-account-create-update-p7lfg"] Mar 18 12:41:27 crc kubenswrapper[4843]: I0318 12:41:27.112740 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5ee3-account-create-update-5hns2"] Mar 18 12:41:27 crc kubenswrapper[4843]: I0318 12:41:27.120689 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-686w4"] Mar 18 12:41:27 crc kubenswrapper[4843]: I0318 12:41:27.128902 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-g6j4q"] Mar 18 12:41:27 crc kubenswrapper[4843]: I0318 12:41:27.137228 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-g6j4q"] Mar 18 12:41:28 crc kubenswrapper[4843]: I0318 12:41:28.045084 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-sqt7x"] Mar 18 12:41:28 crc kubenswrapper[4843]: I0318 12:41:28.064394 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-sqt7x"] Mar 18 12:41:28 crc kubenswrapper[4843]: I0318 12:41:28.996546 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d5becb7-c02d-46fd-8ced-be1b3bfcf16f" path="/var/lib/kubelet/pods/0d5becb7-c02d-46fd-8ced-be1b3bfcf16f/volumes" Mar 18 12:41:28 crc 
kubenswrapper[4843]: I0318 12:41:28.998367 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66e19b12-173c-40cb-8e07-494707530bc1" path="/var/lib/kubelet/pods/66e19b12-173c-40cb-8e07-494707530bc1/volumes" Mar 18 12:41:28 crc kubenswrapper[4843]: I0318 12:41:28.999490 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e792089-e775-4ff8-85db-e7cfacd8bba6" path="/var/lib/kubelet/pods/7e792089-e775-4ff8-85db-e7cfacd8bba6/volumes" Mar 18 12:41:29 crc kubenswrapper[4843]: I0318 12:41:29.000446 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b8e94f1-e634-4d66-8f8e-939ede76f529" path="/var/lib/kubelet/pods/9b8e94f1-e634-4d66-8f8e-939ede76f529/volumes" Mar 18 12:41:29 crc kubenswrapper[4843]: I0318 12:41:29.002701 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf32744d-f781-4703-b0e2-0ca8ca852092" path="/var/lib/kubelet/pods/bf32744d-f781-4703-b0e2-0ca8ca852092/volumes" Mar 18 12:41:29 crc kubenswrapper[4843]: I0318 12:41:29.003738 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d49e5ae1-6976-4a45-a007-279a231ec974" path="/var/lib/kubelet/pods/d49e5ae1-6976-4a45-a007-279a231ec974/volumes" Mar 18 12:41:31 crc kubenswrapper[4843]: I0318 12:41:31.983160 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 18 12:41:31 crc kubenswrapper[4843]: E0318 12:41:31.983962 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:41:42 crc kubenswrapper[4843]: I0318 12:41:42.984466 4843 scope.go:117] 
"RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 18 12:41:42 crc kubenswrapper[4843]: E0318 12:41:42.985387 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:41:48 crc kubenswrapper[4843]: I0318 12:41:48.064921 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-hgdnl"] Mar 18 12:41:48 crc kubenswrapper[4843]: I0318 12:41:48.076232 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-hgdnl"] Mar 18 12:41:49 crc kubenswrapper[4843]: I0318 12:41:49.008714 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b814bf-1f2a-4604-922c-74073fb910d6" path="/var/lib/kubelet/pods/20b814bf-1f2a-4604-922c-74073fb910d6/volumes" Mar 18 12:41:55 crc kubenswrapper[4843]: I0318 12:41:55.046410 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-72hc6"] Mar 18 12:41:55 crc kubenswrapper[4843]: I0318 12:41:55.058105 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-8d63-account-create-update-pkpgt"] Mar 18 12:41:55 crc kubenswrapper[4843]: I0318 12:41:55.068935 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-30c9-account-create-update-ctzdf"] Mar 18 12:41:55 crc kubenswrapper[4843]: I0318 12:41:55.079532 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-72hc6"] Mar 18 12:41:55 crc kubenswrapper[4843]: I0318 12:41:55.091116 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-30c9-account-create-update-ctzdf"] Mar 18 12:41:55 crc kubenswrapper[4843]: I0318 12:41:55.102754 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-8d63-account-create-update-pkpgt"] Mar 18 12:41:56 crc kubenswrapper[4843]: I0318 12:41:56.041113 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e9b5-account-create-update-sbgfm"] Mar 18 12:41:56 crc kubenswrapper[4843]: I0318 12:41:56.054739 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-jqd9q"] Mar 18 12:41:56 crc kubenswrapper[4843]: I0318 12:41:56.069297 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-85z5m"] Mar 18 12:41:56 crc kubenswrapper[4843]: I0318 12:41:56.082002 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-96rfj"] Mar 18 12:41:56 crc kubenswrapper[4843]: I0318 12:41:56.095063 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-e9b5-account-create-update-sbgfm"] Mar 18 12:41:56 crc kubenswrapper[4843]: I0318 12:41:56.110259 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-96rfj"] Mar 18 12:41:56 crc kubenswrapper[4843]: I0318 12:41:56.119374 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-jqd9q"] Mar 18 12:41:56 crc kubenswrapper[4843]: I0318 12:41:56.130632 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-85z5m"] Mar 18 12:41:56 crc kubenswrapper[4843]: I0318 12:41:56.996716 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 18 12:41:56 crc kubenswrapper[4843]: E0318 12:41:56.997812 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:41:57 crc kubenswrapper[4843]: I0318 12:41:57.003424 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03b52bc5-9622-4d51-9c92-8d75afad43ac" path="/var/lib/kubelet/pods/03b52bc5-9622-4d51-9c92-8d75afad43ac/volumes" Mar 18 12:41:57 crc kubenswrapper[4843]: I0318 12:41:57.004228 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a59eee1-b0e8-402b-b89e-d7e461672a21" path="/var/lib/kubelet/pods/2a59eee1-b0e8-402b-b89e-d7e461672a21/volumes" Mar 18 12:41:57 crc kubenswrapper[4843]: I0318 12:41:57.005144 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="353e0d74-8cd5-4bf0-bd10-38f049806e4f" path="/var/lib/kubelet/pods/353e0d74-8cd5-4bf0-bd10-38f049806e4f/volumes" Mar 18 12:41:57 crc kubenswrapper[4843]: I0318 12:41:57.005918 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44130b76-6642-46ec-9e38-38880255a091" path="/var/lib/kubelet/pods/44130b76-6642-46ec-9e38-38880255a091/volumes" Mar 18 12:41:57 crc kubenswrapper[4843]: I0318 12:41:57.007366 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d22873a-9171-47bc-9c90-29e3cd5f79f1" path="/var/lib/kubelet/pods/9d22873a-9171-47bc-9c90-29e3cd5f79f1/volumes" Mar 18 12:41:57 crc kubenswrapper[4843]: I0318 12:41:57.008218 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a1442f-5fd6-45ff-9002-5532bf8593d2" path="/var/lib/kubelet/pods/c2a1442f-5fd6-45ff-9002-5532bf8593d2/volumes" Mar 18 12:41:57 crc kubenswrapper[4843]: I0318 12:41:57.009213 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c55f04aa-b47f-40b3-99bc-a8a260d421db" path="/var/lib/kubelet/pods/c55f04aa-b47f-40b3-99bc-a8a260d421db/volumes" Mar 18 12:42:00 crc 
kubenswrapper[4843]: I0318 12:42:00.146268 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563962-k5ln6"] Mar 18 12:42:00 crc kubenswrapper[4843]: I0318 12:42:00.148263 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563962-k5ln6" Mar 18 12:42:00 crc kubenswrapper[4843]: I0318 12:42:00.151629 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:42:00 crc kubenswrapper[4843]: I0318 12:42:00.152146 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 12:42:00 crc kubenswrapper[4843]: I0318 12:42:00.154044 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:42:00 crc kubenswrapper[4843]: I0318 12:42:00.160143 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563962-k5ln6"] Mar 18 12:42:00 crc kubenswrapper[4843]: I0318 12:42:00.237961 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6fmp\" (UniqueName: \"kubernetes.io/projected/15b06ed2-150d-4ded-97d8-24201f827e09-kube-api-access-c6fmp\") pod \"auto-csr-approver-29563962-k5ln6\" (UID: \"15b06ed2-150d-4ded-97d8-24201f827e09\") " pod="openshift-infra/auto-csr-approver-29563962-k5ln6" Mar 18 12:42:00 crc kubenswrapper[4843]: I0318 12:42:00.339613 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6fmp\" (UniqueName: \"kubernetes.io/projected/15b06ed2-150d-4ded-97d8-24201f827e09-kube-api-access-c6fmp\") pod \"auto-csr-approver-29563962-k5ln6\" (UID: \"15b06ed2-150d-4ded-97d8-24201f827e09\") " pod="openshift-infra/auto-csr-approver-29563962-k5ln6" Mar 18 12:42:00 crc kubenswrapper[4843]: I0318 12:42:00.367319 4843 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-c6fmp\" (UniqueName: \"kubernetes.io/projected/15b06ed2-150d-4ded-97d8-24201f827e09-kube-api-access-c6fmp\") pod \"auto-csr-approver-29563962-k5ln6\" (UID: \"15b06ed2-150d-4ded-97d8-24201f827e09\") " pod="openshift-infra/auto-csr-approver-29563962-k5ln6" Mar 18 12:42:00 crc kubenswrapper[4843]: I0318 12:42:00.474017 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563962-k5ln6" Mar 18 12:42:00 crc kubenswrapper[4843]: I0318 12:42:00.951207 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563962-k5ln6"] Mar 18 12:42:01 crc kubenswrapper[4843]: I0318 12:42:01.049555 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-xswdv"] Mar 18 12:42:01 crc kubenswrapper[4843]: I0318 12:42:01.124664 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-xswdv"] Mar 18 12:42:01 crc kubenswrapper[4843]: I0318 12:42:01.793925 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563962-k5ln6" event={"ID":"15b06ed2-150d-4ded-97d8-24201f827e09","Type":"ContainerStarted","Data":"cd89d516fb947d921621cb5cf73f9675284aa9c7d5dd9328c95abf67c96330e4"} Mar 18 12:42:02 crc kubenswrapper[4843]: I0318 12:42:02.081819 4843 scope.go:117] "RemoveContainer" containerID="1de7ffacd682788a02c237fcc2290d042aef019d97619fa06423125a0acc9d3b" Mar 18 12:42:02 crc kubenswrapper[4843]: I0318 12:42:02.394569 4843 scope.go:117] "RemoveContainer" containerID="3c05b3fd28b9d04cd849035d7b9c2053855f68c57ae1f5e1a0d727f95677e162" Mar 18 12:42:02 crc kubenswrapper[4843]: I0318 12:42:02.424837 4843 scope.go:117] "RemoveContainer" containerID="d870cba98f1e27e5b426eb39e65593aa769e1143aef02c8d8739ea6a6ad4505c" Mar 18 12:42:02 crc kubenswrapper[4843]: I0318 12:42:02.484494 4843 scope.go:117] "RemoveContainer" 
containerID="59d4aac0b4a0325d1dcedff9dcc41a05e8d9449a95458b2d7496241c6f532139" Mar 18 12:42:02 crc kubenswrapper[4843]: I0318 12:42:02.523370 4843 scope.go:117] "RemoveContainer" containerID="1e6591e4d061c1a956713a72c32f83b6f4fa18d77f327e88dc1d5d74205818d8" Mar 18 12:42:02 crc kubenswrapper[4843]: I0318 12:42:02.554544 4843 scope.go:117] "RemoveContainer" containerID="1eac199cd750bd34a1bcefeec8d11b16af7ce3067ec1b310539de8674c1deafd" Mar 18 12:42:02 crc kubenswrapper[4843]: I0318 12:42:02.593974 4843 scope.go:117] "RemoveContainer" containerID="fee1484bbff24cc68525b9dbb843b766538eeff0e6e48955fa69c7bda4cdec85" Mar 18 12:42:02 crc kubenswrapper[4843]: I0318 12:42:02.621077 4843 scope.go:117] "RemoveContainer" containerID="9dd859577e64e8d95fb5e62e170d352f8731175bb06d53be92e529ed3818d6f1" Mar 18 12:42:02 crc kubenswrapper[4843]: I0318 12:42:02.642558 4843 scope.go:117] "RemoveContainer" containerID="dea13a29256cecf7eddac81e6d6a3347abb34dbb0d61995b271062f9eaa38368" Mar 18 12:42:02 crc kubenswrapper[4843]: I0318 12:42:02.673079 4843 scope.go:117] "RemoveContainer" containerID="84f1e80a55595ce219003cd95ea8901fed26fe1122aad71ab147263581baf074" Mar 18 12:42:02 crc kubenswrapper[4843]: I0318 12:42:02.695230 4843 scope.go:117] "RemoveContainer" containerID="0e4adeba34076e823f49c813041006a8d6c5c83bdd6289887901992fb2e5bac0" Mar 18 12:42:02 crc kubenswrapper[4843]: I0318 12:42:02.718479 4843 scope.go:117] "RemoveContainer" containerID="1336150f6db4c3268a81a5115bbdc5f4e84985ef4ad52f512d9aaebf978ff5fc" Mar 18 12:42:02 crc kubenswrapper[4843]: I0318 12:42:02.742464 4843 scope.go:117] "RemoveContainer" containerID="0974e2746a352a04f1a2bd01ae9f13a0f5079e16597fa6ef9ad6915378cee0ab" Mar 18 12:42:02 crc kubenswrapper[4843]: I0318 12:42:02.760712 4843 scope.go:117] "RemoveContainer" containerID="8e48cf8306e8d4a6b65cd6884b53685eeb027607eaf50335aa7dea804c8641b7" Mar 18 12:42:02 crc kubenswrapper[4843]: I0318 12:42:02.780061 4843 scope.go:117] "RemoveContainer" 
containerID="7c4900a57462aebe1c9cb7912592a611ce894a40f4fd21b651d028ff7450f3a6" Mar 18 12:42:02 crc kubenswrapper[4843]: I0318 12:42:02.834264 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563962-k5ln6" event={"ID":"15b06ed2-150d-4ded-97d8-24201f827e09","Type":"ContainerStarted","Data":"85a062706b3b829cd2163da569e59875da2b30444fccd8da01899ee16000ef28"} Mar 18 12:42:02 crc kubenswrapper[4843]: I0318 12:42:02.859624 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563962-k5ln6" podStartSLOduration=1.845227943 podStartE2EDuration="2.859580846s" podCreationTimestamp="2026-03-18 12:42:00 +0000 UTC" firstStartedPulling="2026-03-18 12:42:00.962446244 +0000 UTC m=+1954.678271768" lastFinishedPulling="2026-03-18 12:42:01.976799127 +0000 UTC m=+1955.692624671" observedRunningTime="2026-03-18 12:42:02.855125029 +0000 UTC m=+1956.570950553" watchObservedRunningTime="2026-03-18 12:42:02.859580846 +0000 UTC m=+1956.575406370" Mar 18 12:42:03 crc kubenswrapper[4843]: I0318 12:42:03.000631 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24bc5f90-1452-42d2-90c3-72bb22f30972" path="/var/lib/kubelet/pods/24bc5f90-1452-42d2-90c3-72bb22f30972/volumes" Mar 18 12:42:03 crc kubenswrapper[4843]: I0318 12:42:03.850289 4843 generic.go:334] "Generic (PLEG): container finished" podID="15b06ed2-150d-4ded-97d8-24201f827e09" containerID="85a062706b3b829cd2163da569e59875da2b30444fccd8da01899ee16000ef28" exitCode=0 Mar 18 12:42:03 crc kubenswrapper[4843]: I0318 12:42:03.850397 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563962-k5ln6" event={"ID":"15b06ed2-150d-4ded-97d8-24201f827e09","Type":"ContainerDied","Data":"85a062706b3b829cd2163da569e59875da2b30444fccd8da01899ee16000ef28"} Mar 18 12:42:05 crc kubenswrapper[4843]: I0318 12:42:05.570017 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563962-k5ln6" Mar 18 12:42:06 crc kubenswrapper[4843]: I0318 12:42:06.061359 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563962-k5ln6" event={"ID":"15b06ed2-150d-4ded-97d8-24201f827e09","Type":"ContainerDied","Data":"cd89d516fb947d921621cb5cf73f9675284aa9c7d5dd9328c95abf67c96330e4"} Mar 18 12:42:06 crc kubenswrapper[4843]: I0318 12:42:06.061616 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd89d516fb947d921621cb5cf73f9675284aa9c7d5dd9328c95abf67c96330e4" Mar 18 12:42:06 crc kubenswrapper[4843]: I0318 12:42:06.061510 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563962-k5ln6" Mar 18 12:42:06 crc kubenswrapper[4843]: I0318 12:42:06.106558 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563956-sk8bs"] Mar 18 12:42:06 crc kubenswrapper[4843]: I0318 12:42:06.114402 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563956-sk8bs"] Mar 18 12:42:06 crc kubenswrapper[4843]: I0318 12:42:06.141301 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6fmp\" (UniqueName: \"kubernetes.io/projected/15b06ed2-150d-4ded-97d8-24201f827e09-kube-api-access-c6fmp\") pod \"15b06ed2-150d-4ded-97d8-24201f827e09\" (UID: \"15b06ed2-150d-4ded-97d8-24201f827e09\") " Mar 18 12:42:06 crc kubenswrapper[4843]: I0318 12:42:06.150176 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15b06ed2-150d-4ded-97d8-24201f827e09-kube-api-access-c6fmp" (OuterVolumeSpecName: "kube-api-access-c6fmp") pod "15b06ed2-150d-4ded-97d8-24201f827e09" (UID: "15b06ed2-150d-4ded-97d8-24201f827e09"). InnerVolumeSpecName "kube-api-access-c6fmp". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:42:06 crc kubenswrapper[4843]: I0318 12:42:06.244341 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6fmp\" (UniqueName: \"kubernetes.io/projected/15b06ed2-150d-4ded-97d8-24201f827e09-kube-api-access-c6fmp\") on node \"crc\" DevicePath \"\""
Mar 18 12:42:07 crc kubenswrapper[4843]: I0318 12:42:07.003547 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="388f1788-0b98-40b9-8bbb-5c617ca97a3e" path="/var/lib/kubelet/pods/388f1788-0b98-40b9-8bbb-5c617ca97a3e/volumes"
Mar 18 12:42:07 crc kubenswrapper[4843]: I0318 12:42:07.984945 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24"
Mar 18 12:42:07 crc kubenswrapper[4843]: E0318 12:42:07.985630 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 12:42:19 crc kubenswrapper[4843]: I0318 12:42:19.395234 4843 generic.go:334] "Generic (PLEG): container finished" podID="433cb022-0613-4bf3-81ed-8b4239e48629" containerID="62316c8684d11fd433f38d6072135c67dbe1a6f2f7ddc4b2c2224466e0da243a" exitCode=0
Mar 18 12:42:19 crc kubenswrapper[4843]: I0318 12:42:19.395344 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm" event={"ID":"433cb022-0613-4bf3-81ed-8b4239e48629","Type":"ContainerDied","Data":"62316c8684d11fd433f38d6072135c67dbe1a6f2f7ddc4b2c2224466e0da243a"}
Mar 18 12:42:20 crc kubenswrapper[4843]: I0318 12:42:20.864819 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm"
Mar 18 12:42:20 crc kubenswrapper[4843]: I0318 12:42:20.920299 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qklf\" (UniqueName: \"kubernetes.io/projected/433cb022-0613-4bf3-81ed-8b4239e48629-kube-api-access-8qklf\") pod \"433cb022-0613-4bf3-81ed-8b4239e48629\" (UID: \"433cb022-0613-4bf3-81ed-8b4239e48629\") "
Mar 18 12:42:20 crc kubenswrapper[4843]: I0318 12:42:20.920395 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/433cb022-0613-4bf3-81ed-8b4239e48629-inventory\") pod \"433cb022-0613-4bf3-81ed-8b4239e48629\" (UID: \"433cb022-0613-4bf3-81ed-8b4239e48629\") "
Mar 18 12:42:20 crc kubenswrapper[4843]: I0318 12:42:20.920501 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/433cb022-0613-4bf3-81ed-8b4239e48629-ssh-key-openstack-edpm-ipam\") pod \"433cb022-0613-4bf3-81ed-8b4239e48629\" (UID: \"433cb022-0613-4bf3-81ed-8b4239e48629\") "
Mar 18 12:42:20 crc kubenswrapper[4843]: I0318 12:42:20.929931 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/433cb022-0613-4bf3-81ed-8b4239e48629-kube-api-access-8qklf" (OuterVolumeSpecName: "kube-api-access-8qklf") pod "433cb022-0613-4bf3-81ed-8b4239e48629" (UID: "433cb022-0613-4bf3-81ed-8b4239e48629"). InnerVolumeSpecName "kube-api-access-8qklf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:42:20 crc kubenswrapper[4843]: I0318 12:42:20.946824 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/433cb022-0613-4bf3-81ed-8b4239e48629-inventory" (OuterVolumeSpecName: "inventory") pod "433cb022-0613-4bf3-81ed-8b4239e48629" (UID: "433cb022-0613-4bf3-81ed-8b4239e48629"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:42:20 crc kubenswrapper[4843]: I0318 12:42:20.950452 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/433cb022-0613-4bf3-81ed-8b4239e48629-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "433cb022-0613-4bf3-81ed-8b4239e48629" (UID: "433cb022-0613-4bf3-81ed-8b4239e48629"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.021558 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qklf\" (UniqueName: \"kubernetes.io/projected/433cb022-0613-4bf3-81ed-8b4239e48629-kube-api-access-8qklf\") on node \"crc\" DevicePath \"\""
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.021593 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/433cb022-0613-4bf3-81ed-8b4239e48629-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.021604 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/433cb022-0613-4bf3-81ed-8b4239e48629-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.420854 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm" event={"ID":"433cb022-0613-4bf3-81ed-8b4239e48629","Type":"ContainerDied","Data":"d0a9648715883801db1a7f790d5b88a2061bd8f609b8b2a0808d11ef277f50d8"}
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.420920 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0a9648715883801db1a7f790d5b88a2061bd8f609b8b2a0808d11ef277f50d8"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.420968 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.514573 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx"]
Mar 18 12:42:21 crc kubenswrapper[4843]: E0318 12:42:21.515373 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15b06ed2-150d-4ded-97d8-24201f827e09" containerName="oc"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.515416 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="15b06ed2-150d-4ded-97d8-24201f827e09" containerName="oc"
Mar 18 12:42:21 crc kubenswrapper[4843]: E0318 12:42:21.515474 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="433cb022-0613-4bf3-81ed-8b4239e48629" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.515484 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="433cb022-0613-4bf3-81ed-8b4239e48629" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.515716 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="433cb022-0613-4bf3-81ed-8b4239e48629" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.515764 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="15b06ed2-150d-4ded-97d8-24201f827e09" containerName="oc"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.517032 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.519841 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.520112 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.520260 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.520473 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.534108 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf8d8fe5-8550-45de-9087-a47fb8695b53-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx\" (UID: \"bf8d8fe5-8550-45de-9087-a47fb8695b53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.534214 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf8d8fe5-8550-45de-9087-a47fb8695b53-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx\" (UID: \"bf8d8fe5-8550-45de-9087-a47fb8695b53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.534264 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnnfc\" (UniqueName: \"kubernetes.io/projected/bf8d8fe5-8550-45de-9087-a47fb8695b53-kube-api-access-dnnfc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx\" (UID: \"bf8d8fe5-8550-45de-9087-a47fb8695b53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.535692 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx"]
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.636690 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf8d8fe5-8550-45de-9087-a47fb8695b53-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx\" (UID: \"bf8d8fe5-8550-45de-9087-a47fb8695b53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.637090 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnnfc\" (UniqueName: \"kubernetes.io/projected/bf8d8fe5-8550-45de-9087-a47fb8695b53-kube-api-access-dnnfc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx\" (UID: \"bf8d8fe5-8550-45de-9087-a47fb8695b53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.637207 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf8d8fe5-8550-45de-9087-a47fb8695b53-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx\" (UID: \"bf8d8fe5-8550-45de-9087-a47fb8695b53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.641095 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf8d8fe5-8550-45de-9087-a47fb8695b53-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx\" (UID: \"bf8d8fe5-8550-45de-9087-a47fb8695b53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.641560 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf8d8fe5-8550-45de-9087-a47fb8695b53-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx\" (UID: \"bf8d8fe5-8550-45de-9087-a47fb8695b53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx"
Mar 18 12:42:21 crc kubenswrapper[4843]: I0318 12:42:21.657402 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnnfc\" (UniqueName: \"kubernetes.io/projected/bf8d8fe5-8550-45de-9087-a47fb8695b53-kube-api-access-dnnfc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx\" (UID: \"bf8d8fe5-8550-45de-9087-a47fb8695b53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx"
Mar 18 12:42:22 crc kubenswrapper[4843]: I0318 12:42:22.385014 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24"
Mar 18 12:42:22 crc kubenswrapper[4843]: I0318 12:42:22.385196 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx"
Mar 18 12:42:22 crc kubenswrapper[4843]: I0318 12:42:22.966936 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx"]
Mar 18 12:42:22 crc kubenswrapper[4843]: W0318 12:42:22.975372 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf8d8fe5_8550_45de_9087_a47fb8695b53.slice/crio-cf94ba38e74654337744b5cfb61864297f6ed3e45ee3232261aa8b60e661118f WatchSource:0}: Error finding container cf94ba38e74654337744b5cfb61864297f6ed3e45ee3232261aa8b60e661118f: Status 404 returned error can't find the container with id cf94ba38e74654337744b5cfb61864297f6ed3e45ee3232261aa8b60e661118f
Mar 18 12:42:23 crc kubenswrapper[4843]: I0318 12:42:23.462382 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"adf4e676a33d49fbbd7d287fa0d1930ae6c4bf2cf91763487c46260a988c52c0"}
Mar 18 12:42:23 crc kubenswrapper[4843]: I0318 12:42:23.466079 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx" event={"ID":"bf8d8fe5-8550-45de-9087-a47fb8695b53","Type":"ContainerStarted","Data":"cf94ba38e74654337744b5cfb61864297f6ed3e45ee3232261aa8b60e661118f"}
Mar 18 12:42:23 crc kubenswrapper[4843]: I0318 12:42:23.514236 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx" podStartSLOduration=2.301853386 podStartE2EDuration="2.514208901s" podCreationTimestamp="2026-03-18 12:42:21 +0000 UTC" firstStartedPulling="2026-03-18 12:42:22.981076323 +0000 UTC m=+1976.696901847" lastFinishedPulling="2026-03-18 12:42:23.193431828 +0000 UTC m=+1976.909257362" observedRunningTime="2026-03-18 12:42:23.511852554 +0000 UTC m=+1977.227678088" watchObservedRunningTime="2026-03-18 12:42:23.514208901 +0000 UTC m=+1977.230034435"
Mar 18 12:42:24 crc kubenswrapper[4843]: I0318 12:42:24.475944 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx" event={"ID":"bf8d8fe5-8550-45de-9087-a47fb8695b53","Type":"ContainerStarted","Data":"0fa621963788678762dd4923a19f0f93248708f438e9d925746787977747cbdf"}
Mar 18 12:42:31 crc kubenswrapper[4843]: I0318 12:42:31.065721 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-p7qr8"]
Mar 18 12:42:31 crc kubenswrapper[4843]: I0318 12:42:31.078681 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-p7qr8"]
Mar 18 12:42:32 crc kubenswrapper[4843]: I0318 12:42:32.996322 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3cca079-c7f7-4ec6-a16b-30ba392d0e57" path="/var/lib/kubelet/pods/d3cca079-c7f7-4ec6-a16b-30ba392d0e57/volumes"
Mar 18 12:42:38 crc kubenswrapper[4843]: I0318 12:42:38.096340 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-pz9j4"]
Mar 18 12:42:38 crc kubenswrapper[4843]: I0318 12:42:38.107935 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-pz9j4"]
Mar 18 12:42:39 crc kubenswrapper[4843]: I0318 12:42:39.000305 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a1c39df-268b-4d85-a616-32c282f9a19b" path="/var/lib/kubelet/pods/2a1c39df-268b-4d85-a616-32c282f9a19b/volumes"
Mar 18 12:42:39 crc kubenswrapper[4843]: I0318 12:42:39.034258 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-sq4hg"]
Mar 18 12:42:39 crc kubenswrapper[4843]: I0318 12:42:39.047852 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-sq4hg"]
Mar 18 12:42:40 crc kubenswrapper[4843]: I0318 12:42:40.994700 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b82175f-cf5a-4d25-81c2-2c70df039edd" path="/var/lib/kubelet/pods/9b82175f-cf5a-4d25-81c2-2c70df039edd/volumes"
Mar 18 12:42:42 crc kubenswrapper[4843]: I0318 12:42:42.062080 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-9g2xs"]
Mar 18 12:42:42 crc kubenswrapper[4843]: I0318 12:42:42.076312 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-9g2xs"]
Mar 18 12:42:42 crc kubenswrapper[4843]: I0318 12:42:42.997817 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7dbbfca-f6b9-4421-8662-64ce08dade2d" path="/var/lib/kubelet/pods/d7dbbfca-f6b9-4421-8662-64ce08dade2d/volumes"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.253237 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8885c"]
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.256840 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8885c"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.281474 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8885c"]
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.372874 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/760fad51-7f11-4a46-8144-7db25721528a-utilities\") pod \"redhat-marketplace-8885c\" (UID: \"760fad51-7f11-4a46-8144-7db25721528a\") " pod="openshift-marketplace/redhat-marketplace-8885c"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.373084 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9fcf\" (UniqueName: \"kubernetes.io/projected/760fad51-7f11-4a46-8144-7db25721528a-kube-api-access-l9fcf\") pod \"redhat-marketplace-8885c\" (UID: \"760fad51-7f11-4a46-8144-7db25721528a\") " pod="openshift-marketplace/redhat-marketplace-8885c"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.373470 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/760fad51-7f11-4a46-8144-7db25721528a-catalog-content\") pod \"redhat-marketplace-8885c\" (UID: \"760fad51-7f11-4a46-8144-7db25721528a\") " pod="openshift-marketplace/redhat-marketplace-8885c"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.449016 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fdt4m"]
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.451384 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fdt4m"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.464576 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fdt4m"]
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.474809 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/760fad51-7f11-4a46-8144-7db25721528a-catalog-content\") pod \"redhat-marketplace-8885c\" (UID: \"760fad51-7f11-4a46-8144-7db25721528a\") " pod="openshift-marketplace/redhat-marketplace-8885c"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.474919 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/760fad51-7f11-4a46-8144-7db25721528a-utilities\") pod \"redhat-marketplace-8885c\" (UID: \"760fad51-7f11-4a46-8144-7db25721528a\") " pod="openshift-marketplace/redhat-marketplace-8885c"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.474965 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9fcf\" (UniqueName: \"kubernetes.io/projected/760fad51-7f11-4a46-8144-7db25721528a-kube-api-access-l9fcf\") pod \"redhat-marketplace-8885c\" (UID: \"760fad51-7f11-4a46-8144-7db25721528a\") " pod="openshift-marketplace/redhat-marketplace-8885c"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.475005 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwv2d\" (UniqueName: \"kubernetes.io/projected/6c46e218-e514-4f69-ac0a-ab2033dc202c-kube-api-access-nwv2d\") pod \"redhat-operators-fdt4m\" (UID: \"6c46e218-e514-4f69-ac0a-ab2033dc202c\") " pod="openshift-marketplace/redhat-operators-fdt4m"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.475045 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c46e218-e514-4f69-ac0a-ab2033dc202c-catalog-content\") pod \"redhat-operators-fdt4m\" (UID: \"6c46e218-e514-4f69-ac0a-ab2033dc202c\") " pod="openshift-marketplace/redhat-operators-fdt4m"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.475072 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c46e218-e514-4f69-ac0a-ab2033dc202c-utilities\") pod \"redhat-operators-fdt4m\" (UID: \"6c46e218-e514-4f69-ac0a-ab2033dc202c\") " pod="openshift-marketplace/redhat-operators-fdt4m"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.475473 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/760fad51-7f11-4a46-8144-7db25721528a-utilities\") pod \"redhat-marketplace-8885c\" (UID: \"760fad51-7f11-4a46-8144-7db25721528a\") " pod="openshift-marketplace/redhat-marketplace-8885c"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.475483 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/760fad51-7f11-4a46-8144-7db25721528a-catalog-content\") pod \"redhat-marketplace-8885c\" (UID: \"760fad51-7f11-4a46-8144-7db25721528a\") " pod="openshift-marketplace/redhat-marketplace-8885c"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.506525 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9fcf\" (UniqueName: \"kubernetes.io/projected/760fad51-7f11-4a46-8144-7db25721528a-kube-api-access-l9fcf\") pod \"redhat-marketplace-8885c\" (UID: \"760fad51-7f11-4a46-8144-7db25721528a\") " pod="openshift-marketplace/redhat-marketplace-8885c"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.577142 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwv2d\" (UniqueName: \"kubernetes.io/projected/6c46e218-e514-4f69-ac0a-ab2033dc202c-kube-api-access-nwv2d\") pod \"redhat-operators-fdt4m\" (UID: \"6c46e218-e514-4f69-ac0a-ab2033dc202c\") " pod="openshift-marketplace/redhat-operators-fdt4m"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.577219 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c46e218-e514-4f69-ac0a-ab2033dc202c-catalog-content\") pod \"redhat-operators-fdt4m\" (UID: \"6c46e218-e514-4f69-ac0a-ab2033dc202c\") " pod="openshift-marketplace/redhat-operators-fdt4m"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.577267 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c46e218-e514-4f69-ac0a-ab2033dc202c-utilities\") pod \"redhat-operators-fdt4m\" (UID: \"6c46e218-e514-4f69-ac0a-ab2033dc202c\") " pod="openshift-marketplace/redhat-operators-fdt4m"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.578284 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c46e218-e514-4f69-ac0a-ab2033dc202c-utilities\") pod \"redhat-operators-fdt4m\" (UID: \"6c46e218-e514-4f69-ac0a-ab2033dc202c\") " pod="openshift-marketplace/redhat-operators-fdt4m"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.578461 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c46e218-e514-4f69-ac0a-ab2033dc202c-catalog-content\") pod \"redhat-operators-fdt4m\" (UID: \"6c46e218-e514-4f69-ac0a-ab2033dc202c\") " pod="openshift-marketplace/redhat-operators-fdt4m"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.596272 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwv2d\" (UniqueName: \"kubernetes.io/projected/6c46e218-e514-4f69-ac0a-ab2033dc202c-kube-api-access-nwv2d\") pod \"redhat-operators-fdt4m\" (UID: \"6c46e218-e514-4f69-ac0a-ab2033dc202c\") " pod="openshift-marketplace/redhat-operators-fdt4m"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.625370 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8885c"
Mar 18 12:42:55 crc kubenswrapper[4843]: I0318 12:42:55.785312 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fdt4m"
Mar 18 12:42:56 crc kubenswrapper[4843]: I0318 12:42:56.269437 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8885c"]
Mar 18 12:42:56 crc kubenswrapper[4843]: I0318 12:42:56.356282 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fdt4m"]
Mar 18 12:42:56 crc kubenswrapper[4843]: W0318 12:42:56.357461 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c46e218_e514_4f69_ac0a_ab2033dc202c.slice/crio-23604f28d48fe02cf936724618b74071e7b0f8e8d1da86995c66c48e8f6824e6 WatchSource:0}: Error finding container 23604f28d48fe02cf936724618b74071e7b0f8e8d1da86995c66c48e8f6824e6: Status 404 returned error can't find the container with id 23604f28d48fe02cf936724618b74071e7b0f8e8d1da86995c66c48e8f6824e6
Mar 18 12:42:56 crc kubenswrapper[4843]: I0318 12:42:56.809022 4843 generic.go:334] "Generic (PLEG): container finished" podID="6c46e218-e514-4f69-ac0a-ab2033dc202c" containerID="ef3dc66e2a7871c058603436ef2de2c88f5157e9d1d2d0cc7811acbcf77f222d" exitCode=0
Mar 18 12:42:56 crc kubenswrapper[4843]: I0318 12:42:56.809094 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdt4m" event={"ID":"6c46e218-e514-4f69-ac0a-ab2033dc202c","Type":"ContainerDied","Data":"ef3dc66e2a7871c058603436ef2de2c88f5157e9d1d2d0cc7811acbcf77f222d"}
Mar 18 12:42:56 crc kubenswrapper[4843]: I0318 12:42:56.809132 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdt4m" event={"ID":"6c46e218-e514-4f69-ac0a-ab2033dc202c","Type":"ContainerStarted","Data":"23604f28d48fe02cf936724618b74071e7b0f8e8d1da86995c66c48e8f6824e6"}
Mar 18 12:42:56 crc kubenswrapper[4843]: I0318 12:42:56.814604 4843 generic.go:334] "Generic (PLEG): container finished" podID="760fad51-7f11-4a46-8144-7db25721528a" containerID="d19a2b96996881de382fd5d1abc5ebfe11b74a2ff1479aa8fa4ff2e41cc6c02b" exitCode=0
Mar 18 12:42:56 crc kubenswrapper[4843]: I0318 12:42:56.814669 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8885c" event={"ID":"760fad51-7f11-4a46-8144-7db25721528a","Type":"ContainerDied","Data":"d19a2b96996881de382fd5d1abc5ebfe11b74a2ff1479aa8fa4ff2e41cc6c02b"}
Mar 18 12:42:56 crc kubenswrapper[4843]: I0318 12:42:56.814700 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8885c" event={"ID":"760fad51-7f11-4a46-8144-7db25721528a","Type":"ContainerStarted","Data":"ac09cdb23dfef89144d1e132fdd0ed754fba42afbf7a3b92b76e7637051a95df"}
Mar 18 12:42:58 crc kubenswrapper[4843]: I0318 12:42:58.841168 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdt4m" event={"ID":"6c46e218-e514-4f69-ac0a-ab2033dc202c","Type":"ContainerStarted","Data":"de096492d7e2d7c77b57f5a6371d90b29ef8c44777e5e66d5ec55769a9fe72e6"}
Mar 18 12:42:58 crc kubenswrapper[4843]: I0318 12:42:58.845471 4843 generic.go:334] "Generic (PLEG): container finished" podID="760fad51-7f11-4a46-8144-7db25721528a" containerID="58291f52595b71c20759326148f74638ef6d27efae2307e284bf0114083396e8" exitCode=0
Mar 18 12:42:58 crc kubenswrapper[4843]: I0318 12:42:58.845525 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8885c" event={"ID":"760fad51-7f11-4a46-8144-7db25721528a","Type":"ContainerDied","Data":"58291f52595b71c20759326148f74638ef6d27efae2307e284bf0114083396e8"}
Mar 18 12:42:59 crc kubenswrapper[4843]: I0318 12:42:59.858930 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8885c" event={"ID":"760fad51-7f11-4a46-8144-7db25721528a","Type":"ContainerStarted","Data":"e4430bc02d91bb32dbd1747988473d6c84bff6af658788cb75388abe74f763d7"}
Mar 18 12:42:59 crc kubenswrapper[4843]: I0318 12:42:59.893810 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8885c" podStartSLOduration=2.485114922 podStartE2EDuration="4.893784461s" podCreationTimestamp="2026-03-18 12:42:55 +0000 UTC" firstStartedPulling="2026-03-18 12:42:56.816570698 +0000 UTC m=+2010.532396222" lastFinishedPulling="2026-03-18 12:42:59.225240237 +0000 UTC m=+2012.941065761" observedRunningTime="2026-03-18 12:42:59.886502255 +0000 UTC m=+2013.602327819" watchObservedRunningTime="2026-03-18 12:42:59.893784461 +0000 UTC m=+2013.609609985"
Mar 18 12:43:02 crc kubenswrapper[4843]: I0318 12:43:02.892666 4843 generic.go:334] "Generic (PLEG): container finished" podID="6c46e218-e514-4f69-ac0a-ab2033dc202c" containerID="de096492d7e2d7c77b57f5a6371d90b29ef8c44777e5e66d5ec55769a9fe72e6" exitCode=0
Mar 18 12:43:02 crc kubenswrapper[4843]: I0318 12:43:02.892771 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdt4m" event={"ID":"6c46e218-e514-4f69-ac0a-ab2033dc202c","Type":"ContainerDied","Data":"de096492d7e2d7c77b57f5a6371d90b29ef8c44777e5e66d5ec55769a9fe72e6"}
Mar 18 12:43:03 crc kubenswrapper[4843]: I0318 12:43:03.104163 4843 scope.go:117] "RemoveContainer" containerID="fa1fc78dced51574579618c617598c724dfdbf2b8c155f325367a1b88c042ca9"
Mar 18 12:43:03 crc kubenswrapper[4843]: I0318 12:43:03.213104 4843 scope.go:117] "RemoveContainer" containerID="13bd3fef3bec19cff1dcfd6380c20c71e6ea6cb88b1ca2611be4adc3e5038eb6"
Mar 18 12:43:03 crc kubenswrapper[4843]: I0318 12:43:03.309839 4843 scope.go:117] "RemoveContainer" containerID="b7b445d9096539685fc2460f496f3352460c743a668f8c59b1e932c144bde140"
Mar 18 12:43:03 crc kubenswrapper[4843]: I0318 12:43:03.360712 4843 scope.go:117] "RemoveContainer" containerID="82935de4b27ce7ee9cad5df7a2045d570aeb4185bfe355cadfae7b2fc128ace5"
Mar 18 12:43:03 crc kubenswrapper[4843]: I0318 12:43:03.417213 4843 scope.go:117] "RemoveContainer" containerID="b2ff41297aa76480541413c952e384c624a076e36b9c789c975b0e5552cee218"
Mar 18 12:43:03 crc kubenswrapper[4843]: I0318 12:43:03.508866 4843 scope.go:117] "RemoveContainer" containerID="6f282cc8f8ddfdc08e0adeca5d7a07024ca87f9c22fcd319b45a763036b3bf59"
Mar 18 12:43:03 crc kubenswrapper[4843]: I0318 12:43:03.909009 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdt4m" event={"ID":"6c46e218-e514-4f69-ac0a-ab2033dc202c","Type":"ContainerStarted","Data":"6d9a0b3e6fe3a810970eb9938cfdeee105ee936352a9a1d77c88225ca70ed6ee"}
Mar 18 12:43:05 crc kubenswrapper[4843]: I0318 12:43:05.626018 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8885c"
Mar 18 12:43:05 crc kubenswrapper[4843]: I0318 12:43:05.627120 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8885c"
Mar 18 12:43:05 crc kubenswrapper[4843]: I0318 12:43:05.686871 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8885c"
Mar 18 12:43:05 crc kubenswrapper[4843]: I0318 12:43:05.721680 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fdt4m" podStartSLOduration=4.2006804859999995 podStartE2EDuration="10.721634089s" podCreationTimestamp="2026-03-18 12:42:55 +0000 UTC" firstStartedPulling="2026-03-18 12:42:56.811025781 +0000 UTC m=+2010.526851305" lastFinishedPulling="2026-03-18 12:43:03.331979384 +0000 UTC m=+2017.047804908" observedRunningTime="2026-03-18 12:43:03.948427802 +0000 UTC m=+2017.664253396" watchObservedRunningTime="2026-03-18 12:43:05.721634089 +0000 UTC m=+2019.437459623"
Mar 18 12:43:05 crc kubenswrapper[4843]: I0318 12:43:05.786066 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fdt4m"
Mar 18 12:43:05 crc kubenswrapper[4843]: I0318 12:43:05.786557 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fdt4m"
Mar 18 12:43:05 crc kubenswrapper[4843]: I0318 12:43:05.986781 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8885c"
Mar 18 12:43:06 crc kubenswrapper[4843]: I0318 12:43:06.844910 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fdt4m" podUID="6c46e218-e514-4f69-ac0a-ab2033dc202c" containerName="registry-server" probeResult="failure" output=<
Mar 18 12:43:06 crc kubenswrapper[4843]: timeout: failed to connect service ":50051" within 1s
Mar 18 12:43:06 crc kubenswrapper[4843]: >
Mar 18 12:43:07 crc kubenswrapper[4843]: I0318 12:43:07.037776 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8885c"]
Mar 18 12:43:07 crc kubenswrapper[4843]: I0318 12:43:07.946850 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8885c" podUID="760fad51-7f11-4a46-8144-7db25721528a" containerName="registry-server" containerID="cri-o://e4430bc02d91bb32dbd1747988473d6c84bff6af658788cb75388abe74f763d7" gracePeriod=2
Mar 18 12:43:08 crc kubenswrapper[4843]: I0318 12:43:08.480095 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8885c"
Mar 18 12:43:08 crc kubenswrapper[4843]: I0318 12:43:08.511498 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9fcf\" (UniqueName: \"kubernetes.io/projected/760fad51-7f11-4a46-8144-7db25721528a-kube-api-access-l9fcf\") pod \"760fad51-7f11-4a46-8144-7db25721528a\" (UID: \"760fad51-7f11-4a46-8144-7db25721528a\") "
Mar 18 12:43:08 crc kubenswrapper[4843]: I0318 12:43:08.511547 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/760fad51-7f11-4a46-8144-7db25721528a-utilities\") pod \"760fad51-7f11-4a46-8144-7db25721528a\" (UID: \"760fad51-7f11-4a46-8144-7db25721528a\") "
Mar 18 12:43:08 crc kubenswrapper[4843]: I0318 12:43:08.511623 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/760fad51-7f11-4a46-8144-7db25721528a-catalog-content\") pod \"760fad51-7f11-4a46-8144-7db25721528a\" (UID: \"760fad51-7f11-4a46-8144-7db25721528a\") "
Mar 18 12:43:08 crc kubenswrapper[4843]: I0318 12:43:08.512704 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/760fad51-7f11-4a46-8144-7db25721528a-utilities" (OuterVolumeSpecName: "utilities") pod "760fad51-7f11-4a46-8144-7db25721528a" (UID: "760fad51-7f11-4a46-8144-7db25721528a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:43:08 crc kubenswrapper[4843]: I0318 12:43:08.519449 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/760fad51-7f11-4a46-8144-7db25721528a-kube-api-access-l9fcf" (OuterVolumeSpecName: "kube-api-access-l9fcf") pod "760fad51-7f11-4a46-8144-7db25721528a" (UID: "760fad51-7f11-4a46-8144-7db25721528a"). InnerVolumeSpecName "kube-api-access-l9fcf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:43:08 crc kubenswrapper[4843]: I0318 12:43:08.552274 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/760fad51-7f11-4a46-8144-7db25721528a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "760fad51-7f11-4a46-8144-7db25721528a" (UID: "760fad51-7f11-4a46-8144-7db25721528a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:43:08 crc kubenswrapper[4843]: I0318 12:43:08.613644 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9fcf\" (UniqueName: \"kubernetes.io/projected/760fad51-7f11-4a46-8144-7db25721528a-kube-api-access-l9fcf\") on node \"crc\" DevicePath \"\""
Mar 18 12:43:08 crc kubenswrapper[4843]: I0318 12:43:08.613749 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/760fad51-7f11-4a46-8144-7db25721528a-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 12:43:08 crc kubenswrapper[4843]: I0318 12:43:08.613763 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/760fad51-7f11-4a46-8144-7db25721528a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 12:43:08 crc kubenswrapper[4843]: I0318 12:43:08.963995 4843 generic.go:334] "Generic (PLEG): container finished" podID="760fad51-7f11-4a46-8144-7db25721528a" containerID="e4430bc02d91bb32dbd1747988473d6c84bff6af658788cb75388abe74f763d7" exitCode=0
Mar 18 12:43:08 crc kubenswrapper[4843]: I0318 12:43:08.964056 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8885c" event={"ID":"760fad51-7f11-4a46-8144-7db25721528a","Type":"ContainerDied","Data":"e4430bc02d91bb32dbd1747988473d6c84bff6af658788cb75388abe74f763d7"}
Mar 18 12:43:08 crc kubenswrapper[4843]: I0318 12:43:08.964089 4843
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8885c" event={"ID":"760fad51-7f11-4a46-8144-7db25721528a","Type":"ContainerDied","Data":"ac09cdb23dfef89144d1e132fdd0ed754fba42afbf7a3b92b76e7637051a95df"} Mar 18 12:43:08 crc kubenswrapper[4843]: I0318 12:43:08.964111 4843 scope.go:117] "RemoveContainer" containerID="e4430bc02d91bb32dbd1747988473d6c84bff6af658788cb75388abe74f763d7" Mar 18 12:43:08 crc kubenswrapper[4843]: I0318 12:43:08.964273 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8885c" Mar 18 12:43:09 crc kubenswrapper[4843]: I0318 12:43:09.019966 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8885c"] Mar 18 12:43:09 crc kubenswrapper[4843]: I0318 12:43:09.040880 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8885c"] Mar 18 12:43:09 crc kubenswrapper[4843]: I0318 12:43:09.046004 4843 scope.go:117] "RemoveContainer" containerID="58291f52595b71c20759326148f74638ef6d27efae2307e284bf0114083396e8" Mar 18 12:43:09 crc kubenswrapper[4843]: I0318 12:43:09.049760 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-9tcmj"] Mar 18 12:43:09 crc kubenswrapper[4843]: I0318 12:43:09.062111 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-9tcmj"] Mar 18 12:43:09 crc kubenswrapper[4843]: I0318 12:43:09.078217 4843 scope.go:117] "RemoveContainer" containerID="d19a2b96996881de382fd5d1abc5ebfe11b74a2ff1479aa8fa4ff2e41cc6c02b" Mar 18 12:43:09 crc kubenswrapper[4843]: I0318 12:43:09.128982 4843 scope.go:117] "RemoveContainer" containerID="e4430bc02d91bb32dbd1747988473d6c84bff6af658788cb75388abe74f763d7" Mar 18 12:43:09 crc kubenswrapper[4843]: E0318 12:43:09.129718 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"e4430bc02d91bb32dbd1747988473d6c84bff6af658788cb75388abe74f763d7\": container with ID starting with e4430bc02d91bb32dbd1747988473d6c84bff6af658788cb75388abe74f763d7 not found: ID does not exist" containerID="e4430bc02d91bb32dbd1747988473d6c84bff6af658788cb75388abe74f763d7" Mar 18 12:43:09 crc kubenswrapper[4843]: I0318 12:43:09.129774 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4430bc02d91bb32dbd1747988473d6c84bff6af658788cb75388abe74f763d7"} err="failed to get container status \"e4430bc02d91bb32dbd1747988473d6c84bff6af658788cb75388abe74f763d7\": rpc error: code = NotFound desc = could not find container \"e4430bc02d91bb32dbd1747988473d6c84bff6af658788cb75388abe74f763d7\": container with ID starting with e4430bc02d91bb32dbd1747988473d6c84bff6af658788cb75388abe74f763d7 not found: ID does not exist" Mar 18 12:43:09 crc kubenswrapper[4843]: I0318 12:43:09.129810 4843 scope.go:117] "RemoveContainer" containerID="58291f52595b71c20759326148f74638ef6d27efae2307e284bf0114083396e8" Mar 18 12:43:09 crc kubenswrapper[4843]: E0318 12:43:09.130265 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58291f52595b71c20759326148f74638ef6d27efae2307e284bf0114083396e8\": container with ID starting with 58291f52595b71c20759326148f74638ef6d27efae2307e284bf0114083396e8 not found: ID does not exist" containerID="58291f52595b71c20759326148f74638ef6d27efae2307e284bf0114083396e8" Mar 18 12:43:09 crc kubenswrapper[4843]: I0318 12:43:09.130325 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58291f52595b71c20759326148f74638ef6d27efae2307e284bf0114083396e8"} err="failed to get container status \"58291f52595b71c20759326148f74638ef6d27efae2307e284bf0114083396e8\": rpc error: code = NotFound desc = could not find container \"58291f52595b71c20759326148f74638ef6d27efae2307e284bf0114083396e8\": 
container with ID starting with 58291f52595b71c20759326148f74638ef6d27efae2307e284bf0114083396e8 not found: ID does not exist" Mar 18 12:43:09 crc kubenswrapper[4843]: I0318 12:43:09.130369 4843 scope.go:117] "RemoveContainer" containerID="d19a2b96996881de382fd5d1abc5ebfe11b74a2ff1479aa8fa4ff2e41cc6c02b" Mar 18 12:43:09 crc kubenswrapper[4843]: E0318 12:43:09.130690 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d19a2b96996881de382fd5d1abc5ebfe11b74a2ff1479aa8fa4ff2e41cc6c02b\": container with ID starting with d19a2b96996881de382fd5d1abc5ebfe11b74a2ff1479aa8fa4ff2e41cc6c02b not found: ID does not exist" containerID="d19a2b96996881de382fd5d1abc5ebfe11b74a2ff1479aa8fa4ff2e41cc6c02b" Mar 18 12:43:09 crc kubenswrapper[4843]: I0318 12:43:09.130730 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d19a2b96996881de382fd5d1abc5ebfe11b74a2ff1479aa8fa4ff2e41cc6c02b"} err="failed to get container status \"d19a2b96996881de382fd5d1abc5ebfe11b74a2ff1479aa8fa4ff2e41cc6c02b\": rpc error: code = NotFound desc = could not find container \"d19a2b96996881de382fd5d1abc5ebfe11b74a2ff1479aa8fa4ff2e41cc6c02b\": container with ID starting with d19a2b96996881de382fd5d1abc5ebfe11b74a2ff1479aa8fa4ff2e41cc6c02b not found: ID does not exist" Mar 18 12:43:10 crc kubenswrapper[4843]: I0318 12:43:10.993635 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16d32f17-ac20-4f6d-8e00-db5fdafdc210" path="/var/lib/kubelet/pods/16d32f17-ac20-4f6d-8e00-db5fdafdc210/volumes" Mar 18 12:43:10 crc kubenswrapper[4843]: I0318 12:43:10.994574 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="760fad51-7f11-4a46-8144-7db25721528a" path="/var/lib/kubelet/pods/760fad51-7f11-4a46-8144-7db25721528a/volumes" Mar 18 12:43:15 crc kubenswrapper[4843]: I0318 12:43:15.836289 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-fdt4m" Mar 18 12:43:15 crc kubenswrapper[4843]: I0318 12:43:15.898716 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fdt4m" Mar 18 12:43:16 crc kubenswrapper[4843]: I0318 12:43:16.078853 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fdt4m"] Mar 18 12:43:17 crc kubenswrapper[4843]: I0318 12:43:17.103187 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fdt4m" podUID="6c46e218-e514-4f69-ac0a-ab2033dc202c" containerName="registry-server" containerID="cri-o://6d9a0b3e6fe3a810970eb9938cfdeee105ee936352a9a1d77c88225ca70ed6ee" gracePeriod=2 Mar 18 12:43:17 crc kubenswrapper[4843]: I0318 12:43:17.625709 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fdt4m" Mar 18 12:43:17 crc kubenswrapper[4843]: I0318 12:43:17.786874 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwv2d\" (UniqueName: \"kubernetes.io/projected/6c46e218-e514-4f69-ac0a-ab2033dc202c-kube-api-access-nwv2d\") pod \"6c46e218-e514-4f69-ac0a-ab2033dc202c\" (UID: \"6c46e218-e514-4f69-ac0a-ab2033dc202c\") " Mar 18 12:43:17 crc kubenswrapper[4843]: I0318 12:43:17.788162 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c46e218-e514-4f69-ac0a-ab2033dc202c-utilities\") pod \"6c46e218-e514-4f69-ac0a-ab2033dc202c\" (UID: \"6c46e218-e514-4f69-ac0a-ab2033dc202c\") " Mar 18 12:43:17 crc kubenswrapper[4843]: I0318 12:43:17.788523 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c46e218-e514-4f69-ac0a-ab2033dc202c-catalog-content\") pod 
\"6c46e218-e514-4f69-ac0a-ab2033dc202c\" (UID: \"6c46e218-e514-4f69-ac0a-ab2033dc202c\") " Mar 18 12:43:17 crc kubenswrapper[4843]: I0318 12:43:17.788827 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c46e218-e514-4f69-ac0a-ab2033dc202c-utilities" (OuterVolumeSpecName: "utilities") pod "6c46e218-e514-4f69-ac0a-ab2033dc202c" (UID: "6c46e218-e514-4f69-ac0a-ab2033dc202c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:43:17 crc kubenswrapper[4843]: I0318 12:43:17.789479 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c46e218-e514-4f69-ac0a-ab2033dc202c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:43:17 crc kubenswrapper[4843]: I0318 12:43:17.794846 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c46e218-e514-4f69-ac0a-ab2033dc202c-kube-api-access-nwv2d" (OuterVolumeSpecName: "kube-api-access-nwv2d") pod "6c46e218-e514-4f69-ac0a-ab2033dc202c" (UID: "6c46e218-e514-4f69-ac0a-ab2033dc202c"). InnerVolumeSpecName "kube-api-access-nwv2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:43:17 crc kubenswrapper[4843]: I0318 12:43:17.891966 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwv2d\" (UniqueName: \"kubernetes.io/projected/6c46e218-e514-4f69-ac0a-ab2033dc202c-kube-api-access-nwv2d\") on node \"crc\" DevicePath \"\"" Mar 18 12:43:17 crc kubenswrapper[4843]: I0318 12:43:17.940652 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c46e218-e514-4f69-ac0a-ab2033dc202c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c46e218-e514-4f69-ac0a-ab2033dc202c" (UID: "6c46e218-e514-4f69-ac0a-ab2033dc202c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:43:17 crc kubenswrapper[4843]: I0318 12:43:17.995065 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c46e218-e514-4f69-ac0a-ab2033dc202c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:43:18 crc kubenswrapper[4843]: I0318 12:43:18.115430 4843 generic.go:334] "Generic (PLEG): container finished" podID="6c46e218-e514-4f69-ac0a-ab2033dc202c" containerID="6d9a0b3e6fe3a810970eb9938cfdeee105ee936352a9a1d77c88225ca70ed6ee" exitCode=0 Mar 18 12:43:18 crc kubenswrapper[4843]: I0318 12:43:18.115480 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdt4m" event={"ID":"6c46e218-e514-4f69-ac0a-ab2033dc202c","Type":"ContainerDied","Data":"6d9a0b3e6fe3a810970eb9938cfdeee105ee936352a9a1d77c88225ca70ed6ee"} Mar 18 12:43:18 crc kubenswrapper[4843]: I0318 12:43:18.115512 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdt4m" event={"ID":"6c46e218-e514-4f69-ac0a-ab2033dc202c","Type":"ContainerDied","Data":"23604f28d48fe02cf936724618b74071e7b0f8e8d1da86995c66c48e8f6824e6"} Mar 18 12:43:18 crc kubenswrapper[4843]: I0318 12:43:18.115534 4843 scope.go:117] "RemoveContainer" containerID="6d9a0b3e6fe3a810970eb9938cfdeee105ee936352a9a1d77c88225ca70ed6ee" Mar 18 12:43:18 crc kubenswrapper[4843]: I0318 12:43:18.115735 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fdt4m" Mar 18 12:43:18 crc kubenswrapper[4843]: I0318 12:43:18.177259 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fdt4m"] Mar 18 12:43:18 crc kubenswrapper[4843]: I0318 12:43:18.184959 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fdt4m"] Mar 18 12:43:18 crc kubenswrapper[4843]: I0318 12:43:18.192263 4843 scope.go:117] "RemoveContainer" containerID="de096492d7e2d7c77b57f5a6371d90b29ef8c44777e5e66d5ec55769a9fe72e6" Mar 18 12:43:18 crc kubenswrapper[4843]: I0318 12:43:18.247756 4843 scope.go:117] "RemoveContainer" containerID="ef3dc66e2a7871c058603436ef2de2c88f5157e9d1d2d0cc7811acbcf77f222d" Mar 18 12:43:18 crc kubenswrapper[4843]: I0318 12:43:18.300308 4843 scope.go:117] "RemoveContainer" containerID="6d9a0b3e6fe3a810970eb9938cfdeee105ee936352a9a1d77c88225ca70ed6ee" Mar 18 12:43:18 crc kubenswrapper[4843]: E0318 12:43:18.301014 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d9a0b3e6fe3a810970eb9938cfdeee105ee936352a9a1d77c88225ca70ed6ee\": container with ID starting with 6d9a0b3e6fe3a810970eb9938cfdeee105ee936352a9a1d77c88225ca70ed6ee not found: ID does not exist" containerID="6d9a0b3e6fe3a810970eb9938cfdeee105ee936352a9a1d77c88225ca70ed6ee" Mar 18 12:43:18 crc kubenswrapper[4843]: I0318 12:43:18.301071 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d9a0b3e6fe3a810970eb9938cfdeee105ee936352a9a1d77c88225ca70ed6ee"} err="failed to get container status \"6d9a0b3e6fe3a810970eb9938cfdeee105ee936352a9a1d77c88225ca70ed6ee\": rpc error: code = NotFound desc = could not find container \"6d9a0b3e6fe3a810970eb9938cfdeee105ee936352a9a1d77c88225ca70ed6ee\": container with ID starting with 6d9a0b3e6fe3a810970eb9938cfdeee105ee936352a9a1d77c88225ca70ed6ee not found: ID does 
not exist" Mar 18 12:43:18 crc kubenswrapper[4843]: I0318 12:43:18.301114 4843 scope.go:117] "RemoveContainer" containerID="de096492d7e2d7c77b57f5a6371d90b29ef8c44777e5e66d5ec55769a9fe72e6" Mar 18 12:43:18 crc kubenswrapper[4843]: E0318 12:43:18.301517 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de096492d7e2d7c77b57f5a6371d90b29ef8c44777e5e66d5ec55769a9fe72e6\": container with ID starting with de096492d7e2d7c77b57f5a6371d90b29ef8c44777e5e66d5ec55769a9fe72e6 not found: ID does not exist" containerID="de096492d7e2d7c77b57f5a6371d90b29ef8c44777e5e66d5ec55769a9fe72e6" Mar 18 12:43:18 crc kubenswrapper[4843]: I0318 12:43:18.301593 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de096492d7e2d7c77b57f5a6371d90b29ef8c44777e5e66d5ec55769a9fe72e6"} err="failed to get container status \"de096492d7e2d7c77b57f5a6371d90b29ef8c44777e5e66d5ec55769a9fe72e6\": rpc error: code = NotFound desc = could not find container \"de096492d7e2d7c77b57f5a6371d90b29ef8c44777e5e66d5ec55769a9fe72e6\": container with ID starting with de096492d7e2d7c77b57f5a6371d90b29ef8c44777e5e66d5ec55769a9fe72e6 not found: ID does not exist" Mar 18 12:43:18 crc kubenswrapper[4843]: I0318 12:43:18.301638 4843 scope.go:117] "RemoveContainer" containerID="ef3dc66e2a7871c058603436ef2de2c88f5157e9d1d2d0cc7811acbcf77f222d" Mar 18 12:43:18 crc kubenswrapper[4843]: E0318 12:43:18.302608 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef3dc66e2a7871c058603436ef2de2c88f5157e9d1d2d0cc7811acbcf77f222d\": container with ID starting with ef3dc66e2a7871c058603436ef2de2c88f5157e9d1d2d0cc7811acbcf77f222d not found: ID does not exist" containerID="ef3dc66e2a7871c058603436ef2de2c88f5157e9d1d2d0cc7811acbcf77f222d" Mar 18 12:43:18 crc kubenswrapper[4843]: I0318 12:43:18.302662 4843 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef3dc66e2a7871c058603436ef2de2c88f5157e9d1d2d0cc7811acbcf77f222d"} err="failed to get container status \"ef3dc66e2a7871c058603436ef2de2c88f5157e9d1d2d0cc7811acbcf77f222d\": rpc error: code = NotFound desc = could not find container \"ef3dc66e2a7871c058603436ef2de2c88f5157e9d1d2d0cc7811acbcf77f222d\": container with ID starting with ef3dc66e2a7871c058603436ef2de2c88f5157e9d1d2d0cc7811acbcf77f222d not found: ID does not exist" Mar 18 12:43:19 crc kubenswrapper[4843]: I0318 12:43:19.148963 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c46e218-e514-4f69-ac0a-ab2033dc202c" path="/var/lib/kubelet/pods/6c46e218-e514-4f69-ac0a-ab2033dc202c/volumes" Mar 18 12:43:29 crc kubenswrapper[4843]: I0318 12:43:29.241125 4843 generic.go:334] "Generic (PLEG): container finished" podID="bf8d8fe5-8550-45de-9087-a47fb8695b53" containerID="0fa621963788678762dd4923a19f0f93248708f438e9d925746787977747cbdf" exitCode=0 Mar 18 12:43:29 crc kubenswrapper[4843]: I0318 12:43:29.241210 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx" event={"ID":"bf8d8fe5-8550-45de-9087-a47fb8695b53","Type":"ContainerDied","Data":"0fa621963788678762dd4923a19f0f93248708f438e9d925746787977747cbdf"} Mar 18 12:43:30 crc kubenswrapper[4843]: I0318 12:43:30.838359 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx" Mar 18 12:43:30 crc kubenswrapper[4843]: I0318 12:43:30.979106 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf8d8fe5-8550-45de-9087-a47fb8695b53-ssh-key-openstack-edpm-ipam\") pod \"bf8d8fe5-8550-45de-9087-a47fb8695b53\" (UID: \"bf8d8fe5-8550-45de-9087-a47fb8695b53\") " Mar 18 12:43:30 crc kubenswrapper[4843]: I0318 12:43:30.979217 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf8d8fe5-8550-45de-9087-a47fb8695b53-inventory\") pod \"bf8d8fe5-8550-45de-9087-a47fb8695b53\" (UID: \"bf8d8fe5-8550-45de-9087-a47fb8695b53\") " Mar 18 12:43:30 crc kubenswrapper[4843]: I0318 12:43:30.979267 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnnfc\" (UniqueName: \"kubernetes.io/projected/bf8d8fe5-8550-45de-9087-a47fb8695b53-kube-api-access-dnnfc\") pod \"bf8d8fe5-8550-45de-9087-a47fb8695b53\" (UID: \"bf8d8fe5-8550-45de-9087-a47fb8695b53\") " Mar 18 12:43:30 crc kubenswrapper[4843]: I0318 12:43:30.992224 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf8d8fe5-8550-45de-9087-a47fb8695b53-kube-api-access-dnnfc" (OuterVolumeSpecName: "kube-api-access-dnnfc") pod "bf8d8fe5-8550-45de-9087-a47fb8695b53" (UID: "bf8d8fe5-8550-45de-9087-a47fb8695b53"). InnerVolumeSpecName "kube-api-access-dnnfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.022936 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf8d8fe5-8550-45de-9087-a47fb8695b53-inventory" (OuterVolumeSpecName: "inventory") pod "bf8d8fe5-8550-45de-9087-a47fb8695b53" (UID: "bf8d8fe5-8550-45de-9087-a47fb8695b53"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.030574 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf8d8fe5-8550-45de-9087-a47fb8695b53-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bf8d8fe5-8550-45de-9087-a47fb8695b53" (UID: "bf8d8fe5-8550-45de-9087-a47fb8695b53"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.082051 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf8d8fe5-8550-45de-9087-a47fb8695b53-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.082116 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf8d8fe5-8550-45de-9087-a47fb8695b53-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.082136 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnnfc\" (UniqueName: \"kubernetes.io/projected/bf8d8fe5-8550-45de-9087-a47fb8695b53-kube-api-access-dnnfc\") on node \"crc\" DevicePath \"\"" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.264019 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx" event={"ID":"bf8d8fe5-8550-45de-9087-a47fb8695b53","Type":"ContainerDied","Data":"cf94ba38e74654337744b5cfb61864297f6ed3e45ee3232261aa8b60e661118f"} Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.264072 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf94ba38e74654337744b5cfb61864297f6ed3e45ee3232261aa8b60e661118f" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 
12:43:31.264117 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.366321 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w"] Mar 18 12:43:31 crc kubenswrapper[4843]: E0318 12:43:31.366838 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c46e218-e514-4f69-ac0a-ab2033dc202c" containerName="extract-utilities" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.366859 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c46e218-e514-4f69-ac0a-ab2033dc202c" containerName="extract-utilities" Mar 18 12:43:31 crc kubenswrapper[4843]: E0318 12:43:31.366880 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c46e218-e514-4f69-ac0a-ab2033dc202c" containerName="extract-content" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.366889 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c46e218-e514-4f69-ac0a-ab2033dc202c" containerName="extract-content" Mar 18 12:43:31 crc kubenswrapper[4843]: E0318 12:43:31.366919 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c46e218-e514-4f69-ac0a-ab2033dc202c" containerName="registry-server" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.366928 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c46e218-e514-4f69-ac0a-ab2033dc202c" containerName="registry-server" Mar 18 12:43:31 crc kubenswrapper[4843]: E0318 12:43:31.366940 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="760fad51-7f11-4a46-8144-7db25721528a" containerName="extract-utilities" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.366948 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="760fad51-7f11-4a46-8144-7db25721528a" containerName="extract-utilities" Mar 18 12:43:31 crc 
kubenswrapper[4843]: E0318 12:43:31.366958 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="760fad51-7f11-4a46-8144-7db25721528a" containerName="registry-server" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.366968 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="760fad51-7f11-4a46-8144-7db25721528a" containerName="registry-server" Mar 18 12:43:31 crc kubenswrapper[4843]: E0318 12:43:31.366984 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf8d8fe5-8550-45de-9087-a47fb8695b53" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.366994 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf8d8fe5-8550-45de-9087-a47fb8695b53" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 12:43:31 crc kubenswrapper[4843]: E0318 12:43:31.367015 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="760fad51-7f11-4a46-8144-7db25721528a" containerName="extract-content" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.367023 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="760fad51-7f11-4a46-8144-7db25721528a" containerName="extract-content" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.367241 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c46e218-e514-4f69-ac0a-ab2033dc202c" containerName="registry-server" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.367269 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="760fad51-7f11-4a46-8144-7db25721528a" containerName="registry-server" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.367287 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf8d8fe5-8550-45de-9087-a47fb8695b53" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.368092 4843 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.390253 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.391593 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.391677 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.392125 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w"] Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.394193 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ed71bc4-df8e-47f7-87f0-bd7173061676-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-78z9w\" (UID: \"4ed71bc4-df8e-47f7-87f0-bd7173061676\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.394474 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmdjw\" (UniqueName: \"kubernetes.io/projected/4ed71bc4-df8e-47f7-87f0-bd7173061676-kube-api-access-gmdjw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-78z9w\" (UID: \"4ed71bc4-df8e-47f7-87f0-bd7173061676\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.394566 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/4ed71bc4-df8e-47f7-87f0-bd7173061676-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-78z9w\" (UID: \"4ed71bc4-df8e-47f7-87f0-bd7173061676\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.396007 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.497253 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ed71bc4-df8e-47f7-87f0-bd7173061676-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-78z9w\" (UID: \"4ed71bc4-df8e-47f7-87f0-bd7173061676\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.497631 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmdjw\" (UniqueName: \"kubernetes.io/projected/4ed71bc4-df8e-47f7-87f0-bd7173061676-kube-api-access-gmdjw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-78z9w\" (UID: \"4ed71bc4-df8e-47f7-87f0-bd7173061676\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.498017 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ed71bc4-df8e-47f7-87f0-bd7173061676-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-78z9w\" (UID: \"4ed71bc4-df8e-47f7-87f0-bd7173061676\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.504348 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/4ed71bc4-df8e-47f7-87f0-bd7173061676-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-78z9w\" (UID: \"4ed71bc4-df8e-47f7-87f0-bd7173061676\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.504693 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ed71bc4-df8e-47f7-87f0-bd7173061676-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-78z9w\" (UID: \"4ed71bc4-df8e-47f7-87f0-bd7173061676\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.515007 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmdjw\" (UniqueName: \"kubernetes.io/projected/4ed71bc4-df8e-47f7-87f0-bd7173061676-kube-api-access-gmdjw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-78z9w\" (UID: \"4ed71bc4-df8e-47f7-87f0-bd7173061676\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w" Mar 18 12:43:31 crc kubenswrapper[4843]: I0318 12:43:31.690219 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w" Mar 18 12:43:32 crc kubenswrapper[4843]: I0318 12:43:32.265501 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w"] Mar 18 12:43:33 crc kubenswrapper[4843]: I0318 12:43:33.358002 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w" event={"ID":"4ed71bc4-df8e-47f7-87f0-bd7173061676","Type":"ContainerStarted","Data":"6bcb2497f57e928a5cf6678a2ceb0adf781738109b1afeb3402a7d3ee1d57bd4"} Mar 18 12:43:33 crc kubenswrapper[4843]: I0318 12:43:33.358315 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w" event={"ID":"4ed71bc4-df8e-47f7-87f0-bd7173061676","Type":"ContainerStarted","Data":"bf7c15d278a2a14b58d735d16e8b93dd2f8b9d4412c817786c1e60579c9064b8"} Mar 18 12:43:33 crc kubenswrapper[4843]: I0318 12:43:33.385891 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w" podStartSLOduration=2.21346254 podStartE2EDuration="2.385865158s" podCreationTimestamp="2026-03-18 12:43:31 +0000 UTC" firstStartedPulling="2026-03-18 12:43:32.268966131 +0000 UTC m=+2045.984791665" lastFinishedPulling="2026-03-18 12:43:32.441368759 +0000 UTC m=+2046.157194283" observedRunningTime="2026-03-18 12:43:33.377490581 +0000 UTC m=+2047.093316105" watchObservedRunningTime="2026-03-18 12:43:33.385865158 +0000 UTC m=+2047.101690682" Mar 18 12:43:35 crc kubenswrapper[4843]: I0318 12:43:35.044034 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-7ggtj"] Mar 18 12:43:35 crc kubenswrapper[4843]: I0318 12:43:35.056381 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-7ggtj"] Mar 18 12:43:36 crc kubenswrapper[4843]: 
I0318 12:43:36.035178 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-l67jn"] Mar 18 12:43:36 crc kubenswrapper[4843]: I0318 12:43:36.048569 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-7cjkn"] Mar 18 12:43:36 crc kubenswrapper[4843]: I0318 12:43:36.065131 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2934-account-create-update-vtchm"] Mar 18 12:43:36 crc kubenswrapper[4843]: I0318 12:43:36.075520 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-af08-account-create-update-5q5p4"] Mar 18 12:43:36 crc kubenswrapper[4843]: I0318 12:43:36.082370 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0dac-account-create-update-ttlb8"] Mar 18 12:43:36 crc kubenswrapper[4843]: I0318 12:43:36.089545 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-l67jn"] Mar 18 12:43:36 crc kubenswrapper[4843]: I0318 12:43:36.096425 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-7cjkn"] Mar 18 12:43:36 crc kubenswrapper[4843]: I0318 12:43:36.104306 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-2934-account-create-update-vtchm"] Mar 18 12:43:36 crc kubenswrapper[4843]: I0318 12:43:36.111530 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0dac-account-create-update-ttlb8"] Mar 18 12:43:36 crc kubenswrapper[4843]: I0318 12:43:36.118670 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-af08-account-create-update-5q5p4"] Mar 18 12:43:36 crc kubenswrapper[4843]: I0318 12:43:36.994389 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4" path="/var/lib/kubelet/pods/27e4dfa1-8e7f-42c8-9e5e-247f97ccc6f4/volumes" Mar 18 12:43:36 crc kubenswrapper[4843]: I0318 12:43:36.995042 4843 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf" path="/var/lib/kubelet/pods/3d64f775-a8ef-4bf4-a69f-8bd9dec34bbf/volumes" Mar 18 12:43:36 crc kubenswrapper[4843]: I0318 12:43:36.995700 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="479484ef-0791-436d-91df-e50b0b4390b2" path="/var/lib/kubelet/pods/479484ef-0791-436d-91df-e50b0b4390b2/volumes" Mar 18 12:43:36 crc kubenswrapper[4843]: I0318 12:43:36.996340 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50047908-1fa0-4181-a690-f87f5d4b0a6a" path="/var/lib/kubelet/pods/50047908-1fa0-4181-a690-f87f5d4b0a6a/volumes" Mar 18 12:43:36 crc kubenswrapper[4843]: I0318 12:43:36.997455 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ed163bc-a192-464b-9025-43220a0864b7" path="/var/lib/kubelet/pods/7ed163bc-a192-464b-9025-43220a0864b7/volumes" Mar 18 12:43:36 crc kubenswrapper[4843]: I0318 12:43:36.998095 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97556db6-d486-4ca2-9218-13e7691ae6ee" path="/var/lib/kubelet/pods/97556db6-d486-4ca2-9218-13e7691ae6ee/volumes" Mar 18 12:43:38 crc kubenswrapper[4843]: I0318 12:43:38.633700 4843 generic.go:334] "Generic (PLEG): container finished" podID="4ed71bc4-df8e-47f7-87f0-bd7173061676" containerID="6bcb2497f57e928a5cf6678a2ceb0adf781738109b1afeb3402a7d3ee1d57bd4" exitCode=0 Mar 18 12:43:38 crc kubenswrapper[4843]: I0318 12:43:38.633904 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w" event={"ID":"4ed71bc4-df8e-47f7-87f0-bd7173061676","Type":"ContainerDied","Data":"6bcb2497f57e928a5cf6678a2ceb0adf781738109b1afeb3402a7d3ee1d57bd4"} Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.290967 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.396037 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmdjw\" (UniqueName: \"kubernetes.io/projected/4ed71bc4-df8e-47f7-87f0-bd7173061676-kube-api-access-gmdjw\") pod \"4ed71bc4-df8e-47f7-87f0-bd7173061676\" (UID: \"4ed71bc4-df8e-47f7-87f0-bd7173061676\") " Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.396147 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ed71bc4-df8e-47f7-87f0-bd7173061676-ssh-key-openstack-edpm-ipam\") pod \"4ed71bc4-df8e-47f7-87f0-bd7173061676\" (UID: \"4ed71bc4-df8e-47f7-87f0-bd7173061676\") " Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.396211 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ed71bc4-df8e-47f7-87f0-bd7173061676-inventory\") pod \"4ed71bc4-df8e-47f7-87f0-bd7173061676\" (UID: \"4ed71bc4-df8e-47f7-87f0-bd7173061676\") " Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.405104 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ed71bc4-df8e-47f7-87f0-bd7173061676-kube-api-access-gmdjw" (OuterVolumeSpecName: "kube-api-access-gmdjw") pod "4ed71bc4-df8e-47f7-87f0-bd7173061676" (UID: "4ed71bc4-df8e-47f7-87f0-bd7173061676"). InnerVolumeSpecName "kube-api-access-gmdjw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.456993 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ed71bc4-df8e-47f7-87f0-bd7173061676-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4ed71bc4-df8e-47f7-87f0-bd7173061676" (UID: "4ed71bc4-df8e-47f7-87f0-bd7173061676"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.470295 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ed71bc4-df8e-47f7-87f0-bd7173061676-inventory" (OuterVolumeSpecName: "inventory") pod "4ed71bc4-df8e-47f7-87f0-bd7173061676" (UID: "4ed71bc4-df8e-47f7-87f0-bd7173061676"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.498834 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmdjw\" (UniqueName: \"kubernetes.io/projected/4ed71bc4-df8e-47f7-87f0-bd7173061676-kube-api-access-gmdjw\") on node \"crc\" DevicePath \"\"" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.498888 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ed71bc4-df8e-47f7-87f0-bd7173061676-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.498904 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ed71bc4-df8e-47f7-87f0-bd7173061676-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.676850 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w" 
event={"ID":"4ed71bc4-df8e-47f7-87f0-bd7173061676","Type":"ContainerDied","Data":"bf7c15d278a2a14b58d735d16e8b93dd2f8b9d4412c817786c1e60579c9064b8"} Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.676897 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf7c15d278a2a14b58d735d16e8b93dd2f8b9d4412c817786c1e60579c9064b8" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.676956 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-78z9w" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.744014 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns"] Mar 18 12:43:40 crc kubenswrapper[4843]: E0318 12:43:40.744791 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed71bc4-df8e-47f7-87f0-bd7173061676" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.744819 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed71bc4-df8e-47f7-87f0-bd7173061676" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.745102 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ed71bc4-df8e-47f7-87f0-bd7173061676" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.746173 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.750020 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.750257 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.750478 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.750602 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.764366 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns"] Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.804903 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31c490be-9979-4c2c-b49a-191985508c24-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5x8ns\" (UID: \"31c490be-9979-4c2c-b49a-191985508c24\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.804974 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqgls\" (UniqueName: \"kubernetes.io/projected/31c490be-9979-4c2c-b49a-191985508c24-kube-api-access-zqgls\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5x8ns\" (UID: \"31c490be-9979-4c2c-b49a-191985508c24\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 
12:43:40.805235 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31c490be-9979-4c2c-b49a-191985508c24-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5x8ns\" (UID: \"31c490be-9979-4c2c-b49a-191985508c24\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.907504 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31c490be-9979-4c2c-b49a-191985508c24-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5x8ns\" (UID: \"31c490be-9979-4c2c-b49a-191985508c24\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.907682 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31c490be-9979-4c2c-b49a-191985508c24-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5x8ns\" (UID: \"31c490be-9979-4c2c-b49a-191985508c24\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.907722 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqgls\" (UniqueName: \"kubernetes.io/projected/31c490be-9979-4c2c-b49a-191985508c24-kube-api-access-zqgls\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5x8ns\" (UID: \"31c490be-9979-4c2c-b49a-191985508c24\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.911520 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31c490be-9979-4c2c-b49a-191985508c24-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-5x8ns\" (UID: \"31c490be-9979-4c2c-b49a-191985508c24\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.911573 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31c490be-9979-4c2c-b49a-191985508c24-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5x8ns\" (UID: \"31c490be-9979-4c2c-b49a-191985508c24\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns" Mar 18 12:43:40 crc kubenswrapper[4843]: I0318 12:43:40.923613 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqgls\" (UniqueName: \"kubernetes.io/projected/31c490be-9979-4c2c-b49a-191985508c24-kube-api-access-zqgls\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5x8ns\" (UID: \"31c490be-9979-4c2c-b49a-191985508c24\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns" Mar 18 12:43:41 crc kubenswrapper[4843]: I0318 12:43:41.072092 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns" Mar 18 12:43:41 crc kubenswrapper[4843]: I0318 12:43:41.620537 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns"] Mar 18 12:43:41 crc kubenswrapper[4843]: I0318 12:43:41.688188 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns" event={"ID":"31c490be-9979-4c2c-b49a-191985508c24","Type":"ContainerStarted","Data":"9c1735a87484b92d465a55f1c12eea409da90db173b065490243b98e143cae05"} Mar 18 12:43:42 crc kubenswrapper[4843]: I0318 12:43:42.698271 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns" event={"ID":"31c490be-9979-4c2c-b49a-191985508c24","Type":"ContainerStarted","Data":"26a387cdb44188726ec38fa6ee807e0ca8798861d365bea68cd096b2c095b889"} Mar 18 12:43:42 crc kubenswrapper[4843]: I0318 12:43:42.716062 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns" podStartSLOduration=2.526999326 podStartE2EDuration="2.716026136s" podCreationTimestamp="2026-03-18 12:43:40 +0000 UTC" firstStartedPulling="2026-03-18 12:43:41.626858645 +0000 UTC m=+2055.342684169" lastFinishedPulling="2026-03-18 12:43:41.815885415 +0000 UTC m=+2055.531710979" observedRunningTime="2026-03-18 12:43:42.712537497 +0000 UTC m=+2056.428363021" watchObservedRunningTime="2026-03-18 12:43:42.716026136 +0000 UTC m=+2056.431851660" Mar 18 12:44:00 crc kubenswrapper[4843]: I0318 12:44:00.160727 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563964-7vc65"] Mar 18 12:44:00 crc kubenswrapper[4843]: I0318 12:44:00.162610 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563964-7vc65" Mar 18 12:44:00 crc kubenswrapper[4843]: I0318 12:44:00.167134 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:44:00 crc kubenswrapper[4843]: I0318 12:44:00.168608 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:44:00 crc kubenswrapper[4843]: I0318 12:44:00.168793 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 12:44:00 crc kubenswrapper[4843]: I0318 12:44:00.171796 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563964-7vc65"] Mar 18 12:44:00 crc kubenswrapper[4843]: I0318 12:44:00.316041 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftz7g\" (UniqueName: \"kubernetes.io/projected/ddb370c2-9293-469a-bfbf-3f2ed7906e0d-kube-api-access-ftz7g\") pod \"auto-csr-approver-29563964-7vc65\" (UID: \"ddb370c2-9293-469a-bfbf-3f2ed7906e0d\") " pod="openshift-infra/auto-csr-approver-29563964-7vc65" Mar 18 12:44:00 crc kubenswrapper[4843]: I0318 12:44:00.418007 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftz7g\" (UniqueName: \"kubernetes.io/projected/ddb370c2-9293-469a-bfbf-3f2ed7906e0d-kube-api-access-ftz7g\") pod \"auto-csr-approver-29563964-7vc65\" (UID: \"ddb370c2-9293-469a-bfbf-3f2ed7906e0d\") " pod="openshift-infra/auto-csr-approver-29563964-7vc65" Mar 18 12:44:00 crc kubenswrapper[4843]: I0318 12:44:00.444353 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftz7g\" (UniqueName: \"kubernetes.io/projected/ddb370c2-9293-469a-bfbf-3f2ed7906e0d-kube-api-access-ftz7g\") pod \"auto-csr-approver-29563964-7vc65\" (UID: \"ddb370c2-9293-469a-bfbf-3f2ed7906e0d\") " 
pod="openshift-infra/auto-csr-approver-29563964-7vc65" Mar 18 12:44:00 crc kubenswrapper[4843]: I0318 12:44:00.488473 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563964-7vc65" Mar 18 12:44:01 crc kubenswrapper[4843]: I0318 12:44:01.861010 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563964-7vc65"] Mar 18 12:44:02 crc kubenswrapper[4843]: I0318 12:44:02.253488 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563964-7vc65" event={"ID":"ddb370c2-9293-469a-bfbf-3f2ed7906e0d","Type":"ContainerStarted","Data":"1ffe9f4c33f37f0a1884b15b2b9c86f80f3379afd7334b224f49ca8d94c827e1"} Mar 18 12:44:03 crc kubenswrapper[4843]: I0318 12:44:03.265717 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563964-7vc65" event={"ID":"ddb370c2-9293-469a-bfbf-3f2ed7906e0d","Type":"ContainerStarted","Data":"fb09b19d4bc1217603d3cf98120ebab664e4c6a5cf05157ae90fcf1b4e58c83b"} Mar 18 12:44:03 crc kubenswrapper[4843]: I0318 12:44:03.283944 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563964-7vc65" podStartSLOduration=2.299565235 podStartE2EDuration="3.283924285s" podCreationTimestamp="2026-03-18 12:44:00 +0000 UTC" firstStartedPulling="2026-03-18 12:44:01.88107892 +0000 UTC m=+2075.596904444" lastFinishedPulling="2026-03-18 12:44:02.86543797 +0000 UTC m=+2076.581263494" observedRunningTime="2026-03-18 12:44:03.279710716 +0000 UTC m=+2076.995536240" watchObservedRunningTime="2026-03-18 12:44:03.283924285 +0000 UTC m=+2076.999749809" Mar 18 12:44:03 crc kubenswrapper[4843]: I0318 12:44:03.780764 4843 scope.go:117] "RemoveContainer" containerID="8d928a4c134e013085bf463bbf4b6e935645e2bdea80e0e9e44784c294ec7395" Mar 18 12:44:03 crc kubenswrapper[4843]: I0318 12:44:03.835929 4843 scope.go:117] "RemoveContainer" 
containerID="5666649acb1afd40e1f7f5b2d8526cac42defb517c4ec93d694206562175521e" Mar 18 12:44:03 crc kubenswrapper[4843]: I0318 12:44:03.854049 4843 scope.go:117] "RemoveContainer" containerID="ce9e4149d7240e54ddab26e5ab4fc65e28f66ffa281e256fa2b07abbbabef878" Mar 18 12:44:03 crc kubenswrapper[4843]: I0318 12:44:03.892884 4843 scope.go:117] "RemoveContainer" containerID="2abf3ef0293d4755cd999d2ba56202726ddd7ea924d350e6620b91a8018db2e6" Mar 18 12:44:03 crc kubenswrapper[4843]: I0318 12:44:03.957385 4843 scope.go:117] "RemoveContainer" containerID="cec1fec0f0a4d9106d73fb660bb462b724583e5825f9fb7cf9c0ede2cb623a3e" Mar 18 12:44:03 crc kubenswrapper[4843]: I0318 12:44:03.976191 4843 scope.go:117] "RemoveContainer" containerID="20db8d802e7f4d1bdddd8859dd325e7f48989815b7ffbdf8b70d377128356192" Mar 18 12:44:04 crc kubenswrapper[4843]: I0318 12:44:04.026008 4843 scope.go:117] "RemoveContainer" containerID="5375cdbdf6aa2c6b49317e91459362258a0b5dea21f6782dc26054d721dd2c4d" Mar 18 12:44:04 crc kubenswrapper[4843]: I0318 12:44:04.276681 4843 generic.go:334] "Generic (PLEG): container finished" podID="ddb370c2-9293-469a-bfbf-3f2ed7906e0d" containerID="fb09b19d4bc1217603d3cf98120ebab664e4c6a5cf05157ae90fcf1b4e58c83b" exitCode=0 Mar 18 12:44:04 crc kubenswrapper[4843]: I0318 12:44:04.276748 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563964-7vc65" event={"ID":"ddb370c2-9293-469a-bfbf-3f2ed7906e0d","Type":"ContainerDied","Data":"fb09b19d4bc1217603d3cf98120ebab664e4c6a5cf05157ae90fcf1b4e58c83b"} Mar 18 12:44:05 crc kubenswrapper[4843]: I0318 12:44:05.047116 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jv79s"] Mar 18 12:44:05 crc kubenswrapper[4843]: I0318 12:44:05.061756 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jv79s"] Mar 18 12:44:06 crc kubenswrapper[4843]: I0318 12:44:06.536933 4843 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563964-7vc65" Mar 18 12:44:06 crc kubenswrapper[4843]: I0318 12:44:06.557137 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftz7g\" (UniqueName: \"kubernetes.io/projected/ddb370c2-9293-469a-bfbf-3f2ed7906e0d-kube-api-access-ftz7g\") pod \"ddb370c2-9293-469a-bfbf-3f2ed7906e0d\" (UID: \"ddb370c2-9293-469a-bfbf-3f2ed7906e0d\") " Mar 18 12:44:06 crc kubenswrapper[4843]: I0318 12:44:06.586879 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddb370c2-9293-469a-bfbf-3f2ed7906e0d-kube-api-access-ftz7g" (OuterVolumeSpecName: "kube-api-access-ftz7g") pod "ddb370c2-9293-469a-bfbf-3f2ed7906e0d" (UID: "ddb370c2-9293-469a-bfbf-3f2ed7906e0d"). InnerVolumeSpecName "kube-api-access-ftz7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:44:06 crc kubenswrapper[4843]: I0318 12:44:06.662218 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftz7g\" (UniqueName: \"kubernetes.io/projected/ddb370c2-9293-469a-bfbf-3f2ed7906e0d-kube-api-access-ftz7g\") on node \"crc\" DevicePath \"\"" Mar 18 12:44:06 crc kubenswrapper[4843]: I0318 12:44:06.996207 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2e5376f-b6a5-4462-925f-a0daa0d3aa5b" path="/var/lib/kubelet/pods/c2e5376f-b6a5-4462-925f-a0daa0d3aa5b/volumes" Mar 18 12:44:07 crc kubenswrapper[4843]: I0318 12:44:07.398553 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563964-7vc65" event={"ID":"ddb370c2-9293-469a-bfbf-3f2ed7906e0d","Type":"ContainerDied","Data":"1ffe9f4c33f37f0a1884b15b2b9c86f80f3379afd7334b224f49ca8d94c827e1"} Mar 18 12:44:07 crc kubenswrapper[4843]: I0318 12:44:07.398599 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563964-7vc65" Mar 18 12:44:07 crc kubenswrapper[4843]: I0318 12:44:07.398589 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ffe9f4c33f37f0a1884b15b2b9c86f80f3379afd7334b224f49ca8d94c827e1" Mar 18 12:44:07 crc kubenswrapper[4843]: I0318 12:44:07.605891 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563958-rtrsj"] Mar 18 12:44:07 crc kubenswrapper[4843]: I0318 12:44:07.615062 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563958-rtrsj"] Mar 18 12:44:08 crc kubenswrapper[4843]: I0318 12:44:08.998874 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a0bd964-b1ca-4243-991e-2bce7b1b8c84" path="/var/lib/kubelet/pods/4a0bd964-b1ca-4243-991e-2bce7b1b8c84/volumes" Mar 18 12:44:20 crc kubenswrapper[4843]: I0318 12:44:20.518870 4843 generic.go:334] "Generic (PLEG): container finished" podID="31c490be-9979-4c2c-b49a-191985508c24" containerID="26a387cdb44188726ec38fa6ee807e0ca8798861d365bea68cd096b2c095b889" exitCode=0 Mar 18 12:44:20 crc kubenswrapper[4843]: I0318 12:44:20.518960 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns" event={"ID":"31c490be-9979-4c2c-b49a-191985508c24","Type":"ContainerDied","Data":"26a387cdb44188726ec38fa6ee807e0ca8798861d365bea68cd096b2c095b889"} Mar 18 12:44:21 crc kubenswrapper[4843]: I0318 12:44:21.952407 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns" Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.089675 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31c490be-9979-4c2c-b49a-191985508c24-inventory\") pod \"31c490be-9979-4c2c-b49a-191985508c24\" (UID: \"31c490be-9979-4c2c-b49a-191985508c24\") " Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.089871 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31c490be-9979-4c2c-b49a-191985508c24-ssh-key-openstack-edpm-ipam\") pod \"31c490be-9979-4c2c-b49a-191985508c24\" (UID: \"31c490be-9979-4c2c-b49a-191985508c24\") " Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.089919 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqgls\" (UniqueName: \"kubernetes.io/projected/31c490be-9979-4c2c-b49a-191985508c24-kube-api-access-zqgls\") pod \"31c490be-9979-4c2c-b49a-191985508c24\" (UID: \"31c490be-9979-4c2c-b49a-191985508c24\") " Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.095920 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31c490be-9979-4c2c-b49a-191985508c24-kube-api-access-zqgls" (OuterVolumeSpecName: "kube-api-access-zqgls") pod "31c490be-9979-4c2c-b49a-191985508c24" (UID: "31c490be-9979-4c2c-b49a-191985508c24"). InnerVolumeSpecName "kube-api-access-zqgls". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.123782 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31c490be-9979-4c2c-b49a-191985508c24-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "31c490be-9979-4c2c-b49a-191985508c24" (UID: "31c490be-9979-4c2c-b49a-191985508c24"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.126329 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31c490be-9979-4c2c-b49a-191985508c24-inventory" (OuterVolumeSpecName: "inventory") pod "31c490be-9979-4c2c-b49a-191985508c24" (UID: "31c490be-9979-4c2c-b49a-191985508c24"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.195315 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31c490be-9979-4c2c-b49a-191985508c24-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.195361 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqgls\" (UniqueName: \"kubernetes.io/projected/31c490be-9979-4c2c-b49a-191985508c24-kube-api-access-zqgls\") on node \"crc\" DevicePath \"\""
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.195377 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31c490be-9979-4c2c-b49a-191985508c24-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.538325 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns" event={"ID":"31c490be-9979-4c2c-b49a-191985508c24","Type":"ContainerDied","Data":"9c1735a87484b92d465a55f1c12eea409da90db173b065490243b98e143cae05"}
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.538377 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c1735a87484b92d465a55f1c12eea409da90db173b065490243b98e143cae05"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.538380 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5x8ns"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.622334 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr"]
Mar 18 12:44:22 crc kubenswrapper[4843]: E0318 12:44:22.623028 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31c490be-9979-4c2c-b49a-191985508c24" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.623141 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="31c490be-9979-4c2c-b49a-191985508c24" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:44:22 crc kubenswrapper[4843]: E0318 12:44:22.623242 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddb370c2-9293-469a-bfbf-3f2ed7906e0d" containerName="oc"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.623367 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddb370c2-9293-469a-bfbf-3f2ed7906e0d" containerName="oc"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.623723 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddb370c2-9293-469a-bfbf-3f2ed7906e0d" containerName="oc"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.623862 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="31c490be-9979-4c2c-b49a-191985508c24" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.624782 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.631025 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.631281 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.631400 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.632201 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.636253 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr"]
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.706414 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b700040-447e-4328-a5ac-f2a5ee4fc818-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr\" (UID: \"6b700040-447e-4328-a5ac-f2a5ee4fc818\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.706566 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b700040-447e-4328-a5ac-f2a5ee4fc818-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr\" (UID: \"6b700040-447e-4328-a5ac-f2a5ee4fc818\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.706623 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n69tx\" (UniqueName: \"kubernetes.io/projected/6b700040-447e-4328-a5ac-f2a5ee4fc818-kube-api-access-n69tx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr\" (UID: \"6b700040-447e-4328-a5ac-f2a5ee4fc818\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.808124 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b700040-447e-4328-a5ac-f2a5ee4fc818-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr\" (UID: \"6b700040-447e-4328-a5ac-f2a5ee4fc818\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.808202 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n69tx\" (UniqueName: \"kubernetes.io/projected/6b700040-447e-4328-a5ac-f2a5ee4fc818-kube-api-access-n69tx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr\" (UID: \"6b700040-447e-4328-a5ac-f2a5ee4fc818\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.808263 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b700040-447e-4328-a5ac-f2a5ee4fc818-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr\" (UID: \"6b700040-447e-4328-a5ac-f2a5ee4fc818\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.813132 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b700040-447e-4328-a5ac-f2a5ee4fc818-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr\" (UID: \"6b700040-447e-4328-a5ac-f2a5ee4fc818\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.817120 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b700040-447e-4328-a5ac-f2a5ee4fc818-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr\" (UID: \"6b700040-447e-4328-a5ac-f2a5ee4fc818\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.828109 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n69tx\" (UniqueName: \"kubernetes.io/projected/6b700040-447e-4328-a5ac-f2a5ee4fc818-kube-api-access-n69tx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr\" (UID: \"6b700040-447e-4328-a5ac-f2a5ee4fc818\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr"
Mar 18 12:44:22 crc kubenswrapper[4843]: I0318 12:44:22.943993 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr"
Mar 18 12:44:23 crc kubenswrapper[4843]: I0318 12:44:23.859851 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr"]
Mar 18 12:44:24 crc kubenswrapper[4843]: I0318 12:44:24.558715 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr" event={"ID":"6b700040-447e-4328-a5ac-f2a5ee4fc818","Type":"ContainerStarted","Data":"6acb365435b3c075ab186b6774d7340ecf7c5fa1c81a09c72db7f98767e91853"}
Mar 18 12:44:24 crc kubenswrapper[4843]: I0318 12:44:24.559027 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr" event={"ID":"6b700040-447e-4328-a5ac-f2a5ee4fc818","Type":"ContainerStarted","Data":"4e8151b64e31bd25063ce84d602782ade937a8738337f6025f02a4abf64c57a5"}
Mar 18 12:44:24 crc kubenswrapper[4843]: I0318 12:44:24.588164 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr" podStartSLOduration=2.4416965299999998 podStartE2EDuration="2.588137462s" podCreationTimestamp="2026-03-18 12:44:22 +0000 UTC" firstStartedPulling="2026-03-18 12:44:23.864816054 +0000 UTC m=+2097.580641578" lastFinishedPulling="2026-03-18 12:44:24.011256986 +0000 UTC m=+2097.727082510" observedRunningTime="2026-03-18 12:44:24.578000385 +0000 UTC m=+2098.293825909" watchObservedRunningTime="2026-03-18 12:44:24.588137462 +0000 UTC m=+2098.303962996"
Mar 18 12:44:50 crc kubenswrapper[4843]: I0318 12:44:50.034516 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:44:50 crc kubenswrapper[4843]: I0318 12:44:50.035078 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:45:00 crc kubenswrapper[4843]: I0318 12:45:00.160820 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k"]
Mar 18 12:45:00 crc kubenswrapper[4843]: I0318 12:45:00.163098 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k"
Mar 18 12:45:00 crc kubenswrapper[4843]: I0318 12:45:00.168807 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 12:45:00 crc kubenswrapper[4843]: I0318 12:45:00.168991 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 12:45:00 crc kubenswrapper[4843]: I0318 12:45:00.172732 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k"]
Mar 18 12:45:00 crc kubenswrapper[4843]: I0318 12:45:00.282538 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51f70510-4798-4903-89e6-853498d3e42d-secret-volume\") pod \"collect-profiles-29563965-2mv2k\" (UID: \"51f70510-4798-4903-89e6-853498d3e42d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k"
Mar 18 12:45:00 crc kubenswrapper[4843]: I0318 12:45:00.282610 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nctlx\" (UniqueName: \"kubernetes.io/projected/51f70510-4798-4903-89e6-853498d3e42d-kube-api-access-nctlx\") pod \"collect-profiles-29563965-2mv2k\" (UID: \"51f70510-4798-4903-89e6-853498d3e42d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k"
Mar 18 12:45:00 crc kubenswrapper[4843]: I0318 12:45:00.282701 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51f70510-4798-4903-89e6-853498d3e42d-config-volume\") pod \"collect-profiles-29563965-2mv2k\" (UID: \"51f70510-4798-4903-89e6-853498d3e42d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k"
Mar 18 12:45:00 crc kubenswrapper[4843]: I0318 12:45:00.386247 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51f70510-4798-4903-89e6-853498d3e42d-config-volume\") pod \"collect-profiles-29563965-2mv2k\" (UID: \"51f70510-4798-4903-89e6-853498d3e42d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k"
Mar 18 12:45:00 crc kubenswrapper[4843]: I0318 12:45:00.386505 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51f70510-4798-4903-89e6-853498d3e42d-secret-volume\") pod \"collect-profiles-29563965-2mv2k\" (UID: \"51f70510-4798-4903-89e6-853498d3e42d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k"
Mar 18 12:45:00 crc kubenswrapper[4843]: I0318 12:45:00.386548 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nctlx\" (UniqueName: \"kubernetes.io/projected/51f70510-4798-4903-89e6-853498d3e42d-kube-api-access-nctlx\") pod \"collect-profiles-29563965-2mv2k\" (UID: \"51f70510-4798-4903-89e6-853498d3e42d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k"
Mar 18 12:45:00 crc kubenswrapper[4843]: I0318 12:45:00.387342 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51f70510-4798-4903-89e6-853498d3e42d-config-volume\") pod \"collect-profiles-29563965-2mv2k\" (UID: \"51f70510-4798-4903-89e6-853498d3e42d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k"
Mar 18 12:45:00 crc kubenswrapper[4843]: I0318 12:45:00.403773 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51f70510-4798-4903-89e6-853498d3e42d-secret-volume\") pod \"collect-profiles-29563965-2mv2k\" (UID: \"51f70510-4798-4903-89e6-853498d3e42d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k"
Mar 18 12:45:00 crc kubenswrapper[4843]: I0318 12:45:00.406877 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nctlx\" (UniqueName: \"kubernetes.io/projected/51f70510-4798-4903-89e6-853498d3e42d-kube-api-access-nctlx\") pod \"collect-profiles-29563965-2mv2k\" (UID: \"51f70510-4798-4903-89e6-853498d3e42d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k"
Mar 18 12:45:00 crc kubenswrapper[4843]: I0318 12:45:00.485837 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k"
Mar 18 12:45:00 crc kubenswrapper[4843]: I0318 12:45:00.961955 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k"]
Mar 18 12:45:01 crc kubenswrapper[4843]: I0318 12:45:01.963078 4843 generic.go:334] "Generic (PLEG): container finished" podID="51f70510-4798-4903-89e6-853498d3e42d" containerID="7d5ec4e7f16d9bbb4470d6272da4f4b7db0b9ad0f164d8b2efa526527e5429a9" exitCode=0
Mar 18 12:45:01 crc kubenswrapper[4843]: I0318 12:45:01.963159 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k" event={"ID":"51f70510-4798-4903-89e6-853498d3e42d","Type":"ContainerDied","Data":"7d5ec4e7f16d9bbb4470d6272da4f4b7db0b9ad0f164d8b2efa526527e5429a9"}
Mar 18 12:45:01 crc kubenswrapper[4843]: I0318 12:45:01.963367 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k" event={"ID":"51f70510-4798-4903-89e6-853498d3e42d","Type":"ContainerStarted","Data":"85ceb71f28c3030d4ad686943f2b8df3d77c4b92a6841ba06566b5da8921bc66"}
Mar 18 12:45:03 crc kubenswrapper[4843]: I0318 12:45:03.407191 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k"
Mar 18 12:45:03 crc kubenswrapper[4843]: I0318 12:45:03.553548 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nctlx\" (UniqueName: \"kubernetes.io/projected/51f70510-4798-4903-89e6-853498d3e42d-kube-api-access-nctlx\") pod \"51f70510-4798-4903-89e6-853498d3e42d\" (UID: \"51f70510-4798-4903-89e6-853498d3e42d\") "
Mar 18 12:45:03 crc kubenswrapper[4843]: I0318 12:45:03.553644 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51f70510-4798-4903-89e6-853498d3e42d-config-volume\") pod \"51f70510-4798-4903-89e6-853498d3e42d\" (UID: \"51f70510-4798-4903-89e6-853498d3e42d\") "
Mar 18 12:45:03 crc kubenswrapper[4843]: I0318 12:45:03.553717 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51f70510-4798-4903-89e6-853498d3e42d-secret-volume\") pod \"51f70510-4798-4903-89e6-853498d3e42d\" (UID: \"51f70510-4798-4903-89e6-853498d3e42d\") "
Mar 18 12:45:03 crc kubenswrapper[4843]: I0318 12:45:03.554266 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51f70510-4798-4903-89e6-853498d3e42d-config-volume" (OuterVolumeSpecName: "config-volume") pod "51f70510-4798-4903-89e6-853498d3e42d" (UID: "51f70510-4798-4903-89e6-853498d3e42d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:45:03 crc kubenswrapper[4843]: I0318 12:45:03.559159 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51f70510-4798-4903-89e6-853498d3e42d-kube-api-access-nctlx" (OuterVolumeSpecName: "kube-api-access-nctlx") pod "51f70510-4798-4903-89e6-853498d3e42d" (UID: "51f70510-4798-4903-89e6-853498d3e42d"). InnerVolumeSpecName "kube-api-access-nctlx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:45:03 crc kubenswrapper[4843]: I0318 12:45:03.560222 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f70510-4798-4903-89e6-853498d3e42d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "51f70510-4798-4903-89e6-853498d3e42d" (UID: "51f70510-4798-4903-89e6-853498d3e42d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:45:03 crc kubenswrapper[4843]: I0318 12:45:03.656642 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nctlx\" (UniqueName: \"kubernetes.io/projected/51f70510-4798-4903-89e6-853498d3e42d-kube-api-access-nctlx\") on node \"crc\" DevicePath \"\""
Mar 18 12:45:03 crc kubenswrapper[4843]: I0318 12:45:03.656722 4843 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51f70510-4798-4903-89e6-853498d3e42d-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 12:45:03 crc kubenswrapper[4843]: I0318 12:45:03.656734 4843 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51f70510-4798-4903-89e6-853498d3e42d-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 12:45:03 crc kubenswrapper[4843]: I0318 12:45:03.983592 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k" event={"ID":"51f70510-4798-4903-89e6-853498d3e42d","Type":"ContainerDied","Data":"85ceb71f28c3030d4ad686943f2b8df3d77c4b92a6841ba06566b5da8921bc66"}
Mar 18 12:45:03 crc kubenswrapper[4843]: I0318 12:45:03.983633 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85ceb71f28c3030d4ad686943f2b8df3d77c4b92a6841ba06566b5da8921bc66"
Mar 18 12:45:03 crc kubenswrapper[4843]: I0318 12:45:03.983719 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k"
Mar 18 12:45:04 crc kubenswrapper[4843]: I0318 12:45:04.047805 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-7sn7w"]
Mar 18 12:45:04 crc kubenswrapper[4843]: I0318 12:45:04.056741 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-7sn7w"]
Mar 18 12:45:04 crc kubenswrapper[4843]: I0318 12:45:04.261250 4843 scope.go:117] "RemoveContainer" containerID="c9c983ab1d9a77b9028379369f22cbaef829f1c68ec2079dc8812ee4b0840a74"
Mar 18 12:45:04 crc kubenswrapper[4843]: I0318 12:45:04.306551 4843 scope.go:117] "RemoveContainer" containerID="a298a077e860eb61e48e3f862d74c7029a9d42aac32c62c5faff068cf825f077"
Mar 18 12:45:04 crc kubenswrapper[4843]: I0318 12:45:04.479083 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx"]
Mar 18 12:45:04 crc kubenswrapper[4843]: I0318 12:45:04.487236 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563920-nnwnx"]
Mar 18 12:45:05 crc kubenswrapper[4843]: I0318 12:45:05.008781 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1442992-fe43-43be-a43b-48f80db66418" path="/var/lib/kubelet/pods/b1442992-fe43-43be-a43b-48f80db66418/volumes"
Mar 18 12:45:05 crc kubenswrapper[4843]: I0318 12:45:05.015422 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d981e7da-a5a9-4d42-94f4-f78aefb2a660" path="/var/lib/kubelet/pods/d981e7da-a5a9-4d42-94f4-f78aefb2a660/volumes"
Mar 18 12:45:05 crc kubenswrapper[4843]: I0318 12:45:05.058006 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rkttr"]
Mar 18 12:45:05 crc kubenswrapper[4843]: I0318 12:45:05.071350 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rkttr"]
Mar 18 12:45:07 crc kubenswrapper[4843]: I0318 12:45:07.019427 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f78c4600-3dac-4168-9aa0-a1be34986ba5" path="/var/lib/kubelet/pods/f78c4600-3dac-4168-9aa0-a1be34986ba5/volumes"
Mar 18 12:45:13 crc kubenswrapper[4843]: I0318 12:45:13.076540 4843 generic.go:334] "Generic (PLEG): container finished" podID="6b700040-447e-4328-a5ac-f2a5ee4fc818" containerID="6acb365435b3c075ab186b6774d7340ecf7c5fa1c81a09c72db7f98767e91853" exitCode=0
Mar 18 12:45:13 crc kubenswrapper[4843]: I0318 12:45:13.076614 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr" event={"ID":"6b700040-447e-4328-a5ac-f2a5ee4fc818","Type":"ContainerDied","Data":"6acb365435b3c075ab186b6774d7340ecf7c5fa1c81a09c72db7f98767e91853"}
Mar 18 12:45:14 crc kubenswrapper[4843]: I0318 12:45:14.488102 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr"
Mar 18 12:45:14 crc kubenswrapper[4843]: I0318 12:45:14.585323 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n69tx\" (UniqueName: \"kubernetes.io/projected/6b700040-447e-4328-a5ac-f2a5ee4fc818-kube-api-access-n69tx\") pod \"6b700040-447e-4328-a5ac-f2a5ee4fc818\" (UID: \"6b700040-447e-4328-a5ac-f2a5ee4fc818\") "
Mar 18 12:45:14 crc kubenswrapper[4843]: I0318 12:45:14.585483 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b700040-447e-4328-a5ac-f2a5ee4fc818-inventory\") pod \"6b700040-447e-4328-a5ac-f2a5ee4fc818\" (UID: \"6b700040-447e-4328-a5ac-f2a5ee4fc818\") "
Mar 18 12:45:14 crc kubenswrapper[4843]: I0318 12:45:14.585630 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b700040-447e-4328-a5ac-f2a5ee4fc818-ssh-key-openstack-edpm-ipam\") pod \"6b700040-447e-4328-a5ac-f2a5ee4fc818\" (UID: \"6b700040-447e-4328-a5ac-f2a5ee4fc818\") "
Mar 18 12:45:14 crc kubenswrapper[4843]: I0318 12:45:14.592937 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b700040-447e-4328-a5ac-f2a5ee4fc818-kube-api-access-n69tx" (OuterVolumeSpecName: "kube-api-access-n69tx") pod "6b700040-447e-4328-a5ac-f2a5ee4fc818" (UID: "6b700040-447e-4328-a5ac-f2a5ee4fc818"). InnerVolumeSpecName "kube-api-access-n69tx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:45:14 crc kubenswrapper[4843]: I0318 12:45:14.620067 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b700040-447e-4328-a5ac-f2a5ee4fc818-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6b700040-447e-4328-a5ac-f2a5ee4fc818" (UID: "6b700040-447e-4328-a5ac-f2a5ee4fc818"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:45:14 crc kubenswrapper[4843]: I0318 12:45:14.629463 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b700040-447e-4328-a5ac-f2a5ee4fc818-inventory" (OuterVolumeSpecName: "inventory") pod "6b700040-447e-4328-a5ac-f2a5ee4fc818" (UID: "6b700040-447e-4328-a5ac-f2a5ee4fc818"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:45:14 crc kubenswrapper[4843]: I0318 12:45:14.687914 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b700040-447e-4328-a5ac-f2a5ee4fc818-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 12:45:14 crc kubenswrapper[4843]: I0318 12:45:14.687948 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n69tx\" (UniqueName: \"kubernetes.io/projected/6b700040-447e-4328-a5ac-f2a5ee4fc818-kube-api-access-n69tx\") on node \"crc\" DevicePath \"\""
Mar 18 12:45:14 crc kubenswrapper[4843]: I0318 12:45:14.687959 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b700040-447e-4328-a5ac-f2a5ee4fc818-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.095402 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr" event={"ID":"6b700040-447e-4328-a5ac-f2a5ee4fc818","Type":"ContainerDied","Data":"4e8151b64e31bd25063ce84d602782ade937a8738337f6025f02a4abf64c57a5"}
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.095447 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e8151b64e31bd25063ce84d602782ade937a8738337f6025f02a4abf64c57a5"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.095503 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.186627 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-mxj5b"]
Mar 18 12:45:15 crc kubenswrapper[4843]: E0318 12:45:15.187281 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f70510-4798-4903-89e6-853498d3e42d" containerName="collect-profiles"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.187313 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f70510-4798-4903-89e6-853498d3e42d" containerName="collect-profiles"
Mar 18 12:45:15 crc kubenswrapper[4843]: E0318 12:45:15.187367 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b700040-447e-4328-a5ac-f2a5ee4fc818" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.187378 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b700040-447e-4328-a5ac-f2a5ee4fc818" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.187701 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b700040-447e-4328-a5ac-f2a5ee4fc818" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.187723 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="51f70510-4798-4903-89e6-853498d3e42d" containerName="collect-profiles"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.189143 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mxj5b"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.192499 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.192942 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.193200 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.193305 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.198347 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-mxj5b"]
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.307274 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcv9w\" (UniqueName: \"kubernetes.io/projected/f545b271-e719-4624-9fdf-c8e26500c8ab-kube-api-access-wcv9w\") pod \"ssh-known-hosts-edpm-deployment-mxj5b\" (UID: \"f545b271-e719-4624-9fdf-c8e26500c8ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-mxj5b"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.307409 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f545b271-e719-4624-9fdf-c8e26500c8ab-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-mxj5b\" (UID: \"f545b271-e719-4624-9fdf-c8e26500c8ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-mxj5b"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.307476 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f545b271-e719-4624-9fdf-c8e26500c8ab-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-mxj5b\" (UID: \"f545b271-e719-4624-9fdf-c8e26500c8ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-mxj5b"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.409791 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcv9w\" (UniqueName: \"kubernetes.io/projected/f545b271-e719-4624-9fdf-c8e26500c8ab-kube-api-access-wcv9w\") pod \"ssh-known-hosts-edpm-deployment-mxj5b\" (UID: \"f545b271-e719-4624-9fdf-c8e26500c8ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-mxj5b"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.410184 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f545b271-e719-4624-9fdf-c8e26500c8ab-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-mxj5b\" (UID: \"f545b271-e719-4624-9fdf-c8e26500c8ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-mxj5b"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.410256 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f545b271-e719-4624-9fdf-c8e26500c8ab-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-mxj5b\" (UID: \"f545b271-e719-4624-9fdf-c8e26500c8ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-mxj5b"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.415268 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f545b271-e719-4624-9fdf-c8e26500c8ab-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-mxj5b\" (UID: \"f545b271-e719-4624-9fdf-c8e26500c8ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-mxj5b"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.415982 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f545b271-e719-4624-9fdf-c8e26500c8ab-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-mxj5b\" (UID: \"f545b271-e719-4624-9fdf-c8e26500c8ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-mxj5b"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.431129 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcv9w\" (UniqueName: \"kubernetes.io/projected/f545b271-e719-4624-9fdf-c8e26500c8ab-kube-api-access-wcv9w\") pod \"ssh-known-hosts-edpm-deployment-mxj5b\" (UID: \"f545b271-e719-4624-9fdf-c8e26500c8ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-mxj5b"
Mar 18 12:45:15 crc kubenswrapper[4843]: I0318 12:45:15.513134 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mxj5b"
Mar 18 12:45:16 crc kubenswrapper[4843]: I0318 12:45:16.113132 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-mxj5b"]
Mar 18 12:45:17 crc kubenswrapper[4843]: I0318 12:45:17.113682 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mxj5b" event={"ID":"f545b271-e719-4624-9fdf-c8e26500c8ab","Type":"ContainerStarted","Data":"6157528c5a0709b4e593f6f32c45de573444188ff5a417c49c64150b433e5e48"}
Mar 18 12:45:19 crc kubenswrapper[4843]: I0318 12:45:19.133925 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mxj5b" event={"ID":"f545b271-e719-4624-9fdf-c8e26500c8ab","Type":"ContainerStarted","Data":"c5039acbd7af0004e1687a9c823b6cf3b52ccf9f2bfbf18622fe347d83093b20"}
Mar 18 12:45:19 crc kubenswrapper[4843]: I0318 12:45:19.165159 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-mxj5b" podStartSLOduration=2.030489273 podStartE2EDuration="4.164637423s" podCreationTimestamp="2026-03-18 12:45:15 +0000 UTC" firstStartedPulling="2026-03-18 12:45:16.114996966 +0000 UTC m=+2149.830822500" lastFinishedPulling="2026-03-18 12:45:18.249145126 +0000 UTC m=+2151.964970650" observedRunningTime="2026-03-18 12:45:19.158284303 +0000 UTC m=+2152.874109867" watchObservedRunningTime="2026-03-18 12:45:19.164637423 +0000 UTC m=+2152.880462947"
Mar 18 12:45:20 crc kubenswrapper[4843]: I0318 12:45:20.035333 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:45:20 crc kubenswrapper[4843]: I0318 12:45:20.035410 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:45:25 crc kubenswrapper[4843]: I0318 12:45:25.190727 4843 generic.go:334] "Generic (PLEG): container finished" podID="f545b271-e719-4624-9fdf-c8e26500c8ab" containerID="c5039acbd7af0004e1687a9c823b6cf3b52ccf9f2bfbf18622fe347d83093b20" exitCode=0
Mar 18 12:45:25 crc kubenswrapper[4843]: I0318 12:45:25.190912 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mxj5b" event={"ID":"f545b271-e719-4624-9fdf-c8e26500c8ab","Type":"ContainerDied","Data":"c5039acbd7af0004e1687a9c823b6cf3b52ccf9f2bfbf18622fe347d83093b20"}
Mar 18 12:45:26 crc kubenswrapper[4843]: I0318 12:45:26.624161 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mxj5b"
Mar 18 12:45:26 crc kubenswrapper[4843]: I0318 12:45:26.728722 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f545b271-e719-4624-9fdf-c8e26500c8ab-inventory-0\") pod \"f545b271-e719-4624-9fdf-c8e26500c8ab\" (UID: \"f545b271-e719-4624-9fdf-c8e26500c8ab\") "
Mar 18 12:45:26 crc kubenswrapper[4843]: I0318 12:45:26.728973 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcv9w\" (UniqueName: \"kubernetes.io/projected/f545b271-e719-4624-9fdf-c8e26500c8ab-kube-api-access-wcv9w\") pod \"f545b271-e719-4624-9fdf-c8e26500c8ab\" (UID: \"f545b271-e719-4624-9fdf-c8e26500c8ab\") "
Mar 18 12:45:26 crc kubenswrapper[4843]: I0318 12:45:26.729195 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f545b271-e719-4624-9fdf-c8e26500c8ab-ssh-key-openstack-edpm-ipam\") pod \"f545b271-e719-4624-9fdf-c8e26500c8ab\" (UID: \"f545b271-e719-4624-9fdf-c8e26500c8ab\") "
Mar 18 12:45:26 crc kubenswrapper[4843]: I0318 12:45:26.735000 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f545b271-e719-4624-9fdf-c8e26500c8ab-kube-api-access-wcv9w" (OuterVolumeSpecName: "kube-api-access-wcv9w") pod "f545b271-e719-4624-9fdf-c8e26500c8ab" (UID: "f545b271-e719-4624-9fdf-c8e26500c8ab"). InnerVolumeSpecName "kube-api-access-wcv9w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:45:26 crc kubenswrapper[4843]: I0318 12:45:26.755466 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f545b271-e719-4624-9fdf-c8e26500c8ab-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f545b271-e719-4624-9fdf-c8e26500c8ab" (UID: "f545b271-e719-4624-9fdf-c8e26500c8ab"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:45:26 crc kubenswrapper[4843]: I0318 12:45:26.762686 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f545b271-e719-4624-9fdf-c8e26500c8ab-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "f545b271-e719-4624-9fdf-c8e26500c8ab" (UID: "f545b271-e719-4624-9fdf-c8e26500c8ab"). InnerVolumeSpecName "inventory-0".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:45:26 crc kubenswrapper[4843]: I0318 12:45:26.831591 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f545b271-e719-4624-9fdf-c8e26500c8ab-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:26 crc kubenswrapper[4843]: I0318 12:45:26.831624 4843 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f545b271-e719-4624-9fdf-c8e26500c8ab-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:26 crc kubenswrapper[4843]: I0318 12:45:26.831633 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcv9w\" (UniqueName: \"kubernetes.io/projected/f545b271-e719-4624-9fdf-c8e26500c8ab-kube-api-access-wcv9w\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.220131 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mxj5b" event={"ID":"f545b271-e719-4624-9fdf-c8e26500c8ab","Type":"ContainerDied","Data":"6157528c5a0709b4e593f6f32c45de573444188ff5a417c49c64150b433e5e48"} Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.220694 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6157528c5a0709b4e593f6f32c45de573444188ff5a417c49c64150b433e5e48" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.220889 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mxj5b" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.306237 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn"] Mar 18 12:45:27 crc kubenswrapper[4843]: E0318 12:45:27.306618 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f545b271-e719-4624-9fdf-c8e26500c8ab" containerName="ssh-known-hosts-edpm-deployment" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.306635 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f545b271-e719-4624-9fdf-c8e26500c8ab" containerName="ssh-known-hosts-edpm-deployment" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.306837 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f545b271-e719-4624-9fdf-c8e26500c8ab" containerName="ssh-known-hosts-edpm-deployment" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.307464 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.316196 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.316339 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.316401 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.319715 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn"] Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.319963 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.340432 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch769\" (UniqueName: \"kubernetes.io/projected/c769322f-958a-43ca-b4f6-7379465f6276-kube-api-access-ch769\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kwfgn\" (UID: \"c769322f-958a-43ca-b4f6-7379465f6276\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.340506 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c769322f-958a-43ca-b4f6-7379465f6276-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kwfgn\" (UID: \"c769322f-958a-43ca-b4f6-7379465f6276\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.340544 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c769322f-958a-43ca-b4f6-7379465f6276-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kwfgn\" (UID: \"c769322f-958a-43ca-b4f6-7379465f6276\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.442312 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch769\" (UniqueName: \"kubernetes.io/projected/c769322f-958a-43ca-b4f6-7379465f6276-kube-api-access-ch769\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kwfgn\" (UID: \"c769322f-958a-43ca-b4f6-7379465f6276\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.442389 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c769322f-958a-43ca-b4f6-7379465f6276-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kwfgn\" (UID: \"c769322f-958a-43ca-b4f6-7379465f6276\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.442432 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c769322f-958a-43ca-b4f6-7379465f6276-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kwfgn\" (UID: \"c769322f-958a-43ca-b4f6-7379465f6276\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.449167 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c769322f-958a-43ca-b4f6-7379465f6276-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-kwfgn\" (UID: \"c769322f-958a-43ca-b4f6-7379465f6276\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.449552 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c769322f-958a-43ca-b4f6-7379465f6276-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kwfgn\" (UID: \"c769322f-958a-43ca-b4f6-7379465f6276\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.461497 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch769\" (UniqueName: \"kubernetes.io/projected/c769322f-958a-43ca-b4f6-7379465f6276-kube-api-access-ch769\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kwfgn\" (UID: \"c769322f-958a-43ca-b4f6-7379465f6276\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn" Mar 18 12:45:27 crc kubenswrapper[4843]: I0318 12:45:27.625701 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn" Mar 18 12:45:28 crc kubenswrapper[4843]: I0318 12:45:28.206795 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn"] Mar 18 12:45:28 crc kubenswrapper[4843]: I0318 12:45:28.227493 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn" event={"ID":"c769322f-958a-43ca-b4f6-7379465f6276","Type":"ContainerStarted","Data":"8e4beddd3013bc07cc1e50acff87fa84517e516cc33392136c246d241ff8c9e0"} Mar 18 12:45:30 crc kubenswrapper[4843]: I0318 12:45:30.255156 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn" event={"ID":"c769322f-958a-43ca-b4f6-7379465f6276","Type":"ContainerStarted","Data":"ff3dae50dec4fc0d46b932029a87a46132044668f71baf8f1da90465dc1c1ae5"} Mar 18 12:45:30 crc kubenswrapper[4843]: I0318 12:45:30.294135 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn" podStartSLOduration=2.424815658 podStartE2EDuration="3.294104665s" podCreationTimestamp="2026-03-18 12:45:27 +0000 UTC" firstStartedPulling="2026-03-18 12:45:28.216321224 +0000 UTC m=+2161.932146738" lastFinishedPulling="2026-03-18 12:45:29.085610221 +0000 UTC m=+2162.801435745" observedRunningTime="2026-03-18 12:45:30.274735036 +0000 UTC m=+2163.990560570" watchObservedRunningTime="2026-03-18 12:45:30.294104665 +0000 UTC m=+2164.009930219" Mar 18 12:45:37 crc kubenswrapper[4843]: I0318 12:45:37.340725 4843 generic.go:334] "Generic (PLEG): container finished" podID="c769322f-958a-43ca-b4f6-7379465f6276" containerID="ff3dae50dec4fc0d46b932029a87a46132044668f71baf8f1da90465dc1c1ae5" exitCode=0 Mar 18 12:45:37 crc kubenswrapper[4843]: I0318 12:45:37.340794 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn" event={"ID":"c769322f-958a-43ca-b4f6-7379465f6276","Type":"ContainerDied","Data":"ff3dae50dec4fc0d46b932029a87a46132044668f71baf8f1da90465dc1c1ae5"} Mar 18 12:45:38 crc kubenswrapper[4843]: I0318 12:45:38.766290 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn" Mar 18 12:45:38 crc kubenswrapper[4843]: I0318 12:45:38.857152 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch769\" (UniqueName: \"kubernetes.io/projected/c769322f-958a-43ca-b4f6-7379465f6276-kube-api-access-ch769\") pod \"c769322f-958a-43ca-b4f6-7379465f6276\" (UID: \"c769322f-958a-43ca-b4f6-7379465f6276\") " Mar 18 12:45:38 crc kubenswrapper[4843]: I0318 12:45:38.857327 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c769322f-958a-43ca-b4f6-7379465f6276-ssh-key-openstack-edpm-ipam\") pod \"c769322f-958a-43ca-b4f6-7379465f6276\" (UID: \"c769322f-958a-43ca-b4f6-7379465f6276\") " Mar 18 12:45:38 crc kubenswrapper[4843]: I0318 12:45:38.857376 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c769322f-958a-43ca-b4f6-7379465f6276-inventory\") pod \"c769322f-958a-43ca-b4f6-7379465f6276\" (UID: \"c769322f-958a-43ca-b4f6-7379465f6276\") " Mar 18 12:45:38 crc kubenswrapper[4843]: I0318 12:45:38.863181 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c769322f-958a-43ca-b4f6-7379465f6276-kube-api-access-ch769" (OuterVolumeSpecName: "kube-api-access-ch769") pod "c769322f-958a-43ca-b4f6-7379465f6276" (UID: "c769322f-958a-43ca-b4f6-7379465f6276"). InnerVolumeSpecName "kube-api-access-ch769". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:45:38 crc kubenswrapper[4843]: I0318 12:45:38.884305 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c769322f-958a-43ca-b4f6-7379465f6276-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c769322f-958a-43ca-b4f6-7379465f6276" (UID: "c769322f-958a-43ca-b4f6-7379465f6276"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:45:38 crc kubenswrapper[4843]: I0318 12:45:38.886155 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c769322f-958a-43ca-b4f6-7379465f6276-inventory" (OuterVolumeSpecName: "inventory") pod "c769322f-958a-43ca-b4f6-7379465f6276" (UID: "c769322f-958a-43ca-b4f6-7379465f6276"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:45:38 crc kubenswrapper[4843]: I0318 12:45:38.960529 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch769\" (UniqueName: \"kubernetes.io/projected/c769322f-958a-43ca-b4f6-7379465f6276-kube-api-access-ch769\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:38 crc kubenswrapper[4843]: I0318 12:45:38.960570 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c769322f-958a-43ca-b4f6-7379465f6276-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:38 crc kubenswrapper[4843]: I0318 12:45:38.960585 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c769322f-958a-43ca-b4f6-7379465f6276-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.368517 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn" 
event={"ID":"c769322f-958a-43ca-b4f6-7379465f6276","Type":"ContainerDied","Data":"8e4beddd3013bc07cc1e50acff87fa84517e516cc33392136c246d241ff8c9e0"} Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.368955 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e4beddd3013bc07cc1e50acff87fa84517e516cc33392136c246d241ff8c9e0" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.368557 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kwfgn" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.489570 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj"] Mar 18 12:45:39 crc kubenswrapper[4843]: E0318 12:45:39.490274 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c769322f-958a-43ca-b4f6-7379465f6276" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.490297 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="c769322f-958a-43ca-b4f6-7379465f6276" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.490537 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="c769322f-958a-43ca-b4f6-7379465f6276" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.491637 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.494347 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.494460 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.494950 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.495938 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.502593 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj"] Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.572786 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6a80f3a2-c501-4f2f-8304-41fd472da368-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj\" (UID: \"6a80f3a2-c501-4f2f-8304-41fd472da368\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.572940 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a80f3a2-c501-4f2f-8304-41fd472da368-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj\" (UID: \"6a80f3a2-c501-4f2f-8304-41fd472da368\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.573015 4843 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fll7g\" (UniqueName: \"kubernetes.io/projected/6a80f3a2-c501-4f2f-8304-41fd472da368-kube-api-access-fll7g\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj\" (UID: \"6a80f3a2-c501-4f2f-8304-41fd472da368\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.675906 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6a80f3a2-c501-4f2f-8304-41fd472da368-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj\" (UID: \"6a80f3a2-c501-4f2f-8304-41fd472da368\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.676000 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a80f3a2-c501-4f2f-8304-41fd472da368-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj\" (UID: \"6a80f3a2-c501-4f2f-8304-41fd472da368\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.676031 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fll7g\" (UniqueName: \"kubernetes.io/projected/6a80f3a2-c501-4f2f-8304-41fd472da368-kube-api-access-fll7g\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj\" (UID: \"6a80f3a2-c501-4f2f-8304-41fd472da368\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.686259 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/6a80f3a2-c501-4f2f-8304-41fd472da368-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj\" (UID: \"6a80f3a2-c501-4f2f-8304-41fd472da368\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.692604 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a80f3a2-c501-4f2f-8304-41fd472da368-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj\" (UID: \"6a80f3a2-c501-4f2f-8304-41fd472da368\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.693443 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fll7g\" (UniqueName: \"kubernetes.io/projected/6a80f3a2-c501-4f2f-8304-41fd472da368-kube-api-access-fll7g\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj\" (UID: \"6a80f3a2-c501-4f2f-8304-41fd472da368\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj" Mar 18 12:45:39 crc kubenswrapper[4843]: I0318 12:45:39.815343 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj" Mar 18 12:45:40 crc kubenswrapper[4843]: I0318 12:45:40.223231 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj"] Mar 18 12:45:40 crc kubenswrapper[4843]: I0318 12:45:40.380272 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj" event={"ID":"6a80f3a2-c501-4f2f-8304-41fd472da368","Type":"ContainerStarted","Data":"7bd0c11ba600d4e43ced53eb8a7917c249d9cd230dfd0db4a92ca4b237d9c86e"} Mar 18 12:45:41 crc kubenswrapper[4843]: I0318 12:45:41.392154 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj" event={"ID":"6a80f3a2-c501-4f2f-8304-41fd472da368","Type":"ContainerStarted","Data":"deb109b0b0f26e7f81330e4215e2bc6a613b0c0e307b91c65f2a18b4d5417b82"} Mar 18 12:45:41 crc kubenswrapper[4843]: I0318 12:45:41.416793 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj" podStartSLOduration=2.233823315 podStartE2EDuration="2.416771432s" podCreationTimestamp="2026-03-18 12:45:39 +0000 UTC" firstStartedPulling="2026-03-18 12:45:40.226578708 +0000 UTC m=+2173.942404242" lastFinishedPulling="2026-03-18 12:45:40.409526825 +0000 UTC m=+2174.125352359" observedRunningTime="2026-03-18 12:45:41.413311514 +0000 UTC m=+2175.129137038" watchObservedRunningTime="2026-03-18 12:45:41.416771432 +0000 UTC m=+2175.132596956" Mar 18 12:45:48 crc kubenswrapper[4843]: I0318 12:45:48.044923 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-pt4mg"] Mar 18 12:45:48 crc kubenswrapper[4843]: I0318 12:45:48.055330 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-pt4mg"] Mar 18 12:45:48 crc kubenswrapper[4843]: I0318 12:45:48.995070 
4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7187efa4-74d7-4162-89d7-5b9368c1924e" path="/var/lib/kubelet/pods/7187efa4-74d7-4162-89d7-5b9368c1924e/volumes" Mar 18 12:45:49 crc kubenswrapper[4843]: I0318 12:45:49.470564 4843 generic.go:334] "Generic (PLEG): container finished" podID="6a80f3a2-c501-4f2f-8304-41fd472da368" containerID="deb109b0b0f26e7f81330e4215e2bc6a613b0c0e307b91c65f2a18b4d5417b82" exitCode=0 Mar 18 12:45:49 crc kubenswrapper[4843]: I0318 12:45:49.470606 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj" event={"ID":"6a80f3a2-c501-4f2f-8304-41fd472da368","Type":"ContainerDied","Data":"deb109b0b0f26e7f81330e4215e2bc6a613b0c0e307b91c65f2a18b4d5417b82"} Mar 18 12:45:50 crc kubenswrapper[4843]: I0318 12:45:50.035292 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:45:50 crc kubenswrapper[4843]: I0318 12:45:50.035345 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:45:50 crc kubenswrapper[4843]: I0318 12:45:50.035386 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:45:50 crc kubenswrapper[4843]: I0318 12:45:50.036177 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"adf4e676a33d49fbbd7d287fa0d1930ae6c4bf2cf91763487c46260a988c52c0"} pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:45:50 crc kubenswrapper[4843]: I0318 12:45:50.036250 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" containerID="cri-o://adf4e676a33d49fbbd7d287fa0d1930ae6c4bf2cf91763487c46260a988c52c0" gracePeriod=600 Mar 18 12:45:50 crc kubenswrapper[4843]: I0318 12:45:50.485931 4843 generic.go:334] "Generic (PLEG): container finished" podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="adf4e676a33d49fbbd7d287fa0d1930ae6c4bf2cf91763487c46260a988c52c0" exitCode=0 Mar 18 12:45:50 crc kubenswrapper[4843]: I0318 12:45:50.485996 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"adf4e676a33d49fbbd7d287fa0d1930ae6c4bf2cf91763487c46260a988c52c0"} Mar 18 12:45:50 crc kubenswrapper[4843]: I0318 12:45:50.486748 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"} Mar 18 12:45:50 crc kubenswrapper[4843]: I0318 12:45:50.486777 4843 scope.go:117] "RemoveContainer" containerID="b9950cd1f979e3a12c23e1dfccfbb1974180f557595e7752cbd15131e12b4f24" Mar 18 12:45:50 crc kubenswrapper[4843]: I0318 12:45:50.955541 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.124976 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fll7g\" (UniqueName: \"kubernetes.io/projected/6a80f3a2-c501-4f2f-8304-41fd472da368-kube-api-access-fll7g\") pod \"6a80f3a2-c501-4f2f-8304-41fd472da368\" (UID: \"6a80f3a2-c501-4f2f-8304-41fd472da368\") " Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.125117 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a80f3a2-c501-4f2f-8304-41fd472da368-inventory\") pod \"6a80f3a2-c501-4f2f-8304-41fd472da368\" (UID: \"6a80f3a2-c501-4f2f-8304-41fd472da368\") " Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.125164 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6a80f3a2-c501-4f2f-8304-41fd472da368-ssh-key-openstack-edpm-ipam\") pod \"6a80f3a2-c501-4f2f-8304-41fd472da368\" (UID: \"6a80f3a2-c501-4f2f-8304-41fd472da368\") " Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.131777 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a80f3a2-c501-4f2f-8304-41fd472da368-kube-api-access-fll7g" (OuterVolumeSpecName: "kube-api-access-fll7g") pod "6a80f3a2-c501-4f2f-8304-41fd472da368" (UID: "6a80f3a2-c501-4f2f-8304-41fd472da368"). InnerVolumeSpecName "kube-api-access-fll7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.151837 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a80f3a2-c501-4f2f-8304-41fd472da368-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6a80f3a2-c501-4f2f-8304-41fd472da368" (UID: "6a80f3a2-c501-4f2f-8304-41fd472da368"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.152963 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a80f3a2-c501-4f2f-8304-41fd472da368-inventory" (OuterVolumeSpecName: "inventory") pod "6a80f3a2-c501-4f2f-8304-41fd472da368" (UID: "6a80f3a2-c501-4f2f-8304-41fd472da368"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.227323 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fll7g\" (UniqueName: \"kubernetes.io/projected/6a80f3a2-c501-4f2f-8304-41fd472da368-kube-api-access-fll7g\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.227364 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a80f3a2-c501-4f2f-8304-41fd472da368-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.227375 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6a80f3a2-c501-4f2f-8304-41fd472da368-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.508289 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj" 
event={"ID":"6a80f3a2-c501-4f2f-8304-41fd472da368","Type":"ContainerDied","Data":"7bd0c11ba600d4e43ced53eb8a7917c249d9cd230dfd0db4a92ca4b237d9c86e"} Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.508351 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bd0c11ba600d4e43ced53eb8a7917c249d9cd230dfd0db4a92ca4b237d9c86e" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.510763 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.612291 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs"] Mar 18 12:45:51 crc kubenswrapper[4843]: E0318 12:45:51.612706 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a80f3a2-c501-4f2f-8304-41fd472da368" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.612727 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a80f3a2-c501-4f2f-8304-41fd472da368" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.612952 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a80f3a2-c501-4f2f-8304-41fd472da368" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.613626 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.616317 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.616570 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.616689 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.617129 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.626827 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.626902 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.627279 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.627487 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.638707 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs"] Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.648125 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.648224 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.648270 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.648335 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zqhq\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-kube-api-access-9zqhq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.648379 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.648434 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.648466 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.648553 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.648596 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.648642 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.648701 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.648748 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.648814 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.648875 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.750762 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.750834 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.750909 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-repo-setup-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.750958 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.751053 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.751090 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.751315 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zqhq\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-kube-api-access-9zqhq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.751362 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.751415 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.751454 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.751608 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.751715 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.751766 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.751814 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.757153 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: 
I0318 12:45:51.757427 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.757990 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.758222 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.758526 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.759808 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.760331 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.760890 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.760968 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.766152 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.768772 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.770105 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.771828 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.771860 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zqhq\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-kube-api-access-9zqhq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-g26vs\" (UID: 
\"86207772-fe8f-4753-a658-3827b5cc18b2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:51 crc kubenswrapper[4843]: I0318 12:45:51.944327 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:45:52 crc kubenswrapper[4843]: I0318 12:45:52.600088 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs"] Mar 18 12:45:52 crc kubenswrapper[4843]: W0318 12:45:52.604747 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86207772_fe8f_4753_a658_3827b5cc18b2.slice/crio-4e17f4f31ac4b4259c6c4bdea019a081402a88e71d1dbf1d0cd9db34c6be5719 WatchSource:0}: Error finding container 4e17f4f31ac4b4259c6c4bdea019a081402a88e71d1dbf1d0cd9db34c6be5719: Status 404 returned error can't find the container with id 4e17f4f31ac4b4259c6c4bdea019a081402a88e71d1dbf1d0cd9db34c6be5719 Mar 18 12:45:53 crc kubenswrapper[4843]: I0318 12:45:53.531894 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" event={"ID":"86207772-fe8f-4753-a658-3827b5cc18b2","Type":"ContainerStarted","Data":"77b1d793a740c53fce23129496af4d178075f6e80cd8976e7f0c1b5460988d9d"} Mar 18 12:45:53 crc kubenswrapper[4843]: I0318 12:45:53.532279 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" event={"ID":"86207772-fe8f-4753-a658-3827b5cc18b2","Type":"ContainerStarted","Data":"4e17f4f31ac4b4259c6c4bdea019a081402a88e71d1dbf1d0cd9db34c6be5719"} Mar 18 12:45:53 crc kubenswrapper[4843]: I0318 12:45:53.567059 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" podStartSLOduration=2.330325266 
podStartE2EDuration="2.567037658s" podCreationTimestamp="2026-03-18 12:45:51 +0000 UTC" firstStartedPulling="2026-03-18 12:45:52.60744012 +0000 UTC m=+2186.323265654" lastFinishedPulling="2026-03-18 12:45:52.844152512 +0000 UTC m=+2186.559978046" observedRunningTime="2026-03-18 12:45:53.564558068 +0000 UTC m=+2187.280383592" watchObservedRunningTime="2026-03-18 12:45:53.567037658 +0000 UTC m=+2187.282863182" Mar 18 12:46:00 crc kubenswrapper[4843]: I0318 12:46:00.141406 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563966-v82gk"] Mar 18 12:46:00 crc kubenswrapper[4843]: I0318 12:46:00.143333 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563966-v82gk" Mar 18 12:46:00 crc kubenswrapper[4843]: I0318 12:46:00.145624 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:46:00 crc kubenswrapper[4843]: I0318 12:46:00.145729 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 12:46:00 crc kubenswrapper[4843]: I0318 12:46:00.145734 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:46:00 crc kubenswrapper[4843]: I0318 12:46:00.154078 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563966-v82gk"] Mar 18 12:46:00 crc kubenswrapper[4843]: I0318 12:46:00.249755 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdsx7\" (UniqueName: \"kubernetes.io/projected/7787f4ee-769c-4b2e-94e6-975db57ccc1b-kube-api-access-vdsx7\") pod \"auto-csr-approver-29563966-v82gk\" (UID: \"7787f4ee-769c-4b2e-94e6-975db57ccc1b\") " pod="openshift-infra/auto-csr-approver-29563966-v82gk" Mar 18 12:46:00 crc kubenswrapper[4843]: I0318 12:46:00.352182 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdsx7\" (UniqueName: \"kubernetes.io/projected/7787f4ee-769c-4b2e-94e6-975db57ccc1b-kube-api-access-vdsx7\") pod \"auto-csr-approver-29563966-v82gk\" (UID: \"7787f4ee-769c-4b2e-94e6-975db57ccc1b\") " pod="openshift-infra/auto-csr-approver-29563966-v82gk" Mar 18 12:46:00 crc kubenswrapper[4843]: I0318 12:46:00.371967 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdsx7\" (UniqueName: \"kubernetes.io/projected/7787f4ee-769c-4b2e-94e6-975db57ccc1b-kube-api-access-vdsx7\") pod \"auto-csr-approver-29563966-v82gk\" (UID: \"7787f4ee-769c-4b2e-94e6-975db57ccc1b\") " pod="openshift-infra/auto-csr-approver-29563966-v82gk" Mar 18 12:46:00 crc kubenswrapper[4843]: I0318 12:46:00.464870 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563966-v82gk" Mar 18 12:46:00 crc kubenswrapper[4843]: I0318 12:46:00.966258 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563966-v82gk"] Mar 18 12:46:00 crc kubenswrapper[4843]: I0318 12:46:00.969498 4843 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:46:01 crc kubenswrapper[4843]: I0318 12:46:01.699761 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563966-v82gk" event={"ID":"7787f4ee-769c-4b2e-94e6-975db57ccc1b","Type":"ContainerStarted","Data":"0cbdec1d057c34781bdaba67b1e7181a88df6b564d166414e62b278e3c122930"} Mar 18 12:46:02 crc kubenswrapper[4843]: I0318 12:46:02.709120 4843 generic.go:334] "Generic (PLEG): container finished" podID="7787f4ee-769c-4b2e-94e6-975db57ccc1b" containerID="43d6b902c62ad5e5d9365c36b9f9e52eae37d7f6c526bc50146dfb8d85afc67a" exitCode=0 Mar 18 12:46:02 crc kubenswrapper[4843]: I0318 12:46:02.709173 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29563966-v82gk" event={"ID":"7787f4ee-769c-4b2e-94e6-975db57ccc1b","Type":"ContainerDied","Data":"43d6b902c62ad5e5d9365c36b9f9e52eae37d7f6c526bc50146dfb8d85afc67a"} Mar 18 12:46:04 crc kubenswrapper[4843]: I0318 12:46:04.069797 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563966-v82gk" Mar 18 12:46:04 crc kubenswrapper[4843]: I0318 12:46:04.174341 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdsx7\" (UniqueName: \"kubernetes.io/projected/7787f4ee-769c-4b2e-94e6-975db57ccc1b-kube-api-access-vdsx7\") pod \"7787f4ee-769c-4b2e-94e6-975db57ccc1b\" (UID: \"7787f4ee-769c-4b2e-94e6-975db57ccc1b\") " Mar 18 12:46:04 crc kubenswrapper[4843]: I0318 12:46:04.180677 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7787f4ee-769c-4b2e-94e6-975db57ccc1b-kube-api-access-vdsx7" (OuterVolumeSpecName: "kube-api-access-vdsx7") pod "7787f4ee-769c-4b2e-94e6-975db57ccc1b" (UID: "7787f4ee-769c-4b2e-94e6-975db57ccc1b"). InnerVolumeSpecName "kube-api-access-vdsx7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:46:04 crc kubenswrapper[4843]: I0318 12:46:04.276303 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdsx7\" (UniqueName: \"kubernetes.io/projected/7787f4ee-769c-4b2e-94e6-975db57ccc1b-kube-api-access-vdsx7\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:04 crc kubenswrapper[4843]: I0318 12:46:04.430178 4843 scope.go:117] "RemoveContainer" containerID="bcdf1634e918c197bd848d59680a4a1bd7ebec66d9798689e2470cd359e4a791" Mar 18 12:46:04 crc kubenswrapper[4843]: I0318 12:46:04.473929 4843 scope.go:117] "RemoveContainer" containerID="926b6c02a1435f03f9a3e3c7259af931bba559a9e5a902d2674188ca909817cd" Mar 18 12:46:04 crc kubenswrapper[4843]: I0318 12:46:04.569544 4843 scope.go:117] "RemoveContainer" containerID="52a1fe188bd855ee8cb9efe1e288e0c092fae75c0c11b7bc53e2cb23a689c157" Mar 18 12:46:04 crc kubenswrapper[4843]: I0318 12:46:04.617385 4843 scope.go:117] "RemoveContainer" containerID="dc622d8f50c34a532c7b1d0b1f69be3eaa1d5bf177a148c1c9d1ab8243b1e839" Mar 18 12:46:04 crc kubenswrapper[4843]: I0318 12:46:04.736305 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563966-v82gk" Mar 18 12:46:04 crc kubenswrapper[4843]: I0318 12:46:04.736317 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563966-v82gk" event={"ID":"7787f4ee-769c-4b2e-94e6-975db57ccc1b","Type":"ContainerDied","Data":"0cbdec1d057c34781bdaba67b1e7181a88df6b564d166414e62b278e3c122930"} Mar 18 12:46:04 crc kubenswrapper[4843]: I0318 12:46:04.736363 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cbdec1d057c34781bdaba67b1e7181a88df6b564d166414e62b278e3c122930" Mar 18 12:46:05 crc kubenswrapper[4843]: I0318 12:46:05.143388 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563960-nwkgt"] Mar 18 12:46:05 crc kubenswrapper[4843]: I0318 12:46:05.167018 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563960-nwkgt"] Mar 18 12:46:06 crc kubenswrapper[4843]: I0318 12:46:06.996357 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a42fe4bf-05f6-40d6-a447-93b30093c598" path="/var/lib/kubelet/pods/a42fe4bf-05f6-40d6-a447-93b30093c598/volumes" Mar 18 12:46:31 crc kubenswrapper[4843]: I0318 12:46:31.015926 4843 generic.go:334] "Generic (PLEG): container finished" podID="86207772-fe8f-4753-a658-3827b5cc18b2" containerID="77b1d793a740c53fce23129496af4d178075f6e80cd8976e7f0c1b5460988d9d" exitCode=0 Mar 18 12:46:31 crc kubenswrapper[4843]: I0318 12:46:31.016016 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" event={"ID":"86207772-fe8f-4753-a658-3827b5cc18b2","Type":"ContainerDied","Data":"77b1d793a740c53fce23129496af4d178075f6e80cd8976e7f0c1b5460988d9d"} Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.573418 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.719537 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"86207772-fe8f-4753-a658-3827b5cc18b2\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.719717 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-inventory\") pod \"86207772-fe8f-4753-a658-3827b5cc18b2\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.719763 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-ovn-combined-ca-bundle\") pod \"86207772-fe8f-4753-a658-3827b5cc18b2\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.719823 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-ssh-key-openstack-edpm-ipam\") pod \"86207772-fe8f-4753-a658-3827b5cc18b2\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.719873 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-ovn-default-certs-0\") pod \"86207772-fe8f-4753-a658-3827b5cc18b2\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " 
Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.719911 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-repo-setup-combined-ca-bundle\") pod \"86207772-fe8f-4753-a658-3827b5cc18b2\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.719933 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"86207772-fe8f-4753-a658-3827b5cc18b2\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.719990 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-bootstrap-combined-ca-bundle\") pod \"86207772-fe8f-4753-a658-3827b5cc18b2\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.720026 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-nova-combined-ca-bundle\") pod \"86207772-fe8f-4753-a658-3827b5cc18b2\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.720057 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-neutron-metadata-combined-ca-bundle\") pod \"86207772-fe8f-4753-a658-3827b5cc18b2\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 
12:46:32.720079 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-libvirt-combined-ca-bundle\") pod \"86207772-fe8f-4753-a658-3827b5cc18b2\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.720112 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-telemetry-combined-ca-bundle\") pod \"86207772-fe8f-4753-a658-3827b5cc18b2\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.720132 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zqhq\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-kube-api-access-9zqhq\") pod \"86207772-fe8f-4753-a658-3827b5cc18b2\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.720170 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"86207772-fe8f-4753-a658-3827b5cc18b2\" (UID: \"86207772-fe8f-4753-a658-3827b5cc18b2\") " Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.726564 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "86207772-fe8f-4753-a658-3827b5cc18b2" (UID: "86207772-fe8f-4753-a658-3827b5cc18b2"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.726895 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "86207772-fe8f-4753-a658-3827b5cc18b2" (UID: "86207772-fe8f-4753-a658-3827b5cc18b2"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.729071 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "86207772-fe8f-4753-a658-3827b5cc18b2" (UID: "86207772-fe8f-4753-a658-3827b5cc18b2"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.729284 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "86207772-fe8f-4753-a658-3827b5cc18b2" (UID: "86207772-fe8f-4753-a658-3827b5cc18b2"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.730083 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "86207772-fe8f-4753-a658-3827b5cc18b2" (UID: "86207772-fe8f-4753-a658-3827b5cc18b2"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.731362 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "86207772-fe8f-4753-a658-3827b5cc18b2" (UID: "86207772-fe8f-4753-a658-3827b5cc18b2"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.731392 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "86207772-fe8f-4753-a658-3827b5cc18b2" (UID: "86207772-fe8f-4753-a658-3827b5cc18b2"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.731431 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "86207772-fe8f-4753-a658-3827b5cc18b2" (UID: "86207772-fe8f-4753-a658-3827b5cc18b2"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.731779 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "86207772-fe8f-4753-a658-3827b5cc18b2" (UID: "86207772-fe8f-4753-a658-3827b5cc18b2"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.731780 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "86207772-fe8f-4753-a658-3827b5cc18b2" (UID: "86207772-fe8f-4753-a658-3827b5cc18b2"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.731951 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-kube-api-access-9zqhq" (OuterVolumeSpecName: "kube-api-access-9zqhq") pod "86207772-fe8f-4753-a658-3827b5cc18b2" (UID: "86207772-fe8f-4753-a658-3827b5cc18b2"). InnerVolumeSpecName "kube-api-access-9zqhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.742023 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "86207772-fe8f-4753-a658-3827b5cc18b2" (UID: "86207772-fe8f-4753-a658-3827b5cc18b2"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.751865 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-inventory" (OuterVolumeSpecName: "inventory") pod "86207772-fe8f-4753-a658-3827b5cc18b2" (UID: "86207772-fe8f-4753-a658-3827b5cc18b2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.755165 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "86207772-fe8f-4753-a658-3827b5cc18b2" (UID: "86207772-fe8f-4753-a658-3827b5cc18b2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.822552 4843 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.822798 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.822861 4843 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.822926 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.822990 4843 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" 
Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.823051 4843 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.823109 4843 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.823170 4843 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.823230 4843 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.823292 4843 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.823348 4843 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.823403 4843 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/86207772-fe8f-4753-a658-3827b5cc18b2-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.823461 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zqhq\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-kube-api-access-9zqhq\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:32 crc kubenswrapper[4843]: I0318 12:46:32.823520 4843 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/86207772-fe8f-4753-a658-3827b5cc18b2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.038044 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" event={"ID":"86207772-fe8f-4753-a658-3827b5cc18b2","Type":"ContainerDied","Data":"4e17f4f31ac4b4259c6c4bdea019a081402a88e71d1dbf1d0cd9db34c6be5719"} Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.038100 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e17f4f31ac4b4259c6c4bdea019a081402a88e71d1dbf1d0cd9db34c6be5719" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.038154 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-g26vs" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.195061 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls"] Mar 18 12:46:33 crc kubenswrapper[4843]: E0318 12:46:33.195525 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86207772-fe8f-4753-a658-3827b5cc18b2" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.195545 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="86207772-fe8f-4753-a658-3827b5cc18b2" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 12:46:33 crc kubenswrapper[4843]: E0318 12:46:33.195578 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7787f4ee-769c-4b2e-94e6-975db57ccc1b" containerName="oc" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.195589 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7787f4ee-769c-4b2e-94e6-975db57ccc1b" containerName="oc" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.195963 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="86207772-fe8f-4753-a658-3827b5cc18b2" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.195987 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="7787f4ee-769c-4b2e-94e6-975db57ccc1b" containerName="oc" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.197072 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.201835 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.205967 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.205993 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.206009 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.206827 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls"] Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.206536 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.353404 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rhwls\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.353565 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rhwls\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.353606 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rhwls\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.354111 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rhwls\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.354215 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wghh2\" (UniqueName: \"kubernetes.io/projected/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-kube-api-access-wghh2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rhwls\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.456342 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rhwls\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.456404 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rhwls\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.456585 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rhwls\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.456619 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wghh2\" (UniqueName: \"kubernetes.io/projected/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-kube-api-access-wghh2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rhwls\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.456675 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rhwls\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.457725 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rhwls\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.460849 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rhwls\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.461091 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rhwls\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.461234 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rhwls\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.478094 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wghh2\" (UniqueName: \"kubernetes.io/projected/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-kube-api-access-wghh2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rhwls\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" Mar 18 12:46:33 crc kubenswrapper[4843]: I0318 12:46:33.556117 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" Mar 18 12:46:34 crc kubenswrapper[4843]: I0318 12:46:34.259112 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls"] Mar 18 12:46:34 crc kubenswrapper[4843]: W0318 12:46:34.263266 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd49d6abb_b38e_42a3_a374_054ddbd3d2f7.slice/crio-af8a81abbe462f66204cc07356539effc58c13baf470c1226ff41cfdc03b4fb8 WatchSource:0}: Error finding container af8a81abbe462f66204cc07356539effc58c13baf470c1226ff41cfdc03b4fb8: Status 404 returned error can't find the container with id af8a81abbe462f66204cc07356539effc58c13baf470c1226ff41cfdc03b4fb8 Mar 18 12:46:35 crc kubenswrapper[4843]: I0318 12:46:35.056902 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" event={"ID":"d49d6abb-b38e-42a3-a374-054ddbd3d2f7","Type":"ContainerStarted","Data":"d61639974269dfb19f0d7d8d8d144ececfd3e9f78ee7ccad22e2fcf30417b7c7"} Mar 18 12:46:35 crc kubenswrapper[4843]: I0318 12:46:35.057165 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" event={"ID":"d49d6abb-b38e-42a3-a374-054ddbd3d2f7","Type":"ContainerStarted","Data":"af8a81abbe462f66204cc07356539effc58c13baf470c1226ff41cfdc03b4fb8"} Mar 18 12:46:35 crc kubenswrapper[4843]: I0318 12:46:35.080508 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" podStartSLOduration=1.931985984 podStartE2EDuration="2.080485094s" podCreationTimestamp="2026-03-18 12:46:33 +0000 UTC" firstStartedPulling="2026-03-18 12:46:34.265728294 +0000 UTC m=+2227.981553818" lastFinishedPulling="2026-03-18 12:46:34.414227374 +0000 UTC m=+2228.130052928" observedRunningTime="2026-03-18 
12:46:35.07648692 +0000 UTC m=+2228.792312454" watchObservedRunningTime="2026-03-18 12:46:35.080485094 +0000 UTC m=+2228.796310628" Mar 18 12:47:04 crc kubenswrapper[4843]: I0318 12:47:04.741103 4843 scope.go:117] "RemoveContainer" containerID="76548f0e424c05e0b37150d6e10f8475de4b9f42aca7a126784505369ac724fe" Mar 18 12:47:37 crc kubenswrapper[4843]: I0318 12:47:37.725843 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fd8gb"] Mar 18 12:47:37 crc kubenswrapper[4843]: I0318 12:47:37.732662 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fd8gb" Mar 18 12:47:37 crc kubenswrapper[4843]: I0318 12:47:37.780510 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fd8gb"] Mar 18 12:47:37 crc kubenswrapper[4843]: I0318 12:47:37.837399 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed042ad-1436-449a-8407-c23e3c42a56d-utilities\") pod \"community-operators-fd8gb\" (UID: \"1ed042ad-1436-449a-8407-c23e3c42a56d\") " pod="openshift-marketplace/community-operators-fd8gb" Mar 18 12:47:37 crc kubenswrapper[4843]: I0318 12:47:37.837522 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed042ad-1436-449a-8407-c23e3c42a56d-catalog-content\") pod \"community-operators-fd8gb\" (UID: \"1ed042ad-1436-449a-8407-c23e3c42a56d\") " pod="openshift-marketplace/community-operators-fd8gb" Mar 18 12:47:37 crc kubenswrapper[4843]: I0318 12:47:37.837591 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9hbl\" (UniqueName: \"kubernetes.io/projected/1ed042ad-1436-449a-8407-c23e3c42a56d-kube-api-access-n9hbl\") pod \"community-operators-fd8gb\" (UID: 
\"1ed042ad-1436-449a-8407-c23e3c42a56d\") " pod="openshift-marketplace/community-operators-fd8gb" Mar 18 12:47:37 crc kubenswrapper[4843]: I0318 12:47:37.940076 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed042ad-1436-449a-8407-c23e3c42a56d-utilities\") pod \"community-operators-fd8gb\" (UID: \"1ed042ad-1436-449a-8407-c23e3c42a56d\") " pod="openshift-marketplace/community-operators-fd8gb" Mar 18 12:47:37 crc kubenswrapper[4843]: I0318 12:47:37.940199 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed042ad-1436-449a-8407-c23e3c42a56d-catalog-content\") pod \"community-operators-fd8gb\" (UID: \"1ed042ad-1436-449a-8407-c23e3c42a56d\") " pod="openshift-marketplace/community-operators-fd8gb" Mar 18 12:47:37 crc kubenswrapper[4843]: I0318 12:47:37.940263 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9hbl\" (UniqueName: \"kubernetes.io/projected/1ed042ad-1436-449a-8407-c23e3c42a56d-kube-api-access-n9hbl\") pod \"community-operators-fd8gb\" (UID: \"1ed042ad-1436-449a-8407-c23e3c42a56d\") " pod="openshift-marketplace/community-operators-fd8gb" Mar 18 12:47:37 crc kubenswrapper[4843]: I0318 12:47:37.940727 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed042ad-1436-449a-8407-c23e3c42a56d-utilities\") pod \"community-operators-fd8gb\" (UID: \"1ed042ad-1436-449a-8407-c23e3c42a56d\") " pod="openshift-marketplace/community-operators-fd8gb" Mar 18 12:47:37 crc kubenswrapper[4843]: I0318 12:47:37.940804 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed042ad-1436-449a-8407-c23e3c42a56d-catalog-content\") pod \"community-operators-fd8gb\" (UID: \"1ed042ad-1436-449a-8407-c23e3c42a56d\") 
" pod="openshift-marketplace/community-operators-fd8gb" Mar 18 12:47:37 crc kubenswrapper[4843]: I0318 12:47:37.965504 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9hbl\" (UniqueName: \"kubernetes.io/projected/1ed042ad-1436-449a-8407-c23e3c42a56d-kube-api-access-n9hbl\") pod \"community-operators-fd8gb\" (UID: \"1ed042ad-1436-449a-8407-c23e3c42a56d\") " pod="openshift-marketplace/community-operators-fd8gb" Mar 18 12:47:38 crc kubenswrapper[4843]: I0318 12:47:38.077619 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fd8gb" Mar 18 12:47:39 crc kubenswrapper[4843]: I0318 12:47:39.552163 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fd8gb"] Mar 18 12:47:40 crc kubenswrapper[4843]: I0318 12:47:40.081347 4843 generic.go:334] "Generic (PLEG): container finished" podID="1ed042ad-1436-449a-8407-c23e3c42a56d" containerID="b4cf5ebcfbc5bf46f9354b28916fa888c11a379a5825fc67d2ac0840050fa155" exitCode=0 Mar 18 12:47:40 crc kubenswrapper[4843]: I0318 12:47:40.081366 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd8gb" event={"ID":"1ed042ad-1436-449a-8407-c23e3c42a56d","Type":"ContainerDied","Data":"b4cf5ebcfbc5bf46f9354b28916fa888c11a379a5825fc67d2ac0840050fa155"} Mar 18 12:47:40 crc kubenswrapper[4843]: I0318 12:47:40.081672 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd8gb" event={"ID":"1ed042ad-1436-449a-8407-c23e3c42a56d","Type":"ContainerStarted","Data":"8836660a87d379df5c58a68b4450f54c3d2a6a6e057d176fd17e2d27327b269e"} Mar 18 12:47:43 crc kubenswrapper[4843]: I0318 12:47:43.110785 4843 generic.go:334] "Generic (PLEG): container finished" podID="1ed042ad-1436-449a-8407-c23e3c42a56d" containerID="2d5b9adb5569b9a48b1c52f2ff48b7a262def1a2e4d85fc8b3f96bdb4a009846" exitCode=0 Mar 18 12:47:43 crc 
kubenswrapper[4843]: I0318 12:47:43.110888 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd8gb" event={"ID":"1ed042ad-1436-449a-8407-c23e3c42a56d","Type":"ContainerDied","Data":"2d5b9adb5569b9a48b1c52f2ff48b7a262def1a2e4d85fc8b3f96bdb4a009846"} Mar 18 12:47:44 crc kubenswrapper[4843]: I0318 12:47:44.123465 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd8gb" event={"ID":"1ed042ad-1436-449a-8407-c23e3c42a56d","Type":"ContainerStarted","Data":"18c3f6b534908e61ecb43a286815e66579f990d844714c6128f28f5008e2af28"} Mar 18 12:47:44 crc kubenswrapper[4843]: I0318 12:47:44.151872 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fd8gb" podStartSLOduration=3.718317328 podStartE2EDuration="7.151849906s" podCreationTimestamp="2026-03-18 12:47:37 +0000 UTC" firstStartedPulling="2026-03-18 12:47:40.08282309 +0000 UTC m=+2293.798648614" lastFinishedPulling="2026-03-18 12:47:43.516355668 +0000 UTC m=+2297.232181192" observedRunningTime="2026-03-18 12:47:44.142328786 +0000 UTC m=+2297.858154310" watchObservedRunningTime="2026-03-18 12:47:44.151849906 +0000 UTC m=+2297.867675430" Mar 18 12:47:46 crc kubenswrapper[4843]: I0318 12:47:46.141367 4843 generic.go:334] "Generic (PLEG): container finished" podID="d49d6abb-b38e-42a3-a374-054ddbd3d2f7" containerID="d61639974269dfb19f0d7d8d8d144ececfd3e9f78ee7ccad22e2fcf30417b7c7" exitCode=0 Mar 18 12:47:46 crc kubenswrapper[4843]: I0318 12:47:46.141447 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" event={"ID":"d49d6abb-b38e-42a3-a374-054ddbd3d2f7","Type":"ContainerDied","Data":"d61639974269dfb19f0d7d8d8d144ececfd3e9f78ee7ccad22e2fcf30417b7c7"} Mar 18 12:47:47 crc kubenswrapper[4843]: I0318 12:47:47.592300 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" Mar 18 12:47:47 crc kubenswrapper[4843]: I0318 12:47:47.733491 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-ovn-combined-ca-bundle\") pod \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " Mar 18 12:47:47 crc kubenswrapper[4843]: I0318 12:47:47.733899 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-ovncontroller-config-0\") pod \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " Mar 18 12:47:47 crc kubenswrapper[4843]: I0318 12:47:47.734100 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wghh2\" (UniqueName: \"kubernetes.io/projected/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-kube-api-access-wghh2\") pod \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " Mar 18 12:47:47 crc kubenswrapper[4843]: I0318 12:47:47.734315 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-ssh-key-openstack-edpm-ipam\") pod \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " Mar 18 12:47:47 crc kubenswrapper[4843]: I0318 12:47:47.734461 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-inventory\") pod \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\" (UID: \"d49d6abb-b38e-42a3-a374-054ddbd3d2f7\") " Mar 18 12:47:47 crc kubenswrapper[4843]: I0318 12:47:47.741620 4843 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d49d6abb-b38e-42a3-a374-054ddbd3d2f7" (UID: "d49d6abb-b38e-42a3-a374-054ddbd3d2f7"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:47:47 crc kubenswrapper[4843]: I0318 12:47:47.749208 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-kube-api-access-wghh2" (OuterVolumeSpecName: "kube-api-access-wghh2") pod "d49d6abb-b38e-42a3-a374-054ddbd3d2f7" (UID: "d49d6abb-b38e-42a3-a374-054ddbd3d2f7"). InnerVolumeSpecName "kube-api-access-wghh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:47:47 crc kubenswrapper[4843]: I0318 12:47:47.762583 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "d49d6abb-b38e-42a3-a374-054ddbd3d2f7" (UID: "d49d6abb-b38e-42a3-a374-054ddbd3d2f7"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:47:47 crc kubenswrapper[4843]: I0318 12:47:47.763695 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-inventory" (OuterVolumeSpecName: "inventory") pod "d49d6abb-b38e-42a3-a374-054ddbd3d2f7" (UID: "d49d6abb-b38e-42a3-a374-054ddbd3d2f7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:47:47 crc kubenswrapper[4843]: I0318 12:47:47.766553 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d49d6abb-b38e-42a3-a374-054ddbd3d2f7" (UID: "d49d6abb-b38e-42a3-a374-054ddbd3d2f7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:47:47 crc kubenswrapper[4843]: I0318 12:47:47.836851 4843 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:47:47 crc kubenswrapper[4843]: I0318 12:47:47.836942 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wghh2\" (UniqueName: \"kubernetes.io/projected/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-kube-api-access-wghh2\") on node \"crc\" DevicePath \"\"" Mar 18 12:47:47 crc kubenswrapper[4843]: I0318 12:47:47.836955 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:47:47 crc kubenswrapper[4843]: I0318 12:47:47.836968 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:47:47 crc kubenswrapper[4843]: I0318 12:47:47.836980 4843 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d49d6abb-b38e-42a3-a374-054ddbd3d2f7-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.078275 4843 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fd8gb" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.078394 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fd8gb" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.129418 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fd8gb" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.162108 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" event={"ID":"d49d6abb-b38e-42a3-a374-054ddbd3d2f7","Type":"ContainerDied","Data":"af8a81abbe462f66204cc07356539effc58c13baf470c1226ff41cfdc03b4fb8"} Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.162171 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rhwls" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.162186 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af8a81abbe462f66204cc07356539effc58c13baf470c1226ff41cfdc03b4fb8" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.217030 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fd8gb" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.271464 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw"] Mar 18 12:47:48 crc kubenswrapper[4843]: E0318 12:47:48.272186 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d49d6abb-b38e-42a3-a374-054ddbd3d2f7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.272209 4843 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d49d6abb-b38e-42a3-a374-054ddbd3d2f7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.272439 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d49d6abb-b38e-42a3-a374-054ddbd3d2f7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.273422 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.277233 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.277234 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.277245 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.277701 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.277814 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.278325 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.298291 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw"] Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.449499 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.449601 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.449727 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.449765 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.449825 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-48gxz\" (UniqueName: \"kubernetes.io/projected/8894c42c-6e17-4c87-84e1-1888c1800b04-kube-api-access-48gxz\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.449876 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.551481 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.551625 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.551922 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.552014 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.552088 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48gxz\" (UniqueName: \"kubernetes.io/projected/8894c42c-6e17-4c87-84e1-1888c1800b04-kube-api-access-48gxz\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.552171 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.557408 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.557620 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.557793 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.558452 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.560460 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.578944 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48gxz\" (UniqueName: \"kubernetes.io/projected/8894c42c-6e17-4c87-84e1-1888c1800b04-kube-api-access-48gxz\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:48 crc kubenswrapper[4843]: I0318 12:47:48.590899 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:47:49 crc kubenswrapper[4843]: I0318 12:47:49.160736 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw"] Mar 18 12:47:49 crc kubenswrapper[4843]: W0318 12:47:49.167089 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8894c42c_6e17_4c87_84e1_1888c1800b04.slice/crio-24967846c4214ed467b7523f7b5336661ef97ba7418adab6cbbae460014ca81b WatchSource:0}: Error finding container 24967846c4214ed467b7523f7b5336661ef97ba7418adab6cbbae460014ca81b: Status 404 returned error can't find the container with id 24967846c4214ed467b7523f7b5336661ef97ba7418adab6cbbae460014ca81b Mar 18 12:47:49 crc kubenswrapper[4843]: I0318 12:47:49.188644 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fd8gb"] Mar 18 12:47:50 crc kubenswrapper[4843]: I0318 12:47:50.035105 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 18 12:47:50 crc kubenswrapper[4843]: I0318 12:47:50.035449 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:47:50 crc kubenswrapper[4843]: I0318 12:47:50.181803 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fd8gb" podUID="1ed042ad-1436-449a-8407-c23e3c42a56d" containerName="registry-server" containerID="cri-o://18c3f6b534908e61ecb43a286815e66579f990d844714c6128f28f5008e2af28" gracePeriod=2 Mar 18 12:47:50 crc kubenswrapper[4843]: I0318 12:47:50.182450 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" event={"ID":"8894c42c-6e17-4c87-84e1-1888c1800b04","Type":"ContainerStarted","Data":"3984d7913f66c60fcefd709dd49bc640e937a34299a0be9b184e6d2779fb1e5a"} Mar 18 12:47:50 crc kubenswrapper[4843]: I0318 12:47:50.182499 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" event={"ID":"8894c42c-6e17-4c87-84e1-1888c1800b04","Type":"ContainerStarted","Data":"24967846c4214ed467b7523f7b5336661ef97ba7418adab6cbbae460014ca81b"} Mar 18 12:47:50 crc kubenswrapper[4843]: I0318 12:47:50.204618 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" podStartSLOduration=2.049678957 podStartE2EDuration="2.204595317s" podCreationTimestamp="2026-03-18 12:47:48 +0000 UTC" firstStartedPulling="2026-03-18 12:47:49.169292119 +0000 UTC m=+2302.885117633" lastFinishedPulling="2026-03-18 12:47:49.324208469 +0000 UTC m=+2303.040033993" observedRunningTime="2026-03-18 
12:47:50.202506237 +0000 UTC m=+2303.918331771" watchObservedRunningTime="2026-03-18 12:47:50.204595317 +0000 UTC m=+2303.920420841" Mar 18 12:47:50 crc kubenswrapper[4843]: I0318 12:47:50.619405 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fd8gb" Mar 18 12:47:50 crc kubenswrapper[4843]: I0318 12:47:50.699568 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed042ad-1436-449a-8407-c23e3c42a56d-catalog-content\") pod \"1ed042ad-1436-449a-8407-c23e3c42a56d\" (UID: \"1ed042ad-1436-449a-8407-c23e3c42a56d\") " Mar 18 12:47:50 crc kubenswrapper[4843]: I0318 12:47:50.699920 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9hbl\" (UniqueName: \"kubernetes.io/projected/1ed042ad-1436-449a-8407-c23e3c42a56d-kube-api-access-n9hbl\") pod \"1ed042ad-1436-449a-8407-c23e3c42a56d\" (UID: \"1ed042ad-1436-449a-8407-c23e3c42a56d\") " Mar 18 12:47:50 crc kubenswrapper[4843]: I0318 12:47:50.700047 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed042ad-1436-449a-8407-c23e3c42a56d-utilities\") pod \"1ed042ad-1436-449a-8407-c23e3c42a56d\" (UID: \"1ed042ad-1436-449a-8407-c23e3c42a56d\") " Mar 18 12:47:50 crc kubenswrapper[4843]: I0318 12:47:50.701126 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed042ad-1436-449a-8407-c23e3c42a56d-utilities" (OuterVolumeSpecName: "utilities") pod "1ed042ad-1436-449a-8407-c23e3c42a56d" (UID: "1ed042ad-1436-449a-8407-c23e3c42a56d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:47:50 crc kubenswrapper[4843]: I0318 12:47:50.706886 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed042ad-1436-449a-8407-c23e3c42a56d-kube-api-access-n9hbl" (OuterVolumeSpecName: "kube-api-access-n9hbl") pod "1ed042ad-1436-449a-8407-c23e3c42a56d" (UID: "1ed042ad-1436-449a-8407-c23e3c42a56d"). InnerVolumeSpecName "kube-api-access-n9hbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:47:50 crc kubenswrapper[4843]: I0318 12:47:50.757736 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed042ad-1436-449a-8407-c23e3c42a56d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ed042ad-1436-449a-8407-c23e3c42a56d" (UID: "1ed042ad-1436-449a-8407-c23e3c42a56d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:47:50 crc kubenswrapper[4843]: I0318 12:47:50.802270 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed042ad-1436-449a-8407-c23e3c42a56d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:47:50 crc kubenswrapper[4843]: I0318 12:47:50.802309 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9hbl\" (UniqueName: \"kubernetes.io/projected/1ed042ad-1436-449a-8407-c23e3c42a56d-kube-api-access-n9hbl\") on node \"crc\" DevicePath \"\"" Mar 18 12:47:50 crc kubenswrapper[4843]: I0318 12:47:50.802345 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed042ad-1436-449a-8407-c23e3c42a56d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:47:51 crc kubenswrapper[4843]: I0318 12:47:51.195252 4843 generic.go:334] "Generic (PLEG): container finished" podID="1ed042ad-1436-449a-8407-c23e3c42a56d" 
containerID="18c3f6b534908e61ecb43a286815e66579f990d844714c6128f28f5008e2af28" exitCode=0 Mar 18 12:47:51 crc kubenswrapper[4843]: I0318 12:47:51.195546 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd8gb" event={"ID":"1ed042ad-1436-449a-8407-c23e3c42a56d","Type":"ContainerDied","Data":"18c3f6b534908e61ecb43a286815e66579f990d844714c6128f28f5008e2af28"} Mar 18 12:47:51 crc kubenswrapper[4843]: I0318 12:47:51.195610 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd8gb" event={"ID":"1ed042ad-1436-449a-8407-c23e3c42a56d","Type":"ContainerDied","Data":"8836660a87d379df5c58a68b4450f54c3d2a6a6e057d176fd17e2d27327b269e"} Mar 18 12:47:51 crc kubenswrapper[4843]: I0318 12:47:51.195631 4843 scope.go:117] "RemoveContainer" containerID="18c3f6b534908e61ecb43a286815e66579f990d844714c6128f28f5008e2af28" Mar 18 12:47:51 crc kubenswrapper[4843]: I0318 12:47:51.195518 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fd8gb" Mar 18 12:47:51 crc kubenswrapper[4843]: I0318 12:47:51.222264 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fd8gb"] Mar 18 12:47:51 crc kubenswrapper[4843]: I0318 12:47:51.227881 4843 scope.go:117] "RemoveContainer" containerID="2d5b9adb5569b9a48b1c52f2ff48b7a262def1a2e4d85fc8b3f96bdb4a009846" Mar 18 12:47:51 crc kubenswrapper[4843]: I0318 12:47:51.231790 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fd8gb"] Mar 18 12:47:51 crc kubenswrapper[4843]: I0318 12:47:51.250630 4843 scope.go:117] "RemoveContainer" containerID="b4cf5ebcfbc5bf46f9354b28916fa888c11a379a5825fc67d2ac0840050fa155" Mar 18 12:47:51 crc kubenswrapper[4843]: I0318 12:47:51.305897 4843 scope.go:117] "RemoveContainer" containerID="18c3f6b534908e61ecb43a286815e66579f990d844714c6128f28f5008e2af28" Mar 18 12:47:51 crc kubenswrapper[4843]: E0318 12:47:51.306617 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18c3f6b534908e61ecb43a286815e66579f990d844714c6128f28f5008e2af28\": container with ID starting with 18c3f6b534908e61ecb43a286815e66579f990d844714c6128f28f5008e2af28 not found: ID does not exist" containerID="18c3f6b534908e61ecb43a286815e66579f990d844714c6128f28f5008e2af28" Mar 18 12:47:51 crc kubenswrapper[4843]: I0318 12:47:51.306686 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18c3f6b534908e61ecb43a286815e66579f990d844714c6128f28f5008e2af28"} err="failed to get container status \"18c3f6b534908e61ecb43a286815e66579f990d844714c6128f28f5008e2af28\": rpc error: code = NotFound desc = could not find container \"18c3f6b534908e61ecb43a286815e66579f990d844714c6128f28f5008e2af28\": container with ID starting with 18c3f6b534908e61ecb43a286815e66579f990d844714c6128f28f5008e2af28 not 
found: ID does not exist" Mar 18 12:47:51 crc kubenswrapper[4843]: I0318 12:47:51.306723 4843 scope.go:117] "RemoveContainer" containerID="2d5b9adb5569b9a48b1c52f2ff48b7a262def1a2e4d85fc8b3f96bdb4a009846" Mar 18 12:47:51 crc kubenswrapper[4843]: E0318 12:47:51.307254 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d5b9adb5569b9a48b1c52f2ff48b7a262def1a2e4d85fc8b3f96bdb4a009846\": container with ID starting with 2d5b9adb5569b9a48b1c52f2ff48b7a262def1a2e4d85fc8b3f96bdb4a009846 not found: ID does not exist" containerID="2d5b9adb5569b9a48b1c52f2ff48b7a262def1a2e4d85fc8b3f96bdb4a009846" Mar 18 12:47:51 crc kubenswrapper[4843]: I0318 12:47:51.307293 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5b9adb5569b9a48b1c52f2ff48b7a262def1a2e4d85fc8b3f96bdb4a009846"} err="failed to get container status \"2d5b9adb5569b9a48b1c52f2ff48b7a262def1a2e4d85fc8b3f96bdb4a009846\": rpc error: code = NotFound desc = could not find container \"2d5b9adb5569b9a48b1c52f2ff48b7a262def1a2e4d85fc8b3f96bdb4a009846\": container with ID starting with 2d5b9adb5569b9a48b1c52f2ff48b7a262def1a2e4d85fc8b3f96bdb4a009846 not found: ID does not exist" Mar 18 12:47:51 crc kubenswrapper[4843]: I0318 12:47:51.307322 4843 scope.go:117] "RemoveContainer" containerID="b4cf5ebcfbc5bf46f9354b28916fa888c11a379a5825fc67d2ac0840050fa155" Mar 18 12:47:51 crc kubenswrapper[4843]: E0318 12:47:51.307688 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4cf5ebcfbc5bf46f9354b28916fa888c11a379a5825fc67d2ac0840050fa155\": container with ID starting with b4cf5ebcfbc5bf46f9354b28916fa888c11a379a5825fc67d2ac0840050fa155 not found: ID does not exist" containerID="b4cf5ebcfbc5bf46f9354b28916fa888c11a379a5825fc67d2ac0840050fa155" Mar 18 12:47:51 crc kubenswrapper[4843]: I0318 12:47:51.307718 4843 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4cf5ebcfbc5bf46f9354b28916fa888c11a379a5825fc67d2ac0840050fa155"} err="failed to get container status \"b4cf5ebcfbc5bf46f9354b28916fa888c11a379a5825fc67d2ac0840050fa155\": rpc error: code = NotFound desc = could not find container \"b4cf5ebcfbc5bf46f9354b28916fa888c11a379a5825fc67d2ac0840050fa155\": container with ID starting with b4cf5ebcfbc5bf46f9354b28916fa888c11a379a5825fc67d2ac0840050fa155 not found: ID does not exist" Mar 18 12:47:53 crc kubenswrapper[4843]: I0318 12:47:53.269021 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ed042ad-1436-449a-8407-c23e3c42a56d" path="/var/lib/kubelet/pods/1ed042ad-1436-449a-8407-c23e3c42a56d/volumes" Mar 18 12:48:00 crc kubenswrapper[4843]: I0318 12:48:00.147435 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563968-66h99"] Mar 18 12:48:00 crc kubenswrapper[4843]: E0318 12:48:00.149251 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed042ad-1436-449a-8407-c23e3c42a56d" containerName="extract-utilities" Mar 18 12:48:00 crc kubenswrapper[4843]: I0318 12:48:00.149277 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed042ad-1436-449a-8407-c23e3c42a56d" containerName="extract-utilities" Mar 18 12:48:00 crc kubenswrapper[4843]: E0318 12:48:00.149312 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed042ad-1436-449a-8407-c23e3c42a56d" containerName="registry-server" Mar 18 12:48:00 crc kubenswrapper[4843]: I0318 12:48:00.149320 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed042ad-1436-449a-8407-c23e3c42a56d" containerName="registry-server" Mar 18 12:48:00 crc kubenswrapper[4843]: E0318 12:48:00.149353 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed042ad-1436-449a-8407-c23e3c42a56d" containerName="extract-content" Mar 18 12:48:00 crc kubenswrapper[4843]: I0318 
12:48:00.149360 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed042ad-1436-449a-8407-c23e3c42a56d" containerName="extract-content" Mar 18 12:48:00 crc kubenswrapper[4843]: I0318 12:48:00.149635 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed042ad-1436-449a-8407-c23e3c42a56d" containerName="registry-server" Mar 18 12:48:00 crc kubenswrapper[4843]: I0318 12:48:00.150743 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563968-66h99" Mar 18 12:48:00 crc kubenswrapper[4843]: I0318 12:48:00.153001 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:48:00 crc kubenswrapper[4843]: I0318 12:48:00.153384 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 12:48:00 crc kubenswrapper[4843]: I0318 12:48:00.153417 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:48:00 crc kubenswrapper[4843]: I0318 12:48:00.163883 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563968-66h99"] Mar 18 12:48:00 crc kubenswrapper[4843]: I0318 12:48:00.175950 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mggk\" (UniqueName: \"kubernetes.io/projected/a22105f2-fd27-4d7a-8b2a-26770e6ddfab-kube-api-access-8mggk\") pod \"auto-csr-approver-29563968-66h99\" (UID: \"a22105f2-fd27-4d7a-8b2a-26770e6ddfab\") " pod="openshift-infra/auto-csr-approver-29563968-66h99" Mar 18 12:48:00 crc kubenswrapper[4843]: I0318 12:48:00.277853 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mggk\" (UniqueName: \"kubernetes.io/projected/a22105f2-fd27-4d7a-8b2a-26770e6ddfab-kube-api-access-8mggk\") pod \"auto-csr-approver-29563968-66h99\" (UID: 
\"a22105f2-fd27-4d7a-8b2a-26770e6ddfab\") " pod="openshift-infra/auto-csr-approver-29563968-66h99" Mar 18 12:48:00 crc kubenswrapper[4843]: I0318 12:48:00.302617 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mggk\" (UniqueName: \"kubernetes.io/projected/a22105f2-fd27-4d7a-8b2a-26770e6ddfab-kube-api-access-8mggk\") pod \"auto-csr-approver-29563968-66h99\" (UID: \"a22105f2-fd27-4d7a-8b2a-26770e6ddfab\") " pod="openshift-infra/auto-csr-approver-29563968-66h99" Mar 18 12:48:00 crc kubenswrapper[4843]: I0318 12:48:00.473429 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563968-66h99" Mar 18 12:48:00 crc kubenswrapper[4843]: W0318 12:48:00.984263 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda22105f2_fd27_4d7a_8b2a_26770e6ddfab.slice/crio-065eb887e71b69a1d28f2c23dbd3034be3c65a336b685f2eb382926e2009a825 WatchSource:0}: Error finding container 065eb887e71b69a1d28f2c23dbd3034be3c65a336b685f2eb382926e2009a825: Status 404 returned error can't find the container with id 065eb887e71b69a1d28f2c23dbd3034be3c65a336b685f2eb382926e2009a825 Mar 18 12:48:01 crc kubenswrapper[4843]: I0318 12:48:01.011786 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563968-66h99"] Mar 18 12:48:01 crc kubenswrapper[4843]: I0318 12:48:01.330819 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563968-66h99" event={"ID":"a22105f2-fd27-4d7a-8b2a-26770e6ddfab","Type":"ContainerStarted","Data":"065eb887e71b69a1d28f2c23dbd3034be3c65a336b685f2eb382926e2009a825"} Mar 18 12:48:03 crc kubenswrapper[4843]: I0318 12:48:03.353384 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563968-66h99" 
event={"ID":"a22105f2-fd27-4d7a-8b2a-26770e6ddfab","Type":"ContainerStarted","Data":"a9b50367b26fab31d281d048dfe6854d683091c8b0632122aaa08bf861cc864c"} Mar 18 12:48:03 crc kubenswrapper[4843]: I0318 12:48:03.370134 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563968-66h99" podStartSLOduration=1.396437903 podStartE2EDuration="3.370113823s" podCreationTimestamp="2026-03-18 12:48:00 +0000 UTC" firstStartedPulling="2026-03-18 12:48:00.986815985 +0000 UTC m=+2314.702641509" lastFinishedPulling="2026-03-18 12:48:02.960491905 +0000 UTC m=+2316.676317429" observedRunningTime="2026-03-18 12:48:03.366541131 +0000 UTC m=+2317.082366705" watchObservedRunningTime="2026-03-18 12:48:03.370113823 +0000 UTC m=+2317.085939347" Mar 18 12:48:04 crc kubenswrapper[4843]: I0318 12:48:04.363092 4843 generic.go:334] "Generic (PLEG): container finished" podID="a22105f2-fd27-4d7a-8b2a-26770e6ddfab" containerID="a9b50367b26fab31d281d048dfe6854d683091c8b0632122aaa08bf861cc864c" exitCode=0 Mar 18 12:48:04 crc kubenswrapper[4843]: I0318 12:48:04.363188 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563968-66h99" event={"ID":"a22105f2-fd27-4d7a-8b2a-26770e6ddfab","Type":"ContainerDied","Data":"a9b50367b26fab31d281d048dfe6854d683091c8b0632122aaa08bf861cc864c"} Mar 18 12:48:05 crc kubenswrapper[4843]: I0318 12:48:05.800540 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563968-66h99" Mar 18 12:48:05 crc kubenswrapper[4843]: I0318 12:48:05.866380 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mggk\" (UniqueName: \"kubernetes.io/projected/a22105f2-fd27-4d7a-8b2a-26770e6ddfab-kube-api-access-8mggk\") pod \"a22105f2-fd27-4d7a-8b2a-26770e6ddfab\" (UID: \"a22105f2-fd27-4d7a-8b2a-26770e6ddfab\") " Mar 18 12:48:05 crc kubenswrapper[4843]: I0318 12:48:05.873903 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a22105f2-fd27-4d7a-8b2a-26770e6ddfab-kube-api-access-8mggk" (OuterVolumeSpecName: "kube-api-access-8mggk") pod "a22105f2-fd27-4d7a-8b2a-26770e6ddfab" (UID: "a22105f2-fd27-4d7a-8b2a-26770e6ddfab"). InnerVolumeSpecName "kube-api-access-8mggk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:48:05 crc kubenswrapper[4843]: I0318 12:48:05.969012 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mggk\" (UniqueName: \"kubernetes.io/projected/a22105f2-fd27-4d7a-8b2a-26770e6ddfab-kube-api-access-8mggk\") on node \"crc\" DevicePath \"\"" Mar 18 12:48:06 crc kubenswrapper[4843]: I0318 12:48:06.389017 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563968-66h99" Mar 18 12:48:06 crc kubenswrapper[4843]: I0318 12:48:06.389040 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563968-66h99" event={"ID":"a22105f2-fd27-4d7a-8b2a-26770e6ddfab","Type":"ContainerDied","Data":"065eb887e71b69a1d28f2c23dbd3034be3c65a336b685f2eb382926e2009a825"} Mar 18 12:48:06 crc kubenswrapper[4843]: I0318 12:48:06.389125 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="065eb887e71b69a1d28f2c23dbd3034be3c65a336b685f2eb382926e2009a825" Mar 18 12:48:06 crc kubenswrapper[4843]: I0318 12:48:06.458661 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563962-k5ln6"] Mar 18 12:48:06 crc kubenswrapper[4843]: I0318 12:48:06.466132 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563962-k5ln6"] Mar 18 12:48:07 crc kubenswrapper[4843]: I0318 12:48:07.002211 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15b06ed2-150d-4ded-97d8-24201f827e09" path="/var/lib/kubelet/pods/15b06ed2-150d-4ded-97d8-24201f827e09/volumes" Mar 18 12:48:20 crc kubenswrapper[4843]: I0318 12:48:20.035058 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:48:20 crc kubenswrapper[4843]: I0318 12:48:20.035711 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:48:32 crc 
kubenswrapper[4843]: I0318 12:48:32.697123 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n99xv"] Mar 18 12:48:32 crc kubenswrapper[4843]: E0318 12:48:32.698748 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22105f2-fd27-4d7a-8b2a-26770e6ddfab" containerName="oc" Mar 18 12:48:32 crc kubenswrapper[4843]: I0318 12:48:32.698782 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22105f2-fd27-4d7a-8b2a-26770e6ddfab" containerName="oc" Mar 18 12:48:32 crc kubenswrapper[4843]: I0318 12:48:32.699281 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="a22105f2-fd27-4d7a-8b2a-26770e6ddfab" containerName="oc" Mar 18 12:48:32 crc kubenswrapper[4843]: I0318 12:48:32.703048 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n99xv" Mar 18 12:48:32 crc kubenswrapper[4843]: I0318 12:48:32.717574 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n99xv"] Mar 18 12:48:32 crc kubenswrapper[4843]: I0318 12:48:32.746880 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46-utilities\") pod \"certified-operators-n99xv\" (UID: \"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46\") " pod="openshift-marketplace/certified-operators-n99xv" Mar 18 12:48:32 crc kubenswrapper[4843]: I0318 12:48:32.747038 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46-catalog-content\") pod \"certified-operators-n99xv\" (UID: \"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46\") " pod="openshift-marketplace/certified-operators-n99xv" Mar 18 12:48:32 crc kubenswrapper[4843]: I0318 12:48:32.747083 4843 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppxtf\" (UniqueName: \"kubernetes.io/projected/3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46-kube-api-access-ppxtf\") pod \"certified-operators-n99xv\" (UID: \"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46\") " pod="openshift-marketplace/certified-operators-n99xv" Mar 18 12:48:32 crc kubenswrapper[4843]: I0318 12:48:32.849277 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46-utilities\") pod \"certified-operators-n99xv\" (UID: \"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46\") " pod="openshift-marketplace/certified-operators-n99xv" Mar 18 12:48:32 crc kubenswrapper[4843]: I0318 12:48:32.849405 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46-catalog-content\") pod \"certified-operators-n99xv\" (UID: \"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46\") " pod="openshift-marketplace/certified-operators-n99xv" Mar 18 12:48:32 crc kubenswrapper[4843]: I0318 12:48:32.849439 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppxtf\" (UniqueName: \"kubernetes.io/projected/3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46-kube-api-access-ppxtf\") pod \"certified-operators-n99xv\" (UID: \"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46\") " pod="openshift-marketplace/certified-operators-n99xv" Mar 18 12:48:32 crc kubenswrapper[4843]: I0318 12:48:32.849829 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46-utilities\") pod \"certified-operators-n99xv\" (UID: \"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46\") " pod="openshift-marketplace/certified-operators-n99xv" Mar 18 12:48:32 crc kubenswrapper[4843]: I0318 12:48:32.850040 4843 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46-catalog-content\") pod \"certified-operators-n99xv\" (UID: \"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46\") " pod="openshift-marketplace/certified-operators-n99xv" Mar 18 12:48:32 crc kubenswrapper[4843]: I0318 12:48:32.872254 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppxtf\" (UniqueName: \"kubernetes.io/projected/3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46-kube-api-access-ppxtf\") pod \"certified-operators-n99xv\" (UID: \"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46\") " pod="openshift-marketplace/certified-operators-n99xv" Mar 18 12:48:33 crc kubenswrapper[4843]: I0318 12:48:33.024297 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n99xv" Mar 18 12:48:33 crc kubenswrapper[4843]: I0318 12:48:33.649113 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n99xv"] Mar 18 12:48:34 crc kubenswrapper[4843]: I0318 12:48:34.155822 4843 generic.go:334] "Generic (PLEG): container finished" podID="3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46" containerID="bf5ea51ca1dea2b807519b793278c4fd37e0d8f3513c388ca7c41252c74911fa" exitCode=0 Mar 18 12:48:34 crc kubenswrapper[4843]: I0318 12:48:34.155911 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n99xv" event={"ID":"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46","Type":"ContainerDied","Data":"bf5ea51ca1dea2b807519b793278c4fd37e0d8f3513c388ca7c41252c74911fa"} Mar 18 12:48:34 crc kubenswrapper[4843]: I0318 12:48:34.156166 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n99xv" event={"ID":"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46","Type":"ContainerStarted","Data":"381be94dd71b05dce4968f26f190f8e5ac0b5d4e50cb9bb5d8ed56f6d31f7141"} Mar 18 12:48:36 crc 
kubenswrapper[4843]: I0318 12:48:36.175734 4843 generic.go:334] "Generic (PLEG): container finished" podID="3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46" containerID="0ed631be61c364cf7a29b3d63c97deae288ba429ae222c61c2df94b1f906fae3" exitCode=0 Mar 18 12:48:36 crc kubenswrapper[4843]: I0318 12:48:36.175939 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n99xv" event={"ID":"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46","Type":"ContainerDied","Data":"0ed631be61c364cf7a29b3d63c97deae288ba429ae222c61c2df94b1f906fae3"} Mar 18 12:48:37 crc kubenswrapper[4843]: I0318 12:48:37.196901 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n99xv" event={"ID":"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46","Type":"ContainerStarted","Data":"b72b25e18e289e6fd24aeae0a49b2c289f99c018f130f096b7d61bb3aa60cc1c"} Mar 18 12:48:37 crc kubenswrapper[4843]: I0318 12:48:37.224300 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n99xv" podStartSLOduration=2.566553802 podStartE2EDuration="5.224276555s" podCreationTimestamp="2026-03-18 12:48:32 +0000 UTC" firstStartedPulling="2026-03-18 12:48:34.158271082 +0000 UTC m=+2347.874096616" lastFinishedPulling="2026-03-18 12:48:36.815993845 +0000 UTC m=+2350.531819369" observedRunningTime="2026-03-18 12:48:37.217541434 +0000 UTC m=+2350.933366968" watchObservedRunningTime="2026-03-18 12:48:37.224276555 +0000 UTC m=+2350.940102099" Mar 18 12:48:43 crc kubenswrapper[4843]: I0318 12:48:43.025194 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n99xv" Mar 18 12:48:43 crc kubenswrapper[4843]: I0318 12:48:43.025742 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n99xv" Mar 18 12:48:43 crc kubenswrapper[4843]: I0318 12:48:43.119192 4843 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-n99xv" Mar 18 12:48:43 crc kubenswrapper[4843]: I0318 12:48:43.632448 4843 generic.go:334] "Generic (PLEG): container finished" podID="8894c42c-6e17-4c87-84e1-1888c1800b04" containerID="3984d7913f66c60fcefd709dd49bc640e937a34299a0be9b184e6d2779fb1e5a" exitCode=0 Mar 18 12:48:43 crc kubenswrapper[4843]: I0318 12:48:43.632563 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" event={"ID":"8894c42c-6e17-4c87-84e1-1888c1800b04","Type":"ContainerDied","Data":"3984d7913f66c60fcefd709dd49bc640e937a34299a0be9b184e6d2779fb1e5a"} Mar 18 12:48:43 crc kubenswrapper[4843]: I0318 12:48:43.689305 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n99xv" Mar 18 12:48:43 crc kubenswrapper[4843]: I0318 12:48:43.743895 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n99xv"] Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.042960 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.158984 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-nova-metadata-neutron-config-0\") pod \"8894c42c-6e17-4c87-84e1-1888c1800b04\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.159474 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-inventory\") pod \"8894c42c-6e17-4c87-84e1-1888c1800b04\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.159971 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-neutron-ovn-metadata-agent-neutron-config-0\") pod \"8894c42c-6e17-4c87-84e1-1888c1800b04\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.160285 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48gxz\" (UniqueName: \"kubernetes.io/projected/8894c42c-6e17-4c87-84e1-1888c1800b04-kube-api-access-48gxz\") pod \"8894c42c-6e17-4c87-84e1-1888c1800b04\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.160384 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-neutron-metadata-combined-ca-bundle\") pod \"8894c42c-6e17-4c87-84e1-1888c1800b04\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " Mar 
18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.160462 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-ssh-key-openstack-edpm-ipam\") pod \"8894c42c-6e17-4c87-84e1-1888c1800b04\" (UID: \"8894c42c-6e17-4c87-84e1-1888c1800b04\") " Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.166318 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8894c42c-6e17-4c87-84e1-1888c1800b04-kube-api-access-48gxz" (OuterVolumeSpecName: "kube-api-access-48gxz") pod "8894c42c-6e17-4c87-84e1-1888c1800b04" (UID: "8894c42c-6e17-4c87-84e1-1888c1800b04"). InnerVolumeSpecName "kube-api-access-48gxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.166418 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "8894c42c-6e17-4c87-84e1-1888c1800b04" (UID: "8894c42c-6e17-4c87-84e1-1888c1800b04"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.194427 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8894c42c-6e17-4c87-84e1-1888c1800b04" (UID: "8894c42c-6e17-4c87-84e1-1888c1800b04"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.194808 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "8894c42c-6e17-4c87-84e1-1888c1800b04" (UID: "8894c42c-6e17-4c87-84e1-1888c1800b04"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.197848 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-inventory" (OuterVolumeSpecName: "inventory") pod "8894c42c-6e17-4c87-84e1-1888c1800b04" (UID: "8894c42c-6e17-4c87-84e1-1888c1800b04"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.211314 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "8894c42c-6e17-4c87-84e1-1888c1800b04" (UID: "8894c42c-6e17-4c87-84e1-1888c1800b04"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.271300 4843 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.271350 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.271362 4843 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.271372 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.271383 4843 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8894c42c-6e17-4c87-84e1-1888c1800b04-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.271393 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48gxz\" (UniqueName: \"kubernetes.io/projected/8894c42c-6e17-4c87-84e1-1888c1800b04-kube-api-access-48gxz\") on node \"crc\" DevicePath \"\"" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.650628 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" event={"ID":"8894c42c-6e17-4c87-84e1-1888c1800b04","Type":"ContainerDied","Data":"24967846c4214ed467b7523f7b5336661ef97ba7418adab6cbbae460014ca81b"} Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.650722 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24967846c4214ed467b7523f7b5336661ef97ba7418adab6cbbae460014ca81b" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.650672 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.650764 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n99xv" podUID="3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46" containerName="registry-server" containerID="cri-o://b72b25e18e289e6fd24aeae0a49b2c289f99c018f130f096b7d61bb3aa60cc1c" gracePeriod=2 Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.785410 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8"] Mar 18 12:48:45 crc kubenswrapper[4843]: E0318 12:48:45.785905 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8894c42c-6e17-4c87-84e1-1888c1800b04" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.785932 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="8894c42c-6e17-4c87-84e1-1888c1800b04" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.786209 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="8894c42c-6e17-4c87-84e1-1888c1800b04" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.786883 4843 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.791696 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.791912 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.791991 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.792093 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.799380 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.809072 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8"] Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.885890 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc2kb\" (UniqueName: \"kubernetes.io/projected/bbe53df4-b4eb-4ff6-ac59-f6532974af67-kube-api-access-qc2kb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.886314 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-ssh-key-openstack-edpm-ipam\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.886362 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.886448 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.886491 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.988272 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" Mar 18 12:48:45 
crc kubenswrapper[4843]: I0318 12:48:45.988333 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.989111 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.989155 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.989199 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc2kb\" (UniqueName: \"kubernetes.io/projected/bbe53df4-b4eb-4ff6-ac59-f6532974af67-kube-api-access-qc2kb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.994280 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-libvirt-secret-0\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.994290 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" Mar 18 12:48:45 crc kubenswrapper[4843]: I0318 12:48:45.997073 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.008400 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.009597 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc2kb\" (UniqueName: \"kubernetes.io/projected/bbe53df4-b4eb-4ff6-ac59-f6532974af67-kube-api-access-qc2kb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.158440 4843 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.258325 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n99xv" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.397313 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46-utilities\") pod \"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46\" (UID: \"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46\") " Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.400961 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46-utilities" (OuterVolumeSpecName: "utilities") pod "3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46" (UID: "3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.401080 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46-catalog-content\") pod \"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46\" (UID: \"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46\") " Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.401250 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppxtf\" (UniqueName: \"kubernetes.io/projected/3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46-kube-api-access-ppxtf\") pod \"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46\" (UID: \"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46\") " Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.402087 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.412636 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46-kube-api-access-ppxtf" (OuterVolumeSpecName: "kube-api-access-ppxtf") pod "3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46" (UID: "3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46"). InnerVolumeSpecName "kube-api-access-ppxtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.479314 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46" (UID: "3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.503839 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.503882 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppxtf\" (UniqueName: \"kubernetes.io/projected/3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46-kube-api-access-ppxtf\") on node \"crc\" DevicePath \"\"" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.666933 4843 generic.go:334] "Generic (PLEG): container finished" podID="3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46" containerID="b72b25e18e289e6fd24aeae0a49b2c289f99c018f130f096b7d61bb3aa60cc1c" exitCode=0 Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.667056 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n99xv" event={"ID":"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46","Type":"ContainerDied","Data":"b72b25e18e289e6fd24aeae0a49b2c289f99c018f130f096b7d61bb3aa60cc1c"} Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.667096 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n99xv" event={"ID":"3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46","Type":"ContainerDied","Data":"381be94dd71b05dce4968f26f190f8e5ac0b5d4e50cb9bb5d8ed56f6d31f7141"} Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.667119 4843 scope.go:117] "RemoveContainer" containerID="b72b25e18e289e6fd24aeae0a49b2c289f99c018f130f096b7d61bb3aa60cc1c" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.667322 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n99xv" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.688447 4843 scope.go:117] "RemoveContainer" containerID="0ed631be61c364cf7a29b3d63c97deae288ba429ae222c61c2df94b1f906fae3" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.748373 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n99xv"] Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.769266 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n99xv"] Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.770206 4843 scope.go:117] "RemoveContainer" containerID="bf5ea51ca1dea2b807519b793278c4fd37e0d8f3513c388ca7c41252c74911fa" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.779711 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8"] Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.794792 4843 scope.go:117] "RemoveContainer" containerID="b72b25e18e289e6fd24aeae0a49b2c289f99c018f130f096b7d61bb3aa60cc1c" Mar 18 12:48:46 crc kubenswrapper[4843]: E0318 12:48:46.795198 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b72b25e18e289e6fd24aeae0a49b2c289f99c018f130f096b7d61bb3aa60cc1c\": container with ID starting with b72b25e18e289e6fd24aeae0a49b2c289f99c018f130f096b7d61bb3aa60cc1c not found: ID does not exist" containerID="b72b25e18e289e6fd24aeae0a49b2c289f99c018f130f096b7d61bb3aa60cc1c" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.795242 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b72b25e18e289e6fd24aeae0a49b2c289f99c018f130f096b7d61bb3aa60cc1c"} err="failed to get container status \"b72b25e18e289e6fd24aeae0a49b2c289f99c018f130f096b7d61bb3aa60cc1c\": rpc error: code = NotFound desc = could not find 
container \"b72b25e18e289e6fd24aeae0a49b2c289f99c018f130f096b7d61bb3aa60cc1c\": container with ID starting with b72b25e18e289e6fd24aeae0a49b2c289f99c018f130f096b7d61bb3aa60cc1c not found: ID does not exist" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.795271 4843 scope.go:117] "RemoveContainer" containerID="0ed631be61c364cf7a29b3d63c97deae288ba429ae222c61c2df94b1f906fae3" Mar 18 12:48:46 crc kubenswrapper[4843]: E0318 12:48:46.795548 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ed631be61c364cf7a29b3d63c97deae288ba429ae222c61c2df94b1f906fae3\": container with ID starting with 0ed631be61c364cf7a29b3d63c97deae288ba429ae222c61c2df94b1f906fae3 not found: ID does not exist" containerID="0ed631be61c364cf7a29b3d63c97deae288ba429ae222c61c2df94b1f906fae3" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.795590 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ed631be61c364cf7a29b3d63c97deae288ba429ae222c61c2df94b1f906fae3"} err="failed to get container status \"0ed631be61c364cf7a29b3d63c97deae288ba429ae222c61c2df94b1f906fae3\": rpc error: code = NotFound desc = could not find container \"0ed631be61c364cf7a29b3d63c97deae288ba429ae222c61c2df94b1f906fae3\": container with ID starting with 0ed631be61c364cf7a29b3d63c97deae288ba429ae222c61c2df94b1f906fae3 not found: ID does not exist" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.795618 4843 scope.go:117] "RemoveContainer" containerID="bf5ea51ca1dea2b807519b793278c4fd37e0d8f3513c388ca7c41252c74911fa" Mar 18 12:48:46 crc kubenswrapper[4843]: E0318 12:48:46.796033 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf5ea51ca1dea2b807519b793278c4fd37e0d8f3513c388ca7c41252c74911fa\": container with ID starting with bf5ea51ca1dea2b807519b793278c4fd37e0d8f3513c388ca7c41252c74911fa not found: ID does 
not exist" containerID="bf5ea51ca1dea2b807519b793278c4fd37e0d8f3513c388ca7c41252c74911fa" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.796081 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf5ea51ca1dea2b807519b793278c4fd37e0d8f3513c388ca7c41252c74911fa"} err="failed to get container status \"bf5ea51ca1dea2b807519b793278c4fd37e0d8f3513c388ca7c41252c74911fa\": rpc error: code = NotFound desc = could not find container \"bf5ea51ca1dea2b807519b793278c4fd37e0d8f3513c388ca7c41252c74911fa\": container with ID starting with bf5ea51ca1dea2b807519b793278c4fd37e0d8f3513c388ca7c41252c74911fa not found: ID does not exist" Mar 18 12:48:46 crc kubenswrapper[4843]: I0318 12:48:46.999412 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46" path="/var/lib/kubelet/pods/3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46/volumes" Mar 18 12:48:47 crc kubenswrapper[4843]: I0318 12:48:47.690761 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" event={"ID":"bbe53df4-b4eb-4ff6-ac59-f6532974af67","Type":"ContainerStarted","Data":"e9b94c49308bb88995d8adcefb0c6ff40c4ded3cd312323f541b4bbd6be36fc1"} Mar 18 12:48:48 crc kubenswrapper[4843]: I0318 12:48:48.701099 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" event={"ID":"bbe53df4-b4eb-4ff6-ac59-f6532974af67","Type":"ContainerStarted","Data":"d83ce1748f9709fe4c2f0c2948b6a598a1de9b07614114e1600f7e7650643985"} Mar 18 12:48:50 crc kubenswrapper[4843]: I0318 12:48:50.035548 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:48:50 crc kubenswrapper[4843]: 
I0318 12:48:50.035941 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:48:50 crc kubenswrapper[4843]: I0318 12:48:50.035994 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:48:50 crc kubenswrapper[4843]: I0318 12:48:50.036875 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"} pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:48:50 crc kubenswrapper[4843]: I0318 12:48:50.036940 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" containerID="cri-o://ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276" gracePeriod=600 Mar 18 12:48:50 crc kubenswrapper[4843]: E0318 12:48:50.173113 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:48:50 crc kubenswrapper[4843]: I0318 12:48:50.766964 4843 generic.go:334] "Generic (PLEG): container finished" 
podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276" exitCode=0 Mar 18 12:48:50 crc kubenswrapper[4843]: I0318 12:48:50.767264 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"} Mar 18 12:48:50 crc kubenswrapper[4843]: I0318 12:48:50.767301 4843 scope.go:117] "RemoveContainer" containerID="adf4e676a33d49fbbd7d287fa0d1930ae6c4bf2cf91763487c46260a988c52c0" Mar 18 12:48:50 crc kubenswrapper[4843]: I0318 12:48:50.772676 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276" Mar 18 12:48:50 crc kubenswrapper[4843]: E0318 12:48:50.773174 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:48:50 crc kubenswrapper[4843]: I0318 12:48:50.843822 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" podStartSLOduration=4.861810252 podStartE2EDuration="5.843794759s" podCreationTimestamp="2026-03-18 12:48:45 +0000 UTC" firstStartedPulling="2026-03-18 12:48:46.770306147 +0000 UTC m=+2360.486131671" lastFinishedPulling="2026-03-18 12:48:47.752290644 +0000 UTC m=+2361.468116178" observedRunningTime="2026-03-18 12:48:48.724963997 +0000 UTC m=+2362.440789531" watchObservedRunningTime="2026-03-18 12:48:50.843794759 +0000 UTC m=+2364.559620283" Mar 18 12:49:02 crc 
kubenswrapper[4843]: I0318 12:49:02.984145 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276" Mar 18 12:49:02 crc kubenswrapper[4843]: E0318 12:49:02.984984 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:49:04 crc kubenswrapper[4843]: I0318 12:49:04.849825 4843 scope.go:117] "RemoveContainer" containerID="85a062706b3b829cd2163da569e59875da2b30444fccd8da01899ee16000ef28" Mar 18 12:49:13 crc kubenswrapper[4843]: I0318 12:49:13.989767 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276" Mar 18 12:49:13 crc kubenswrapper[4843]: E0318 12:49:13.991905 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:49:24 crc kubenswrapper[4843]: I0318 12:49:24.984396 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276" Mar 18 12:49:24 crc kubenswrapper[4843]: E0318 12:49:24.985479 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:49:38 crc kubenswrapper[4843]: I0318 12:49:38.984288 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276" Mar 18 12:49:38 crc kubenswrapper[4843]: E0318 12:49:38.985112 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:49:50 crc kubenswrapper[4843]: I0318 12:49:50.984593 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276" Mar 18 12:49:50 crc kubenswrapper[4843]: E0318 12:49:50.985408 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:50:00 crc kubenswrapper[4843]: I0318 12:50:00.152514 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563970-n6nzp"] Mar 18 12:50:00 crc kubenswrapper[4843]: E0318 12:50:00.153283 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46" containerName="extract-utilities" Mar 18 12:50:00 crc kubenswrapper[4843]: I0318 
12:50:00.153294 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46" containerName="extract-utilities" Mar 18 12:50:00 crc kubenswrapper[4843]: E0318 12:50:00.153314 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46" containerName="extract-content" Mar 18 12:50:00 crc kubenswrapper[4843]: I0318 12:50:00.153321 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46" containerName="extract-content" Mar 18 12:50:00 crc kubenswrapper[4843]: E0318 12:50:00.153341 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46" containerName="registry-server" Mar 18 12:50:00 crc kubenswrapper[4843]: I0318 12:50:00.153347 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46" containerName="registry-server" Mar 18 12:50:00 crc kubenswrapper[4843]: I0318 12:50:00.153528 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c3211da-4cc5-4f02-9f1f-1ec6c86b3c46" containerName="registry-server" Mar 18 12:50:00 crc kubenswrapper[4843]: I0318 12:50:00.154212 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563970-n6nzp" Mar 18 12:50:00 crc kubenswrapper[4843]: I0318 12:50:00.164042 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:50:00 crc kubenswrapper[4843]: I0318 12:50:00.164486 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563970-n6nzp"] Mar 18 12:50:00 crc kubenswrapper[4843]: I0318 12:50:00.165212 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:50:00 crc kubenswrapper[4843]: I0318 12:50:00.165365 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 12:50:00 crc kubenswrapper[4843]: I0318 12:50:00.245091 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8z22\" (UniqueName: \"kubernetes.io/projected/d0d24420-11b7-4859-9b97-99bc5a58c8a1-kube-api-access-s8z22\") pod \"auto-csr-approver-29563970-n6nzp\" (UID: \"d0d24420-11b7-4859-9b97-99bc5a58c8a1\") " pod="openshift-infra/auto-csr-approver-29563970-n6nzp" Mar 18 12:50:00 crc kubenswrapper[4843]: I0318 12:50:00.346643 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8z22\" (UniqueName: \"kubernetes.io/projected/d0d24420-11b7-4859-9b97-99bc5a58c8a1-kube-api-access-s8z22\") pod \"auto-csr-approver-29563970-n6nzp\" (UID: \"d0d24420-11b7-4859-9b97-99bc5a58c8a1\") " pod="openshift-infra/auto-csr-approver-29563970-n6nzp" Mar 18 12:50:00 crc kubenswrapper[4843]: I0318 12:50:00.378888 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8z22\" (UniqueName: \"kubernetes.io/projected/d0d24420-11b7-4859-9b97-99bc5a58c8a1-kube-api-access-s8z22\") pod \"auto-csr-approver-29563970-n6nzp\" (UID: \"d0d24420-11b7-4859-9b97-99bc5a58c8a1\") " 
pod="openshift-infra/auto-csr-approver-29563970-n6nzp" Mar 18 12:50:00 crc kubenswrapper[4843]: I0318 12:50:00.474238 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563970-n6nzp" Mar 18 12:50:00 crc kubenswrapper[4843]: I0318 12:50:00.956237 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563970-n6nzp"] Mar 18 12:50:01 crc kubenswrapper[4843]: I0318 12:50:01.543527 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563970-n6nzp" event={"ID":"d0d24420-11b7-4859-9b97-99bc5a58c8a1","Type":"ContainerStarted","Data":"3eba267b31e4479aa7ab18818d2761a45cb5082a102ae431db9e3d3d1a5a3737"} Mar 18 12:50:02 crc kubenswrapper[4843]: I0318 12:50:02.984174 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276" Mar 18 12:50:02 crc kubenswrapper[4843]: E0318 12:50:02.984707 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:50:03 crc kubenswrapper[4843]: I0318 12:50:03.563979 4843 generic.go:334] "Generic (PLEG): container finished" podID="d0d24420-11b7-4859-9b97-99bc5a58c8a1" containerID="47bf1a38d829c1cf2ebae9958013f33b7c55f6d3563c72a06132d0d1a973466a" exitCode=0 Mar 18 12:50:03 crc kubenswrapper[4843]: I0318 12:50:03.564031 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563970-n6nzp" event={"ID":"d0d24420-11b7-4859-9b97-99bc5a58c8a1","Type":"ContainerDied","Data":"47bf1a38d829c1cf2ebae9958013f33b7c55f6d3563c72a06132d0d1a973466a"} 
Mar 18 12:50:04 crc kubenswrapper[4843]: I0318 12:50:04.904330 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563970-n6nzp"
Mar 18 12:50:05 crc kubenswrapper[4843]: I0318 12:50:05.038760 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8z22\" (UniqueName: \"kubernetes.io/projected/d0d24420-11b7-4859-9b97-99bc5a58c8a1-kube-api-access-s8z22\") pod \"d0d24420-11b7-4859-9b97-99bc5a58c8a1\" (UID: \"d0d24420-11b7-4859-9b97-99bc5a58c8a1\") "
Mar 18 12:50:05 crc kubenswrapper[4843]: I0318 12:50:05.044349 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0d24420-11b7-4859-9b97-99bc5a58c8a1-kube-api-access-s8z22" (OuterVolumeSpecName: "kube-api-access-s8z22") pod "d0d24420-11b7-4859-9b97-99bc5a58c8a1" (UID: "d0d24420-11b7-4859-9b97-99bc5a58c8a1"). InnerVolumeSpecName "kube-api-access-s8z22". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:50:05 crc kubenswrapper[4843]: I0318 12:50:05.142108 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8z22\" (UniqueName: \"kubernetes.io/projected/d0d24420-11b7-4859-9b97-99bc5a58c8a1-kube-api-access-s8z22\") on node \"crc\" DevicePath \"\""
Mar 18 12:50:05 crc kubenswrapper[4843]: I0318 12:50:05.592013 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563970-n6nzp" event={"ID":"d0d24420-11b7-4859-9b97-99bc5a58c8a1","Type":"ContainerDied","Data":"3eba267b31e4479aa7ab18818d2761a45cb5082a102ae431db9e3d3d1a5a3737"}
Mar 18 12:50:05 crc kubenswrapper[4843]: I0318 12:50:05.592315 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3eba267b31e4479aa7ab18818d2761a45cb5082a102ae431db9e3d3d1a5a3737"
Mar 18 12:50:05 crc kubenswrapper[4843]: I0318 12:50:05.592379 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563970-n6nzp"
Mar 18 12:50:06 crc kubenswrapper[4843]: I0318 12:50:06.004966 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563964-7vc65"]
Mar 18 12:50:06 crc kubenswrapper[4843]: I0318 12:50:06.012639 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563964-7vc65"]
Mar 18 12:50:06 crc kubenswrapper[4843]: I0318 12:50:06.995342 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddb370c2-9293-469a-bfbf-3f2ed7906e0d" path="/var/lib/kubelet/pods/ddb370c2-9293-469a-bfbf-3f2ed7906e0d/volumes"
Mar 18 12:50:15 crc kubenswrapper[4843]: I0318 12:50:15.983634 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"
Mar 18 12:50:15 crc kubenswrapper[4843]: E0318 12:50:15.984597 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 12:50:30 crc kubenswrapper[4843]: I0318 12:50:30.984212 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"
Mar 18 12:50:30 crc kubenswrapper[4843]: E0318 12:50:30.985534 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 12:50:42 crc kubenswrapper[4843]: I0318 12:50:42.984727 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"
Mar 18 12:50:42 crc kubenswrapper[4843]: E0318 12:50:42.985363 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 12:50:53 crc kubenswrapper[4843]: I0318 12:50:53.983924 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"
Mar 18 12:50:53 crc kubenswrapper[4843]: E0318 12:50:53.984825 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 12:51:04 crc kubenswrapper[4843]: I0318 12:51:04.983990 4843 scope.go:117] "RemoveContainer" containerID="fb09b19d4bc1217603d3cf98120ebab664e4c6a5cf05157ae90fcf1b4e58c83b"
Mar 18 12:51:08 crc kubenswrapper[4843]: I0318 12:51:08.984704 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"
Mar 18 12:51:08 crc kubenswrapper[4843]: E0318 12:51:08.985206 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 12:51:23 crc kubenswrapper[4843]: I0318 12:51:23.984251 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"
Mar 18 12:51:23 crc kubenswrapper[4843]: E0318 12:51:23.985084 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 12:51:37 crc kubenswrapper[4843]: I0318 12:51:37.985246 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"
Mar 18 12:51:37 crc kubenswrapper[4843]: E0318 12:51:37.986508 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 12:51:52 crc kubenswrapper[4843]: I0318 12:51:52.084866 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"
Mar 18 12:51:52 crc kubenswrapper[4843]: E0318 12:51:52.086396 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 12:52:00 crc kubenswrapper[4843]: I0318 12:52:00.175138 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563972-2hl2b"]
Mar 18 12:52:00 crc kubenswrapper[4843]: E0318 12:52:00.176378 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0d24420-11b7-4859-9b97-99bc5a58c8a1" containerName="oc"
Mar 18 12:52:00 crc kubenswrapper[4843]: I0318 12:52:00.176406 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0d24420-11b7-4859-9b97-99bc5a58c8a1" containerName="oc"
Mar 18 12:52:00 crc kubenswrapper[4843]: I0318 12:52:00.176898 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0d24420-11b7-4859-9b97-99bc5a58c8a1" containerName="oc"
Mar 18 12:52:00 crc kubenswrapper[4843]: I0318 12:52:00.178732 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563972-2hl2b"
Mar 18 12:52:00 crc kubenswrapper[4843]: I0318 12:52:00.181024 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk"
Mar 18 12:52:00 crc kubenswrapper[4843]: I0318 12:52:00.181563 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 12:52:00 crc kubenswrapper[4843]: I0318 12:52:00.183049 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 12:52:00 crc kubenswrapper[4843]: I0318 12:52:00.193541 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563972-2hl2b"]
Mar 18 12:52:00 crc kubenswrapper[4843]: I0318 12:52:00.235838 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2x7p\" (UniqueName: \"kubernetes.io/projected/c952cae7-b1db-4558-9084-3294be4a4d57-kube-api-access-w2x7p\") pod \"auto-csr-approver-29563972-2hl2b\" (UID: \"c952cae7-b1db-4558-9084-3294be4a4d57\") " pod="openshift-infra/auto-csr-approver-29563972-2hl2b"
Mar 18 12:52:00 crc kubenswrapper[4843]: I0318 12:52:00.337572 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2x7p\" (UniqueName: \"kubernetes.io/projected/c952cae7-b1db-4558-9084-3294be4a4d57-kube-api-access-w2x7p\") pod \"auto-csr-approver-29563972-2hl2b\" (UID: \"c952cae7-b1db-4558-9084-3294be4a4d57\") " pod="openshift-infra/auto-csr-approver-29563972-2hl2b"
Mar 18 12:52:00 crc kubenswrapper[4843]: I0318 12:52:00.361550 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2x7p\" (UniqueName: \"kubernetes.io/projected/c952cae7-b1db-4558-9084-3294be4a4d57-kube-api-access-w2x7p\") pod \"auto-csr-approver-29563972-2hl2b\" (UID: \"c952cae7-b1db-4558-9084-3294be4a4d57\") " pod="openshift-infra/auto-csr-approver-29563972-2hl2b"
Mar 18 12:52:00 crc kubenswrapper[4843]: I0318 12:52:00.536524 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563972-2hl2b"
Mar 18 12:52:01 crc kubenswrapper[4843]: I0318 12:52:01.042831 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563972-2hl2b"]
Mar 18 12:52:01 crc kubenswrapper[4843]: I0318 12:52:01.065879 4843 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 12:52:01 crc kubenswrapper[4843]: I0318 12:52:01.906790 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563972-2hl2b" event={"ID":"c952cae7-b1db-4558-9084-3294be4a4d57","Type":"ContainerStarted","Data":"85cd87f96b92588d476f951f6096ee23caf546455d2dd1d71ef6add2a259fbd3"}
Mar 18 12:52:02 crc kubenswrapper[4843]: I0318 12:52:02.921015 4843 generic.go:334] "Generic (PLEG): container finished" podID="c952cae7-b1db-4558-9084-3294be4a4d57" containerID="2be2299fce2da948e3385cbc0877e14046b370b7491ac52cc5bc156998f560c0" exitCode=0
Mar 18 12:52:02 crc kubenswrapper[4843]: I0318 12:52:02.921124 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563972-2hl2b" event={"ID":"c952cae7-b1db-4558-9084-3294be4a4d57","Type":"ContainerDied","Data":"2be2299fce2da948e3385cbc0877e14046b370b7491ac52cc5bc156998f560c0"}
Mar 18 12:52:03 crc kubenswrapper[4843]: I0318 12:52:03.984697 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"
Mar 18 12:52:03 crc kubenswrapper[4843]: E0318 12:52:03.985647 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 12:52:04 crc kubenswrapper[4843]: I0318 12:52:04.291196 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563972-2hl2b"
Mar 18 12:52:04 crc kubenswrapper[4843]: I0318 12:52:04.455771 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2x7p\" (UniqueName: \"kubernetes.io/projected/c952cae7-b1db-4558-9084-3294be4a4d57-kube-api-access-w2x7p\") pod \"c952cae7-b1db-4558-9084-3294be4a4d57\" (UID: \"c952cae7-b1db-4558-9084-3294be4a4d57\") "
Mar 18 12:52:04 crc kubenswrapper[4843]: I0318 12:52:04.463851 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c952cae7-b1db-4558-9084-3294be4a4d57-kube-api-access-w2x7p" (OuterVolumeSpecName: "kube-api-access-w2x7p") pod "c952cae7-b1db-4558-9084-3294be4a4d57" (UID: "c952cae7-b1db-4558-9084-3294be4a4d57"). InnerVolumeSpecName "kube-api-access-w2x7p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:52:04 crc kubenswrapper[4843]: I0318 12:52:04.558622 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2x7p\" (UniqueName: \"kubernetes.io/projected/c952cae7-b1db-4558-9084-3294be4a4d57-kube-api-access-w2x7p\") on node \"crc\" DevicePath \"\""
Mar 18 12:52:04 crc kubenswrapper[4843]: I0318 12:52:04.946378 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563972-2hl2b" event={"ID":"c952cae7-b1db-4558-9084-3294be4a4d57","Type":"ContainerDied","Data":"85cd87f96b92588d476f951f6096ee23caf546455d2dd1d71ef6add2a259fbd3"}
Mar 18 12:52:04 crc kubenswrapper[4843]: I0318 12:52:04.946466 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563972-2hl2b"
Mar 18 12:52:04 crc kubenswrapper[4843]: I0318 12:52:04.946471 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85cd87f96b92588d476f951f6096ee23caf546455d2dd1d71ef6add2a259fbd3"
Mar 18 12:52:05 crc kubenswrapper[4843]: I0318 12:52:05.398820 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563966-v82gk"]
Mar 18 12:52:05 crc kubenswrapper[4843]: I0318 12:52:05.404170 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563966-v82gk"]
Mar 18 12:52:06 crc kubenswrapper[4843]: I0318 12:52:06.998281 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7787f4ee-769c-4b2e-94e6-975db57ccc1b" path="/var/lib/kubelet/pods/7787f4ee-769c-4b2e-94e6-975db57ccc1b/volumes"
Mar 18 12:52:14 crc kubenswrapper[4843]: I0318 12:52:14.984262 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"
Mar 18 12:52:14 crc kubenswrapper[4843]: E0318 12:52:14.984828 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 12:52:15 crc kubenswrapper[4843]: I0318 12:52:15.184559 4843 generic.go:334] "Generic (PLEG): container finished" podID="bbe53df4-b4eb-4ff6-ac59-f6532974af67" containerID="d83ce1748f9709fe4c2f0c2948b6a598a1de9b07614114e1600f7e7650643985" exitCode=2
Mar 18 12:52:15 crc kubenswrapper[4843]: I0318 12:52:15.184915 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" event={"ID":"bbe53df4-b4eb-4ff6-ac59-f6532974af67","Type":"ContainerDied","Data":"d83ce1748f9709fe4c2f0c2948b6a598a1de9b07614114e1600f7e7650643985"}
Mar 18 12:52:16 crc kubenswrapper[4843]: I0318 12:52:16.800292 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8"
Mar 18 12:52:16 crc kubenswrapper[4843]: I0318 12:52:16.902898 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-libvirt-secret-0\") pod \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") "
Mar 18 12:52:16 crc kubenswrapper[4843]: I0318 12:52:16.902952 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-libvirt-combined-ca-bundle\") pod \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") "
Mar 18 12:52:16 crc kubenswrapper[4843]: I0318 12:52:16.903030 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-inventory\") pod \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") "
Mar 18 12:52:16 crc kubenswrapper[4843]: I0318 12:52:16.903118 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc2kb\" (UniqueName: \"kubernetes.io/projected/bbe53df4-b4eb-4ff6-ac59-f6532974af67-kube-api-access-qc2kb\") pod \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") "
Mar 18 12:52:16 crc kubenswrapper[4843]: I0318 12:52:16.903220 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-ssh-key-openstack-edpm-ipam\") pod \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\" (UID: \"bbe53df4-b4eb-4ff6-ac59-f6532974af67\") "
Mar 18 12:52:16 crc kubenswrapper[4843]: I0318 12:52:16.909788 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "bbe53df4-b4eb-4ff6-ac59-f6532974af67" (UID: "bbe53df4-b4eb-4ff6-ac59-f6532974af67"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:52:16 crc kubenswrapper[4843]: I0318 12:52:16.911728 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbe53df4-b4eb-4ff6-ac59-f6532974af67-kube-api-access-qc2kb" (OuterVolumeSpecName: "kube-api-access-qc2kb") pod "bbe53df4-b4eb-4ff6-ac59-f6532974af67" (UID: "bbe53df4-b4eb-4ff6-ac59-f6532974af67"). InnerVolumeSpecName "kube-api-access-qc2kb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:52:16 crc kubenswrapper[4843]: I0318 12:52:16.930868 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "bbe53df4-b4eb-4ff6-ac59-f6532974af67" (UID: "bbe53df4-b4eb-4ff6-ac59-f6532974af67"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:52:16 crc kubenswrapper[4843]: I0318 12:52:16.940862 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-inventory" (OuterVolumeSpecName: "inventory") pod "bbe53df4-b4eb-4ff6-ac59-f6532974af67" (UID: "bbe53df4-b4eb-4ff6-ac59-f6532974af67"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:52:16 crc kubenswrapper[4843]: I0318 12:52:16.960644 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bbe53df4-b4eb-4ff6-ac59-f6532974af67" (UID: "bbe53df4-b4eb-4ff6-ac59-f6532974af67"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:52:17 crc kubenswrapper[4843]: I0318 12:52:17.015017 4843 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Mar 18 12:52:17 crc kubenswrapper[4843]: I0318 12:52:17.015062 4843 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:52:17 crc kubenswrapper[4843]: I0318 12:52:17.015073 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 12:52:17 crc kubenswrapper[4843]: I0318 12:52:17.015083 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc2kb\" (UniqueName: \"kubernetes.io/projected/bbe53df4-b4eb-4ff6-ac59-f6532974af67-kube-api-access-qc2kb\") on node \"crc\" DevicePath \"\""
Mar 18 12:52:17 crc kubenswrapper[4843]: I0318 12:52:17.015091 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbe53df4-b4eb-4ff6-ac59-f6532974af67-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 12:52:17 crc kubenswrapper[4843]: I0318 12:52:17.213988 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8" event={"ID":"bbe53df4-b4eb-4ff6-ac59-f6532974af67","Type":"ContainerDied","Data":"e9b94c49308bb88995d8adcefb0c6ff40c4ded3cd312323f541b4bbd6be36fc1"}
Mar 18 12:52:17 crc kubenswrapper[4843]: I0318 12:52:17.214032 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9b94c49308bb88995d8adcefb0c6ff40c4ded3cd312323f541b4bbd6be36fc1"
Mar 18 12:52:17 crc kubenswrapper[4843]: I0318 12:52:17.214201 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.055809 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"]
Mar 18 12:52:24 crc kubenswrapper[4843]: E0318 12:52:24.062873 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe53df4-b4eb-4ff6-ac59-f6532974af67" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.062897 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe53df4-b4eb-4ff6-ac59-f6532974af67" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:52:24 crc kubenswrapper[4843]: E0318 12:52:24.062911 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c952cae7-b1db-4558-9084-3294be4a4d57" containerName="oc"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.062918 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="c952cae7-b1db-4558-9084-3294be4a4d57" containerName="oc"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.063137 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="c952cae7-b1db-4558-9084-3294be4a4d57" containerName="oc"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.063172 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbe53df4-b4eb-4ff6-ac59-f6532974af67" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.063975 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.082382 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.082388 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.082534 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.086301 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.086948 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.090229 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"]
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.142328 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.142378 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.142540 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.142718 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.142788 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkqw7\" (UniqueName: \"kubernetes.io/projected/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-kube-api-access-fkqw7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.243894 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.243954 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkqw7\" (UniqueName: \"kubernetes.io/projected/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-kube-api-access-fkqw7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.244006 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.244025 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.244096 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.250140 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.250525 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.252401 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.252441 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.265684 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkqw7\" (UniqueName: \"kubernetes.io/projected/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-kube-api-access-fkqw7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"
Mar 18 12:52:24 crc kubenswrapper[4843]: I0318 12:52:24.396795 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"
Mar 18 12:52:25 crc kubenswrapper[4843]: I0318 12:52:25.035377 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8"]
Mar 18 12:52:25 crc kubenswrapper[4843]: I0318 12:52:25.306440 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8" event={"ID":"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3","Type":"ContainerStarted","Data":"6ee135263cfbd89dd522c30fcc18d0a69f8f58cc6287ed0356f7ff7549610126"}
Mar 18 12:52:26 crc kubenswrapper[4843]: I0318 12:52:26.319862 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8" event={"ID":"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3","Type":"ContainerStarted","Data":"c867b95a9a1acc6b85429262ed83ce4492692b3ec99cc5d9e25611f500b1d2e4"}
Mar 18 12:52:26 crc kubenswrapper[4843]: I0318 12:52:26.342378 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8" podStartSLOduration=2.099423728 podStartE2EDuration="2.342331961s" podCreationTimestamp="2026-03-18 12:52:24 +0000 UTC" firstStartedPulling="2026-03-18 12:52:25.041133759 +0000 UTC m=+2578.756959283" lastFinishedPulling="2026-03-18 12:52:25.284041992 +0000 UTC m=+2578.999867516" observedRunningTime="2026-03-18 12:52:26.338321438 +0000 UTC m=+2580.054146962" watchObservedRunningTime="2026-03-18 12:52:26.342331961 +0000 UTC m=+2580.058157485"
Mar 18 12:52:26 crc kubenswrapper[4843]: I0318 12:52:26.989490 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"
Mar 18 12:52:26 crc kubenswrapper[4843]: E0318 12:52:26.989770 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 12:52:38 crc kubenswrapper[4843]: I0318 12:52:38.984170 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"
Mar 18 12:52:38 crc kubenswrapper[4843]: E0318 12:52:38.986309 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 12:52:50 crc kubenswrapper[4843]: I0318 12:52:50.984593 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"
Mar 18 12:52:50 crc kubenswrapper[4843]: E0318 12:52:50.985424 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 12:53:05 crc kubenswrapper[4843]: I0318 12:53:05.091758 4843 scope.go:117] "RemoveContainer" containerID="43d6b902c62ad5e5d9365c36b9f9e52eae37d7f6c526bc50146dfb8d85afc67a"
Mar 18 12:53:05 crc kubenswrapper[4843]: I0318 12:53:05.988865 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"
Mar 18 12:53:05 crc kubenswrapper[4843]: E0318 12:53:05.989622 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 12:53:19 crc kubenswrapper[4843]: I0318 12:53:19.984458 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"
Mar 18 12:53:19 crc kubenswrapper[4843]: E0318 12:53:19.986347 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 12:53:33 crc kubenswrapper[4843]: I0318 12:53:33.984730 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"
Mar 18 12:53:33 crc kubenswrapper[4843]: E0318 12:53:33.986195 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 12:53:44 crc kubenswrapper[4843]: I0318 12:53:44.867596 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jwh7g"]
Mar 18 12:53:44 crc kubenswrapper[4843]: I0318 12:53:44.870892 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jwh7g"
Mar 18 12:53:44 crc kubenswrapper[4843]: I0318 12:53:44.888978 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jwh7g"]
Mar 18 12:53:44 crc kubenswrapper[4843]: I0318 12:53:44.896328 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw9hk\" (UniqueName: \"kubernetes.io/projected/74f5c9c8-063a-4b13-9b5b-3616a53bdd33-kube-api-access-zw9hk\") pod \"redhat-operators-jwh7g\" (UID: \"74f5c9c8-063a-4b13-9b5b-3616a53bdd33\") " pod="openshift-marketplace/redhat-operators-jwh7g"
Mar 18 12:53:44 crc kubenswrapper[4843]: I0318 12:53:44.896466 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74f5c9c8-063a-4b13-9b5b-3616a53bdd33-utilities\") pod \"redhat-operators-jwh7g\" (UID: \"74f5c9c8-063a-4b13-9b5b-3616a53bdd33\") " pod="openshift-marketplace/redhat-operators-jwh7g"
Mar 18 12:53:44 crc kubenswrapper[4843]: I0318 12:53:44.896558 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74f5c9c8-063a-4b13-9b5b-3616a53bdd33-catalog-content\") pod \"redhat-operators-jwh7g\" (UID: \"74f5c9c8-063a-4b13-9b5b-3616a53bdd33\") " pod="openshift-marketplace/redhat-operators-jwh7g"
Mar 18 12:53:44 crc kubenswrapper[4843]: I0318 12:53:44.999237 4843 reconciler_common.go:218] "operationExecutor.MountVolume
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74f5c9c8-063a-4b13-9b5b-3616a53bdd33-utilities\") pod \"redhat-operators-jwh7g\" (UID: \"74f5c9c8-063a-4b13-9b5b-3616a53bdd33\") " pod="openshift-marketplace/redhat-operators-jwh7g" Mar 18 12:53:44 crc kubenswrapper[4843]: I0318 12:53:44.999288 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74f5c9c8-063a-4b13-9b5b-3616a53bdd33-catalog-content\") pod \"redhat-operators-jwh7g\" (UID: \"74f5c9c8-063a-4b13-9b5b-3616a53bdd33\") " pod="openshift-marketplace/redhat-operators-jwh7g" Mar 18 12:53:44 crc kubenswrapper[4843]: I0318 12:53:44.999467 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw9hk\" (UniqueName: \"kubernetes.io/projected/74f5c9c8-063a-4b13-9b5b-3616a53bdd33-kube-api-access-zw9hk\") pod \"redhat-operators-jwh7g\" (UID: \"74f5c9c8-063a-4b13-9b5b-3616a53bdd33\") " pod="openshift-marketplace/redhat-operators-jwh7g" Mar 18 12:53:44 crc kubenswrapper[4843]: I0318 12:53:44.999956 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74f5c9c8-063a-4b13-9b5b-3616a53bdd33-utilities\") pod \"redhat-operators-jwh7g\" (UID: \"74f5c9c8-063a-4b13-9b5b-3616a53bdd33\") " pod="openshift-marketplace/redhat-operators-jwh7g" Mar 18 12:53:45 crc kubenswrapper[4843]: I0318 12:53:44.999986 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74f5c9c8-063a-4b13-9b5b-3616a53bdd33-catalog-content\") pod \"redhat-operators-jwh7g\" (UID: \"74f5c9c8-063a-4b13-9b5b-3616a53bdd33\") " pod="openshift-marketplace/redhat-operators-jwh7g" Mar 18 12:53:45 crc kubenswrapper[4843]: I0318 12:53:45.020693 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw9hk\" (UniqueName: 
\"kubernetes.io/projected/74f5c9c8-063a-4b13-9b5b-3616a53bdd33-kube-api-access-zw9hk\") pod \"redhat-operators-jwh7g\" (UID: \"74f5c9c8-063a-4b13-9b5b-3616a53bdd33\") " pod="openshift-marketplace/redhat-operators-jwh7g" Mar 18 12:53:45 crc kubenswrapper[4843]: I0318 12:53:45.201031 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jwh7g" Mar 18 12:53:45 crc kubenswrapper[4843]: I0318 12:53:45.666824 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jwh7g"] Mar 18 12:53:45 crc kubenswrapper[4843]: I0318 12:53:45.688582 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwh7g" event={"ID":"74f5c9c8-063a-4b13-9b5b-3616a53bdd33","Type":"ContainerStarted","Data":"b75ddb532174d2d504fadb6a01ae979b3f764b5bfd36ab3e173ce0278c41e2e2"} Mar 18 12:53:46 crc kubenswrapper[4843]: I0318 12:53:46.699406 4843 generic.go:334] "Generic (PLEG): container finished" podID="74f5c9c8-063a-4b13-9b5b-3616a53bdd33" containerID="15aab03ae06677939651124ce211c13e3f3fe23ba8f7882495e3988db9b9cf16" exitCode=0 Mar 18 12:53:46 crc kubenswrapper[4843]: I0318 12:53:46.699460 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwh7g" event={"ID":"74f5c9c8-063a-4b13-9b5b-3616a53bdd33","Type":"ContainerDied","Data":"15aab03ae06677939651124ce211c13e3f3fe23ba8f7882495e3988db9b9cf16"} Mar 18 12:53:47 crc kubenswrapper[4843]: I0318 12:53:47.984181 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276" Mar 18 12:53:47 crc kubenswrapper[4843]: E0318 12:53:47.984772 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 12:53:48 crc kubenswrapper[4843]: I0318 12:53:48.722030 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwh7g" event={"ID":"74f5c9c8-063a-4b13-9b5b-3616a53bdd33","Type":"ContainerStarted","Data":"e6afdb3e278036d58b498f494a32e515b0c6b0bda64fceaef4f8143833b38eee"} Mar 18 12:53:49 crc kubenswrapper[4843]: I0318 12:53:49.736696 4843 generic.go:334] "Generic (PLEG): container finished" podID="74f5c9c8-063a-4b13-9b5b-3616a53bdd33" containerID="e6afdb3e278036d58b498f494a32e515b0c6b0bda64fceaef4f8143833b38eee" exitCode=0 Mar 18 12:53:49 crc kubenswrapper[4843]: I0318 12:53:49.736743 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwh7g" event={"ID":"74f5c9c8-063a-4b13-9b5b-3616a53bdd33","Type":"ContainerDied","Data":"e6afdb3e278036d58b498f494a32e515b0c6b0bda64fceaef4f8143833b38eee"} Mar 18 12:53:50 crc kubenswrapper[4843]: I0318 12:53:50.749307 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwh7g" event={"ID":"74f5c9c8-063a-4b13-9b5b-3616a53bdd33","Type":"ContainerStarted","Data":"0925f7b91932de91f5cf9906e4305dc2872ae06afa4693111f45aefc65a4cd5f"} Mar 18 12:53:50 crc kubenswrapper[4843]: I0318 12:53:50.783729 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jwh7g" podStartSLOduration=3.163647791 podStartE2EDuration="6.783630292s" podCreationTimestamp="2026-03-18 12:53:44 +0000 UTC" firstStartedPulling="2026-03-18 12:53:46.70132651 +0000 UTC m=+2660.417152034" lastFinishedPulling="2026-03-18 12:53:50.321309011 +0000 UTC m=+2664.037134535" observedRunningTime="2026-03-18 12:53:50.774072111 +0000 UTC m=+2664.489897635" 
watchObservedRunningTime="2026-03-18 12:53:50.783630292 +0000 UTC m=+2664.499455816" Mar 18 12:53:55 crc kubenswrapper[4843]: I0318 12:53:55.201451 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jwh7g" Mar 18 12:53:55 crc kubenswrapper[4843]: I0318 12:53:55.202018 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jwh7g" Mar 18 12:53:56 crc kubenswrapper[4843]: I0318 12:53:56.243326 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jwh7g" podUID="74f5c9c8-063a-4b13-9b5b-3616a53bdd33" containerName="registry-server" probeResult="failure" output=< Mar 18 12:53:56 crc kubenswrapper[4843]: timeout: failed to connect service ":50051" within 1s Mar 18 12:53:56 crc kubenswrapper[4843]: > Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.158294 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563974-6wbq2"] Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.160578 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563974-6wbq2" Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.163862 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.164227 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.164418 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.174823 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563974-6wbq2"] Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.198808 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj6hn\" (UniqueName: \"kubernetes.io/projected/727dae08-13f0-4310-806c-969324946ac6-kube-api-access-fj6hn\") pod \"auto-csr-approver-29563974-6wbq2\" (UID: \"727dae08-13f0-4310-806c-969324946ac6\") " pod="openshift-infra/auto-csr-approver-29563974-6wbq2" Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.300590 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj6hn\" (UniqueName: \"kubernetes.io/projected/727dae08-13f0-4310-806c-969324946ac6-kube-api-access-fj6hn\") pod \"auto-csr-approver-29563974-6wbq2\" (UID: \"727dae08-13f0-4310-806c-969324946ac6\") " pod="openshift-infra/auto-csr-approver-29563974-6wbq2" Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.331623 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj6hn\" (UniqueName: \"kubernetes.io/projected/727dae08-13f0-4310-806c-969324946ac6-kube-api-access-fj6hn\") pod \"auto-csr-approver-29563974-6wbq2\" (UID: \"727dae08-13f0-4310-806c-969324946ac6\") " 
pod="openshift-infra/auto-csr-approver-29563974-6wbq2" Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.455372 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lp28m"] Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.458179 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lp28m" Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.475511 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lp28m"] Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.481806 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563974-6wbq2" Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.769194 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kt6g\" (UniqueName: \"kubernetes.io/projected/4ce59e36-3692-45b8-a868-5883a11dc1c0-kube-api-access-7kt6g\") pod \"redhat-marketplace-lp28m\" (UID: \"4ce59e36-3692-45b8-a868-5883a11dc1c0\") " pod="openshift-marketplace/redhat-marketplace-lp28m" Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.770900 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ce59e36-3692-45b8-a868-5883a11dc1c0-catalog-content\") pod \"redhat-marketplace-lp28m\" (UID: \"4ce59e36-3692-45b8-a868-5883a11dc1c0\") " pod="openshift-marketplace/redhat-marketplace-lp28m" Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.771103 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ce59e36-3692-45b8-a868-5883a11dc1c0-utilities\") pod \"redhat-marketplace-lp28m\" (UID: \"4ce59e36-3692-45b8-a868-5883a11dc1c0\") " 
pod="openshift-marketplace/redhat-marketplace-lp28m" Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.873383 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kt6g\" (UniqueName: \"kubernetes.io/projected/4ce59e36-3692-45b8-a868-5883a11dc1c0-kube-api-access-7kt6g\") pod \"redhat-marketplace-lp28m\" (UID: \"4ce59e36-3692-45b8-a868-5883a11dc1c0\") " pod="openshift-marketplace/redhat-marketplace-lp28m" Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.873489 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ce59e36-3692-45b8-a868-5883a11dc1c0-catalog-content\") pod \"redhat-marketplace-lp28m\" (UID: \"4ce59e36-3692-45b8-a868-5883a11dc1c0\") " pod="openshift-marketplace/redhat-marketplace-lp28m" Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.873662 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ce59e36-3692-45b8-a868-5883a11dc1c0-utilities\") pod \"redhat-marketplace-lp28m\" (UID: \"4ce59e36-3692-45b8-a868-5883a11dc1c0\") " pod="openshift-marketplace/redhat-marketplace-lp28m" Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.873986 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ce59e36-3692-45b8-a868-5883a11dc1c0-catalog-content\") pod \"redhat-marketplace-lp28m\" (UID: \"4ce59e36-3692-45b8-a868-5883a11dc1c0\") " pod="openshift-marketplace/redhat-marketplace-lp28m" Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.874261 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ce59e36-3692-45b8-a868-5883a11dc1c0-utilities\") pod \"redhat-marketplace-lp28m\" (UID: \"4ce59e36-3692-45b8-a868-5883a11dc1c0\") " pod="openshift-marketplace/redhat-marketplace-lp28m" 
Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.898781 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kt6g\" (UniqueName: \"kubernetes.io/projected/4ce59e36-3692-45b8-a868-5883a11dc1c0-kube-api-access-7kt6g\") pod \"redhat-marketplace-lp28m\" (UID: \"4ce59e36-3692-45b8-a868-5883a11dc1c0\") " pod="openshift-marketplace/redhat-marketplace-lp28m" Mar 18 12:54:00 crc kubenswrapper[4843]: I0318 12:54:00.989614 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276" Mar 18 12:54:01 crc kubenswrapper[4843]: I0318 12:54:01.108529 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lp28m" Mar 18 12:54:01 crc kubenswrapper[4843]: I0318 12:54:01.262595 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563974-6wbq2"] Mar 18 12:54:01 crc kubenswrapper[4843]: I0318 12:54:01.470104 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lp28m"] Mar 18 12:54:01 crc kubenswrapper[4843]: I0318 12:54:01.931616 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"8039e005f14b8978061b52f5bdd6ef705088f9029a2fdca0fced087b8ccadae4"} Mar 18 12:54:01 crc kubenswrapper[4843]: I0318 12:54:01.933022 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563974-6wbq2" event={"ID":"727dae08-13f0-4310-806c-969324946ac6","Type":"ContainerStarted","Data":"40e98a46ccc6c6666e106f0a4460fe47a31db380e831376ffba09284d30fe6b2"} Mar 18 12:54:01 crc kubenswrapper[4843]: I0318 12:54:01.935004 4843 generic.go:334] "Generic (PLEG): container finished" podID="4ce59e36-3692-45b8-a868-5883a11dc1c0" 
containerID="a334d16e643de3d50bf130d06fb2a9841604bfcf3dc70563ebe0882577b7f280" exitCode=0 Mar 18 12:54:01 crc kubenswrapper[4843]: I0318 12:54:01.935067 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lp28m" event={"ID":"4ce59e36-3692-45b8-a868-5883a11dc1c0","Type":"ContainerDied","Data":"a334d16e643de3d50bf130d06fb2a9841604bfcf3dc70563ebe0882577b7f280"} Mar 18 12:54:01 crc kubenswrapper[4843]: I0318 12:54:01.935091 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lp28m" event={"ID":"4ce59e36-3692-45b8-a868-5883a11dc1c0","Type":"ContainerStarted","Data":"80ca29bbe67d7d02cfa5ad3c556f56ba10ab719aaa5c10670f8e1fdf4783b980"} Mar 18 12:54:03 crc kubenswrapper[4843]: I0318 12:54:03.953150 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563974-6wbq2" event={"ID":"727dae08-13f0-4310-806c-969324946ac6","Type":"ContainerStarted","Data":"e4c1b5761d82579e78041f8594d2db667f5aafffb7db8bbfbe0c88e163d75b0a"} Mar 18 12:54:03 crc kubenswrapper[4843]: I0318 12:54:03.955251 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lp28m" event={"ID":"4ce59e36-3692-45b8-a868-5883a11dc1c0","Type":"ContainerStarted","Data":"cd11fcc426718c23c2a4d73051473ded205b240ab46b3d20fc97c0d5bea9eace"} Mar 18 12:54:03 crc kubenswrapper[4843]: I0318 12:54:03.972991 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563974-6wbq2" podStartSLOduration=1.80327079 podStartE2EDuration="3.972972933s" podCreationTimestamp="2026-03-18 12:54:00 +0000 UTC" firstStartedPulling="2026-03-18 12:54:01.315789745 +0000 UTC m=+2675.031615269" lastFinishedPulling="2026-03-18 12:54:03.485491888 +0000 UTC m=+2677.201317412" observedRunningTime="2026-03-18 12:54:03.965360957 +0000 UTC m=+2677.681186491" watchObservedRunningTime="2026-03-18 12:54:03.972972933 +0000 UTC 
m=+2677.688798457" Mar 18 12:54:04 crc kubenswrapper[4843]: I0318 12:54:04.970681 4843 generic.go:334] "Generic (PLEG): container finished" podID="727dae08-13f0-4310-806c-969324946ac6" containerID="e4c1b5761d82579e78041f8594d2db667f5aafffb7db8bbfbe0c88e163d75b0a" exitCode=0 Mar 18 12:54:04 crc kubenswrapper[4843]: I0318 12:54:04.970800 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563974-6wbq2" event={"ID":"727dae08-13f0-4310-806c-969324946ac6","Type":"ContainerDied","Data":"e4c1b5761d82579e78041f8594d2db667f5aafffb7db8bbfbe0c88e163d75b0a"} Mar 18 12:54:04 crc kubenswrapper[4843]: I0318 12:54:04.979284 4843 generic.go:334] "Generic (PLEG): container finished" podID="4ce59e36-3692-45b8-a868-5883a11dc1c0" containerID="cd11fcc426718c23c2a4d73051473ded205b240ab46b3d20fc97c0d5bea9eace" exitCode=0 Mar 18 12:54:04 crc kubenswrapper[4843]: I0318 12:54:04.979330 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lp28m" event={"ID":"4ce59e36-3692-45b8-a868-5883a11dc1c0","Type":"ContainerDied","Data":"cd11fcc426718c23c2a4d73051473ded205b240ab46b3d20fc97c0d5bea9eace"} Mar 18 12:54:05 crc kubenswrapper[4843]: I0318 12:54:05.250774 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jwh7g" Mar 18 12:54:05 crc kubenswrapper[4843]: I0318 12:54:05.306688 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jwh7g" Mar 18 12:54:06 crc kubenswrapper[4843]: I0318 12:54:06.433136 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563974-6wbq2" Mar 18 12:54:06 crc kubenswrapper[4843]: I0318 12:54:06.595225 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj6hn\" (UniqueName: \"kubernetes.io/projected/727dae08-13f0-4310-806c-969324946ac6-kube-api-access-fj6hn\") pod \"727dae08-13f0-4310-806c-969324946ac6\" (UID: \"727dae08-13f0-4310-806c-969324946ac6\") " Mar 18 12:54:06 crc kubenswrapper[4843]: I0318 12:54:06.603273 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/727dae08-13f0-4310-806c-969324946ac6-kube-api-access-fj6hn" (OuterVolumeSpecName: "kube-api-access-fj6hn") pod "727dae08-13f0-4310-806c-969324946ac6" (UID: "727dae08-13f0-4310-806c-969324946ac6"). InnerVolumeSpecName "kube-api-access-fj6hn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:54:06 crc kubenswrapper[4843]: I0318 12:54:06.698013 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj6hn\" (UniqueName: \"kubernetes.io/projected/727dae08-13f0-4310-806c-969324946ac6-kube-api-access-fj6hn\") on node \"crc\" DevicePath \"\"" Mar 18 12:54:06 crc kubenswrapper[4843]: I0318 12:54:06.823330 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jwh7g"] Mar 18 12:54:07 crc kubenswrapper[4843]: I0318 12:54:07.021937 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563974-6wbq2" event={"ID":"727dae08-13f0-4310-806c-969324946ac6","Type":"ContainerDied","Data":"40e98a46ccc6c6666e106f0a4460fe47a31db380e831376ffba09284d30fe6b2"} Mar 18 12:54:07 crc kubenswrapper[4843]: I0318 12:54:07.021994 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40e98a46ccc6c6666e106f0a4460fe47a31db380e831376ffba09284d30fe6b2" Mar 18 12:54:07 crc kubenswrapper[4843]: I0318 12:54:07.022072 4843 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563974-6wbq2" Mar 18 12:54:07 crc kubenswrapper[4843]: I0318 12:54:07.028376 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lp28m" event={"ID":"4ce59e36-3692-45b8-a868-5883a11dc1c0","Type":"ContainerStarted","Data":"28b4461703201d98e127fd3365a21e8287d19fb3022a341c4154fc80f8689924"} Mar 18 12:54:07 crc kubenswrapper[4843]: I0318 12:54:07.028523 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jwh7g" podUID="74f5c9c8-063a-4b13-9b5b-3616a53bdd33" containerName="registry-server" containerID="cri-o://0925f7b91932de91f5cf9906e4305dc2872ae06afa4693111f45aefc65a4cd5f" gracePeriod=2 Mar 18 12:54:07 crc kubenswrapper[4843]: I0318 12:54:07.061483 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563968-66h99"] Mar 18 12:54:07 crc kubenswrapper[4843]: I0318 12:54:07.062278 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lp28m" podStartSLOduration=2.751844777 podStartE2EDuration="7.062266685s" podCreationTimestamp="2026-03-18 12:54:00 +0000 UTC" firstStartedPulling="2026-03-18 12:54:01.936507143 +0000 UTC m=+2675.652332657" lastFinishedPulling="2026-03-18 12:54:06.246929041 +0000 UTC m=+2679.962754565" observedRunningTime="2026-03-18 12:54:07.058464838 +0000 UTC m=+2680.774290362" watchObservedRunningTime="2026-03-18 12:54:07.062266685 +0000 UTC m=+2680.778092209" Mar 18 12:54:07 crc kubenswrapper[4843]: I0318 12:54:07.087020 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563968-66h99"] Mar 18 12:54:07 crc kubenswrapper[4843]: I0318 12:54:07.463828 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jwh7g" Mar 18 12:54:07 crc kubenswrapper[4843]: I0318 12:54:07.516696 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74f5c9c8-063a-4b13-9b5b-3616a53bdd33-utilities\") pod \"74f5c9c8-063a-4b13-9b5b-3616a53bdd33\" (UID: \"74f5c9c8-063a-4b13-9b5b-3616a53bdd33\") " Mar 18 12:54:07 crc kubenswrapper[4843]: I0318 12:54:07.516755 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74f5c9c8-063a-4b13-9b5b-3616a53bdd33-catalog-content\") pod \"74f5c9c8-063a-4b13-9b5b-3616a53bdd33\" (UID: \"74f5c9c8-063a-4b13-9b5b-3616a53bdd33\") " Mar 18 12:54:07 crc kubenswrapper[4843]: I0318 12:54:07.516778 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw9hk\" (UniqueName: \"kubernetes.io/projected/74f5c9c8-063a-4b13-9b5b-3616a53bdd33-kube-api-access-zw9hk\") pod \"74f5c9c8-063a-4b13-9b5b-3616a53bdd33\" (UID: \"74f5c9c8-063a-4b13-9b5b-3616a53bdd33\") " Mar 18 12:54:07 crc kubenswrapper[4843]: I0318 12:54:07.518087 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74f5c9c8-063a-4b13-9b5b-3616a53bdd33-utilities" (OuterVolumeSpecName: "utilities") pod "74f5c9c8-063a-4b13-9b5b-3616a53bdd33" (UID: "74f5c9c8-063a-4b13-9b5b-3616a53bdd33"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:54:07 crc kubenswrapper[4843]: I0318 12:54:07.524561 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74f5c9c8-063a-4b13-9b5b-3616a53bdd33-kube-api-access-zw9hk" (OuterVolumeSpecName: "kube-api-access-zw9hk") pod "74f5c9c8-063a-4b13-9b5b-3616a53bdd33" (UID: "74f5c9c8-063a-4b13-9b5b-3616a53bdd33"). InnerVolumeSpecName "kube-api-access-zw9hk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:54:07 crc kubenswrapper[4843]: I0318 12:54:07.618625 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74f5c9c8-063a-4b13-9b5b-3616a53bdd33-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:54:07 crc kubenswrapper[4843]: I0318 12:54:07.618731 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw9hk\" (UniqueName: \"kubernetes.io/projected/74f5c9c8-063a-4b13-9b5b-3616a53bdd33-kube-api-access-zw9hk\") on node \"crc\" DevicePath \"\"" Mar 18 12:54:07 crc kubenswrapper[4843]: I0318 12:54:07.637582 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74f5c9c8-063a-4b13-9b5b-3616a53bdd33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74f5c9c8-063a-4b13-9b5b-3616a53bdd33" (UID: "74f5c9c8-063a-4b13-9b5b-3616a53bdd33"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:54:07 crc kubenswrapper[4843]: I0318 12:54:07.719980 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74f5c9c8-063a-4b13-9b5b-3616a53bdd33-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:54:08 crc kubenswrapper[4843]: I0318 12:54:08.037864 4843 generic.go:334] "Generic (PLEG): container finished" podID="74f5c9c8-063a-4b13-9b5b-3616a53bdd33" containerID="0925f7b91932de91f5cf9906e4305dc2872ae06afa4693111f45aefc65a4cd5f" exitCode=0 Mar 18 12:54:08 crc kubenswrapper[4843]: I0318 12:54:08.037913 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwh7g" event={"ID":"74f5c9c8-063a-4b13-9b5b-3616a53bdd33","Type":"ContainerDied","Data":"0925f7b91932de91f5cf9906e4305dc2872ae06afa4693111f45aefc65a4cd5f"} Mar 18 12:54:08 crc kubenswrapper[4843]: I0318 12:54:08.038222 4843 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-jwh7g" event={"ID":"74f5c9c8-063a-4b13-9b5b-3616a53bdd33","Type":"ContainerDied","Data":"b75ddb532174d2d504fadb6a01ae979b3f764b5bfd36ab3e173ce0278c41e2e2"} Mar 18 12:54:08 crc kubenswrapper[4843]: I0318 12:54:08.038256 4843 scope.go:117] "RemoveContainer" containerID="0925f7b91932de91f5cf9906e4305dc2872ae06afa4693111f45aefc65a4cd5f" Mar 18 12:54:08 crc kubenswrapper[4843]: I0318 12:54:08.037954 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jwh7g" Mar 18 12:54:08 crc kubenswrapper[4843]: I0318 12:54:08.069611 4843 scope.go:117] "RemoveContainer" containerID="e6afdb3e278036d58b498f494a32e515b0c6b0bda64fceaef4f8143833b38eee" Mar 18 12:54:08 crc kubenswrapper[4843]: I0318 12:54:08.078015 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jwh7g"] Mar 18 12:54:08 crc kubenswrapper[4843]: I0318 12:54:08.088211 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jwh7g"] Mar 18 12:54:08 crc kubenswrapper[4843]: I0318 12:54:08.105280 4843 scope.go:117] "RemoveContainer" containerID="15aab03ae06677939651124ce211c13e3f3fe23ba8f7882495e3988db9b9cf16" Mar 18 12:54:08 crc kubenswrapper[4843]: I0318 12:54:08.138617 4843 scope.go:117] "RemoveContainer" containerID="0925f7b91932de91f5cf9906e4305dc2872ae06afa4693111f45aefc65a4cd5f" Mar 18 12:54:08 crc kubenswrapper[4843]: E0318 12:54:08.139209 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0925f7b91932de91f5cf9906e4305dc2872ae06afa4693111f45aefc65a4cd5f\": container with ID starting with 0925f7b91932de91f5cf9906e4305dc2872ae06afa4693111f45aefc65a4cd5f not found: ID does not exist" containerID="0925f7b91932de91f5cf9906e4305dc2872ae06afa4693111f45aefc65a4cd5f" Mar 18 12:54:08 crc kubenswrapper[4843]: I0318 12:54:08.139265 4843 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0925f7b91932de91f5cf9906e4305dc2872ae06afa4693111f45aefc65a4cd5f"} err="failed to get container status \"0925f7b91932de91f5cf9906e4305dc2872ae06afa4693111f45aefc65a4cd5f\": rpc error: code = NotFound desc = could not find container \"0925f7b91932de91f5cf9906e4305dc2872ae06afa4693111f45aefc65a4cd5f\": container with ID starting with 0925f7b91932de91f5cf9906e4305dc2872ae06afa4693111f45aefc65a4cd5f not found: ID does not exist" Mar 18 12:54:08 crc kubenswrapper[4843]: I0318 12:54:08.139294 4843 scope.go:117] "RemoveContainer" containerID="e6afdb3e278036d58b498f494a32e515b0c6b0bda64fceaef4f8143833b38eee" Mar 18 12:54:08 crc kubenswrapper[4843]: E0318 12:54:08.139622 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6afdb3e278036d58b498f494a32e515b0c6b0bda64fceaef4f8143833b38eee\": container with ID starting with e6afdb3e278036d58b498f494a32e515b0c6b0bda64fceaef4f8143833b38eee not found: ID does not exist" containerID="e6afdb3e278036d58b498f494a32e515b0c6b0bda64fceaef4f8143833b38eee" Mar 18 12:54:08 crc kubenswrapper[4843]: I0318 12:54:08.140600 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6afdb3e278036d58b498f494a32e515b0c6b0bda64fceaef4f8143833b38eee"} err="failed to get container status \"e6afdb3e278036d58b498f494a32e515b0c6b0bda64fceaef4f8143833b38eee\": rpc error: code = NotFound desc = could not find container \"e6afdb3e278036d58b498f494a32e515b0c6b0bda64fceaef4f8143833b38eee\": container with ID starting with e6afdb3e278036d58b498f494a32e515b0c6b0bda64fceaef4f8143833b38eee not found: ID does not exist" Mar 18 12:54:08 crc kubenswrapper[4843]: I0318 12:54:08.140625 4843 scope.go:117] "RemoveContainer" containerID="15aab03ae06677939651124ce211c13e3f3fe23ba8f7882495e3988db9b9cf16" Mar 18 12:54:08 crc kubenswrapper[4843]: E0318 
12:54:08.144017 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15aab03ae06677939651124ce211c13e3f3fe23ba8f7882495e3988db9b9cf16\": container with ID starting with 15aab03ae06677939651124ce211c13e3f3fe23ba8f7882495e3988db9b9cf16 not found: ID does not exist" containerID="15aab03ae06677939651124ce211c13e3f3fe23ba8f7882495e3988db9b9cf16" Mar 18 12:54:08 crc kubenswrapper[4843]: I0318 12:54:08.144055 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15aab03ae06677939651124ce211c13e3f3fe23ba8f7882495e3988db9b9cf16"} err="failed to get container status \"15aab03ae06677939651124ce211c13e3f3fe23ba8f7882495e3988db9b9cf16\": rpc error: code = NotFound desc = could not find container \"15aab03ae06677939651124ce211c13e3f3fe23ba8f7882495e3988db9b9cf16\": container with ID starting with 15aab03ae06677939651124ce211c13e3f3fe23ba8f7882495e3988db9b9cf16 not found: ID does not exist" Mar 18 12:54:09 crc kubenswrapper[4843]: I0318 12:54:09.000460 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74f5c9c8-063a-4b13-9b5b-3616a53bdd33" path="/var/lib/kubelet/pods/74f5c9c8-063a-4b13-9b5b-3616a53bdd33/volumes" Mar 18 12:54:09 crc kubenswrapper[4843]: I0318 12:54:09.001192 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a22105f2-fd27-4d7a-8b2a-26770e6ddfab" path="/var/lib/kubelet/pods/a22105f2-fd27-4d7a-8b2a-26770e6ddfab/volumes" Mar 18 12:54:11 crc kubenswrapper[4843]: I0318 12:54:11.110508 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lp28m" Mar 18 12:54:11 crc kubenswrapper[4843]: I0318 12:54:11.110582 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lp28m" Mar 18 12:54:11 crc kubenswrapper[4843]: I0318 12:54:11.161301 4843 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lp28m" Mar 18 12:54:12 crc kubenswrapper[4843]: I0318 12:54:12.140042 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lp28m" Mar 18 12:54:12 crc kubenswrapper[4843]: I0318 12:54:12.193898 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lp28m"] Mar 18 12:54:14 crc kubenswrapper[4843]: I0318 12:54:14.091452 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lp28m" podUID="4ce59e36-3692-45b8-a868-5883a11dc1c0" containerName="registry-server" containerID="cri-o://28b4461703201d98e127fd3365a21e8287d19fb3022a341c4154fc80f8689924" gracePeriod=2 Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.104009 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lp28m" Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.106064 4843 generic.go:334] "Generic (PLEG): container finished" podID="4ce59e36-3692-45b8-a868-5883a11dc1c0" containerID="28b4461703201d98e127fd3365a21e8287d19fb3022a341c4154fc80f8689924" exitCode=0 Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.106133 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lp28m" event={"ID":"4ce59e36-3692-45b8-a868-5883a11dc1c0","Type":"ContainerDied","Data":"28b4461703201d98e127fd3365a21e8287d19fb3022a341c4154fc80f8689924"} Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.106342 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lp28m" event={"ID":"4ce59e36-3692-45b8-a868-5883a11dc1c0","Type":"ContainerDied","Data":"80ca29bbe67d7d02cfa5ad3c556f56ba10ab719aaa5c10670f8e1fdf4783b980"} Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.106414 4843 scope.go:117] 
"RemoveContainer" containerID="28b4461703201d98e127fd3365a21e8287d19fb3022a341c4154fc80f8689924" Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.130824 4843 scope.go:117] "RemoveContainer" containerID="cd11fcc426718c23c2a4d73051473ded205b240ab46b3d20fc97c0d5bea9eace" Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.154986 4843 scope.go:117] "RemoveContainer" containerID="a334d16e643de3d50bf130d06fb2a9841604bfcf3dc70563ebe0882577b7f280" Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.207100 4843 scope.go:117] "RemoveContainer" containerID="28b4461703201d98e127fd3365a21e8287d19fb3022a341c4154fc80f8689924" Mar 18 12:54:15 crc kubenswrapper[4843]: E0318 12:54:15.207626 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28b4461703201d98e127fd3365a21e8287d19fb3022a341c4154fc80f8689924\": container with ID starting with 28b4461703201d98e127fd3365a21e8287d19fb3022a341c4154fc80f8689924 not found: ID does not exist" containerID="28b4461703201d98e127fd3365a21e8287d19fb3022a341c4154fc80f8689924" Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.207685 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28b4461703201d98e127fd3365a21e8287d19fb3022a341c4154fc80f8689924"} err="failed to get container status \"28b4461703201d98e127fd3365a21e8287d19fb3022a341c4154fc80f8689924\": rpc error: code = NotFound desc = could not find container \"28b4461703201d98e127fd3365a21e8287d19fb3022a341c4154fc80f8689924\": container with ID starting with 28b4461703201d98e127fd3365a21e8287d19fb3022a341c4154fc80f8689924 not found: ID does not exist" Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.207715 4843 scope.go:117] "RemoveContainer" containerID="cd11fcc426718c23c2a4d73051473ded205b240ab46b3d20fc97c0d5bea9eace" Mar 18 12:54:15 crc kubenswrapper[4843]: E0318 12:54:15.208119 4843 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"cd11fcc426718c23c2a4d73051473ded205b240ab46b3d20fc97c0d5bea9eace\": container with ID starting with cd11fcc426718c23c2a4d73051473ded205b240ab46b3d20fc97c0d5bea9eace not found: ID does not exist" containerID="cd11fcc426718c23c2a4d73051473ded205b240ab46b3d20fc97c0d5bea9eace" Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.208154 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd11fcc426718c23c2a4d73051473ded205b240ab46b3d20fc97c0d5bea9eace"} err="failed to get container status \"cd11fcc426718c23c2a4d73051473ded205b240ab46b3d20fc97c0d5bea9eace\": rpc error: code = NotFound desc = could not find container \"cd11fcc426718c23c2a4d73051473ded205b240ab46b3d20fc97c0d5bea9eace\": container with ID starting with cd11fcc426718c23c2a4d73051473ded205b240ab46b3d20fc97c0d5bea9eace not found: ID does not exist" Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.208175 4843 scope.go:117] "RemoveContainer" containerID="a334d16e643de3d50bf130d06fb2a9841604bfcf3dc70563ebe0882577b7f280" Mar 18 12:54:15 crc kubenswrapper[4843]: E0318 12:54:15.208436 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a334d16e643de3d50bf130d06fb2a9841604bfcf3dc70563ebe0882577b7f280\": container with ID starting with a334d16e643de3d50bf130d06fb2a9841604bfcf3dc70563ebe0882577b7f280 not found: ID does not exist" containerID="a334d16e643de3d50bf130d06fb2a9841604bfcf3dc70563ebe0882577b7f280" Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.208460 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a334d16e643de3d50bf130d06fb2a9841604bfcf3dc70563ebe0882577b7f280"} err="failed to get container status \"a334d16e643de3d50bf130d06fb2a9841604bfcf3dc70563ebe0882577b7f280\": rpc error: code = NotFound desc = could not find container 
\"a334d16e643de3d50bf130d06fb2a9841604bfcf3dc70563ebe0882577b7f280\": container with ID starting with a334d16e643de3d50bf130d06fb2a9841604bfcf3dc70563ebe0882577b7f280 not found: ID does not exist" Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.281455 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kt6g\" (UniqueName: \"kubernetes.io/projected/4ce59e36-3692-45b8-a868-5883a11dc1c0-kube-api-access-7kt6g\") pod \"4ce59e36-3692-45b8-a868-5883a11dc1c0\" (UID: \"4ce59e36-3692-45b8-a868-5883a11dc1c0\") " Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.281549 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ce59e36-3692-45b8-a868-5883a11dc1c0-utilities\") pod \"4ce59e36-3692-45b8-a868-5883a11dc1c0\" (UID: \"4ce59e36-3692-45b8-a868-5883a11dc1c0\") " Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.281686 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ce59e36-3692-45b8-a868-5883a11dc1c0-catalog-content\") pod \"4ce59e36-3692-45b8-a868-5883a11dc1c0\" (UID: \"4ce59e36-3692-45b8-a868-5883a11dc1c0\") " Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.282356 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ce59e36-3692-45b8-a868-5883a11dc1c0-utilities" (OuterVolumeSpecName: "utilities") pod "4ce59e36-3692-45b8-a868-5883a11dc1c0" (UID: "4ce59e36-3692-45b8-a868-5883a11dc1c0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.289931 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce59e36-3692-45b8-a868-5883a11dc1c0-kube-api-access-7kt6g" (OuterVolumeSpecName: "kube-api-access-7kt6g") pod "4ce59e36-3692-45b8-a868-5883a11dc1c0" (UID: "4ce59e36-3692-45b8-a868-5883a11dc1c0"). InnerVolumeSpecName "kube-api-access-7kt6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.308007 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ce59e36-3692-45b8-a868-5883a11dc1c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ce59e36-3692-45b8-a868-5883a11dc1c0" (UID: "4ce59e36-3692-45b8-a868-5883a11dc1c0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.383995 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kt6g\" (UniqueName: \"kubernetes.io/projected/4ce59e36-3692-45b8-a868-5883a11dc1c0-kube-api-access-7kt6g\") on node \"crc\" DevicePath \"\"" Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.384036 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ce59e36-3692-45b8-a868-5883a11dc1c0-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:54:15 crc kubenswrapper[4843]: I0318 12:54:15.384049 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ce59e36-3692-45b8-a868-5883a11dc1c0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:54:16 crc kubenswrapper[4843]: I0318 12:54:16.115781 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lp28m" Mar 18 12:54:16 crc kubenswrapper[4843]: I0318 12:54:16.152688 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lp28m"] Mar 18 12:54:16 crc kubenswrapper[4843]: I0318 12:54:16.163871 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lp28m"] Mar 18 12:54:16 crc kubenswrapper[4843]: I0318 12:54:16.993571 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ce59e36-3692-45b8-a868-5883a11dc1c0" path="/var/lib/kubelet/pods/4ce59e36-3692-45b8-a868-5883a11dc1c0/volumes" Mar 18 12:54:17 crc kubenswrapper[4843]: I0318 12:54:17.130752 4843 generic.go:334] "Generic (PLEG): container finished" podID="9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3" containerID="c867b95a9a1acc6b85429262ed83ce4492692b3ec99cc5d9e25611f500b1d2e4" exitCode=2 Mar 18 12:54:17 crc kubenswrapper[4843]: I0318 12:54:17.130803 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8" event={"ID":"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3","Type":"ContainerDied","Data":"c867b95a9a1acc6b85429262ed83ce4492692b3ec99cc5d9e25611f500b1d2e4"} Mar 18 12:54:18 crc kubenswrapper[4843]: I0318 12:54:18.597294 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8" Mar 18 12:54:18 crc kubenswrapper[4843]: I0318 12:54:18.760948 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-ssh-key-openstack-edpm-ipam\") pod \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " Mar 18 12:54:18 crc kubenswrapper[4843]: I0318 12:54:18.761107 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkqw7\" (UniqueName: \"kubernetes.io/projected/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-kube-api-access-fkqw7\") pod \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " Mar 18 12:54:18 crc kubenswrapper[4843]: I0318 12:54:18.761134 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-libvirt-secret-0\") pod \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " Mar 18 12:54:18 crc kubenswrapper[4843]: I0318 12:54:18.761183 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-inventory\") pod \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " Mar 18 12:54:18 crc kubenswrapper[4843]: I0318 12:54:18.761299 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-libvirt-combined-ca-bundle\") pod \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\" (UID: \"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3\") " Mar 18 12:54:18 crc kubenswrapper[4843]: I0318 12:54:18.775669 4843 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-kube-api-access-fkqw7" (OuterVolumeSpecName: "kube-api-access-fkqw7") pod "9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3" (UID: "9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3"). InnerVolumeSpecName "kube-api-access-fkqw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:54:18 crc kubenswrapper[4843]: I0318 12:54:18.775790 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3" (UID: "9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:54:18 crc kubenswrapper[4843]: I0318 12:54:18.790006 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-inventory" (OuterVolumeSpecName: "inventory") pod "9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3" (UID: "9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:54:18 crc kubenswrapper[4843]: I0318 12:54:18.791952 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3" (UID: "9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:54:18 crc kubenswrapper[4843]: I0318 12:54:18.795956 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3" (UID: "9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:54:18 crc kubenswrapper[4843]: I0318 12:54:18.864276 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkqw7\" (UniqueName: \"kubernetes.io/projected/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-kube-api-access-fkqw7\") on node \"crc\" DevicePath \"\"" Mar 18 12:54:18 crc kubenswrapper[4843]: I0318 12:54:18.864326 4843 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:54:18 crc kubenswrapper[4843]: I0318 12:54:18.864339 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:54:18 crc kubenswrapper[4843]: I0318 12:54:18.864351 4843 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:54:18 crc kubenswrapper[4843]: I0318 12:54:18.864365 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:54:19 crc kubenswrapper[4843]: I0318 12:54:19.165073 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8" event={"ID":"9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3","Type":"ContainerDied","Data":"6ee135263cfbd89dd522c30fcc18d0a69f8f58cc6287ed0356f7ff7549610126"} Mar 18 12:54:19 crc kubenswrapper[4843]: I0318 12:54:19.165115 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ee135263cfbd89dd522c30fcc18d0a69f8f58cc6287ed0356f7ff7549610126" Mar 18 12:54:19 crc kubenswrapper[4843]: I0318 12:54:19.165134 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.040462 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47"] Mar 18 12:54:36 crc kubenswrapper[4843]: E0318 12:54:36.041811 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f5c9c8-063a-4b13-9b5b-3616a53bdd33" containerName="registry-server" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.041835 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f5c9c8-063a-4b13-9b5b-3616a53bdd33" containerName="registry-server" Mar 18 12:54:36 crc kubenswrapper[4843]: E0318 12:54:36.041863 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f5c9c8-063a-4b13-9b5b-3616a53bdd33" containerName="extract-utilities" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.041875 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f5c9c8-063a-4b13-9b5b-3616a53bdd33" containerName="extract-utilities" Mar 18 12:54:36 crc kubenswrapper[4843]: E0318 12:54:36.041899 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce59e36-3692-45b8-a868-5883a11dc1c0" containerName="registry-server" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.041910 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce59e36-3692-45b8-a868-5883a11dc1c0" 
containerName="registry-server" Mar 18 12:54:36 crc kubenswrapper[4843]: E0318 12:54:36.041934 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce59e36-3692-45b8-a868-5883a11dc1c0" containerName="extract-content" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.041945 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce59e36-3692-45b8-a868-5883a11dc1c0" containerName="extract-content" Mar 18 12:54:36 crc kubenswrapper[4843]: E0318 12:54:36.041971 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce59e36-3692-45b8-a868-5883a11dc1c0" containerName="extract-utilities" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.041981 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce59e36-3692-45b8-a868-5883a11dc1c0" containerName="extract-utilities" Mar 18 12:54:36 crc kubenswrapper[4843]: E0318 12:54:36.042014 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="727dae08-13f0-4310-806c-969324946ac6" containerName="oc" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.042025 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="727dae08-13f0-4310-806c-969324946ac6" containerName="oc" Mar 18 12:54:36 crc kubenswrapper[4843]: E0318 12:54:36.042036 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.042048 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 12:54:36 crc kubenswrapper[4843]: E0318 12:54:36.042070 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f5c9c8-063a-4b13-9b5b-3616a53bdd33" containerName="extract-content" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.042080 4843 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="74f5c9c8-063a-4b13-9b5b-3616a53bdd33" containerName="extract-content" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.042389 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.042420 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="74f5c9c8-063a-4b13-9b5b-3616a53bdd33" containerName="registry-server" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.042435 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="727dae08-13f0-4310-806c-969324946ac6" containerName="oc" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.042457 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce59e36-3692-45b8-a868-5883a11dc1c0" containerName="registry-server" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.043624 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.047753 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.048222 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.048627 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.049729 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.056745 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.066115 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47"] Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.110281 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9vrq\" (UniqueName: \"kubernetes.io/projected/17aa29d2-9988-4fa7-86b4-e62e6f879817-kube-api-access-k9vrq\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftv47\" (UID: \"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.110571 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftv47\" (UID: 
\"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.110640 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftv47\" (UID: \"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.110706 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftv47\" (UID: \"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.110744 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftv47\" (UID: \"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.212221 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftv47\" (UID: \"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.212335 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9vrq\" (UniqueName: \"kubernetes.io/projected/17aa29d2-9988-4fa7-86b4-e62e6f879817-kube-api-access-k9vrq\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftv47\" (UID: \"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.212460 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftv47\" (UID: \"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.212489 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftv47\" (UID: \"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.212514 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftv47\" (UID: \"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.218116 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-ssh-key-openstack-edpm-ipam\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-ftv47\" (UID: \"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.218804 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftv47\" (UID: \"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.225082 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftv47\" (UID: \"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.225856 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftv47\" (UID: \"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.234752 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9vrq\" (UniqueName: \"kubernetes.io/projected/17aa29d2-9988-4fa7-86b4-e62e6f879817-kube-api-access-k9vrq\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftv47\" (UID: \"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" Mar 18 12:54:36 crc kubenswrapper[4843]: I0318 12:54:36.370542 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" Mar 18 12:54:37 crc kubenswrapper[4843]: I0318 12:54:37.004332 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47"] Mar 18 12:54:37 crc kubenswrapper[4843]: I0318 12:54:37.319541 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" event={"ID":"17aa29d2-9988-4fa7-86b4-e62e6f879817","Type":"ContainerStarted","Data":"57658d77766721e5ac6ff0e4f6bc9bea33d1e6a77f7b6cf408c4821c157ac6b6"} Mar 18 12:54:39 crc kubenswrapper[4843]: I0318 12:54:39.382019 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" event={"ID":"17aa29d2-9988-4fa7-86b4-e62e6f879817","Type":"ContainerStarted","Data":"768a0f45982b86419a1acebc633f59c2271cb2d43a9a24f0d7e76558cec5fc4a"} Mar 18 12:54:39 crc kubenswrapper[4843]: I0318 12:54:39.407044 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" podStartSLOduration=2.21796246 podStartE2EDuration="3.407014085s" podCreationTimestamp="2026-03-18 12:54:36 +0000 UTC" firstStartedPulling="2026-03-18 12:54:36.993142442 +0000 UTC m=+2710.708967966" lastFinishedPulling="2026-03-18 12:54:38.182194077 +0000 UTC m=+2711.898019591" observedRunningTime="2026-03-18 12:54:39.401674784 +0000 UTC m=+2713.117500328" watchObservedRunningTime="2026-03-18 12:54:39.407014085 +0000 UTC m=+2713.122839619" Mar 18 12:55:05 crc kubenswrapper[4843]: I0318 12:55:05.206673 4843 scope.go:117] "RemoveContainer" containerID="a9b50367b26fab31d281d048dfe6854d683091c8b0632122aaa08bf861cc864c" Mar 18 12:56:00 crc kubenswrapper[4843]: I0318 12:56:00.159444 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563976-jp28w"] Mar 18 12:56:00 crc kubenswrapper[4843]: I0318 
12:56:00.166417 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563976-jp28w" Mar 18 12:56:00 crc kubenswrapper[4843]: I0318 12:56:00.169706 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:56:00 crc kubenswrapper[4843]: I0318 12:56:00.170397 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:56:00 crc kubenswrapper[4843]: I0318 12:56:00.171485 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 12:56:00 crc kubenswrapper[4843]: I0318 12:56:00.185327 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563976-jp28w"] Mar 18 12:56:00 crc kubenswrapper[4843]: I0318 12:56:00.371331 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwpln\" (UniqueName: \"kubernetes.io/projected/54b44e3a-b093-415f-97ce-cd77f27c01cb-kube-api-access-zwpln\") pod \"auto-csr-approver-29563976-jp28w\" (UID: \"54b44e3a-b093-415f-97ce-cd77f27c01cb\") " pod="openshift-infra/auto-csr-approver-29563976-jp28w" Mar 18 12:56:00 crc kubenswrapper[4843]: I0318 12:56:00.473367 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwpln\" (UniqueName: \"kubernetes.io/projected/54b44e3a-b093-415f-97ce-cd77f27c01cb-kube-api-access-zwpln\") pod \"auto-csr-approver-29563976-jp28w\" (UID: \"54b44e3a-b093-415f-97ce-cd77f27c01cb\") " pod="openshift-infra/auto-csr-approver-29563976-jp28w" Mar 18 12:56:00 crc kubenswrapper[4843]: I0318 12:56:00.495303 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwpln\" (UniqueName: \"kubernetes.io/projected/54b44e3a-b093-415f-97ce-cd77f27c01cb-kube-api-access-zwpln\") pod 
\"auto-csr-approver-29563976-jp28w\" (UID: \"54b44e3a-b093-415f-97ce-cd77f27c01cb\") " pod="openshift-infra/auto-csr-approver-29563976-jp28w" Mar 18 12:56:00 crc kubenswrapper[4843]: I0318 12:56:00.671310 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563976-jp28w" Mar 18 12:56:01 crc kubenswrapper[4843]: I0318 12:56:01.229565 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563976-jp28w"] Mar 18 12:56:01 crc kubenswrapper[4843]: I0318 12:56:01.392743 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563976-jp28w" event={"ID":"54b44e3a-b093-415f-97ce-cd77f27c01cb","Type":"ContainerStarted","Data":"ac0ef36bf0b4c47899709c91d9ccbcde84d13a4f0c7160f947aa81dffbace9a9"} Mar 18 12:56:03 crc kubenswrapper[4843]: I0318 12:56:03.410157 4843 generic.go:334] "Generic (PLEG): container finished" podID="54b44e3a-b093-415f-97ce-cd77f27c01cb" containerID="3b5ee2d0b4d6d6b98ba504f73c2baa53c919a6086a9e06e55710ca697d11b8d8" exitCode=0 Mar 18 12:56:03 crc kubenswrapper[4843]: I0318 12:56:03.410266 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563976-jp28w" event={"ID":"54b44e3a-b093-415f-97ce-cd77f27c01cb","Type":"ContainerDied","Data":"3b5ee2d0b4d6d6b98ba504f73c2baa53c919a6086a9e06e55710ca697d11b8d8"} Mar 18 12:56:04 crc kubenswrapper[4843]: I0318 12:56:04.836342 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563976-jp28w" Mar 18 12:56:04 crc kubenswrapper[4843]: I0318 12:56:04.866636 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwpln\" (UniqueName: \"kubernetes.io/projected/54b44e3a-b093-415f-97ce-cd77f27c01cb-kube-api-access-zwpln\") pod \"54b44e3a-b093-415f-97ce-cd77f27c01cb\" (UID: \"54b44e3a-b093-415f-97ce-cd77f27c01cb\") " Mar 18 12:56:04 crc kubenswrapper[4843]: I0318 12:56:04.876772 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54b44e3a-b093-415f-97ce-cd77f27c01cb-kube-api-access-zwpln" (OuterVolumeSpecName: "kube-api-access-zwpln") pod "54b44e3a-b093-415f-97ce-cd77f27c01cb" (UID: "54b44e3a-b093-415f-97ce-cd77f27c01cb"). InnerVolumeSpecName "kube-api-access-zwpln". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:56:04 crc kubenswrapper[4843]: I0318 12:56:04.968887 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwpln\" (UniqueName: \"kubernetes.io/projected/54b44e3a-b093-415f-97ce-cd77f27c01cb-kube-api-access-zwpln\") on node \"crc\" DevicePath \"\"" Mar 18 12:56:05 crc kubenswrapper[4843]: I0318 12:56:05.432010 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563976-jp28w" event={"ID":"54b44e3a-b093-415f-97ce-cd77f27c01cb","Type":"ContainerDied","Data":"ac0ef36bf0b4c47899709c91d9ccbcde84d13a4f0c7160f947aa81dffbace9a9"} Mar 18 12:56:05 crc kubenswrapper[4843]: I0318 12:56:05.432052 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac0ef36bf0b4c47899709c91d9ccbcde84d13a4f0c7160f947aa81dffbace9a9" Mar 18 12:56:05 crc kubenswrapper[4843]: I0318 12:56:05.432583 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563976-jp28w" Mar 18 12:56:05 crc kubenswrapper[4843]: I0318 12:56:05.937269 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563970-n6nzp"] Mar 18 12:56:05 crc kubenswrapper[4843]: I0318 12:56:05.944332 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563970-n6nzp"] Mar 18 12:56:06 crc kubenswrapper[4843]: I0318 12:56:06.994902 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0d24420-11b7-4859-9b97-99bc5a58c8a1" path="/var/lib/kubelet/pods/d0d24420-11b7-4859-9b97-99bc5a58c8a1/volumes" Mar 18 12:56:20 crc kubenswrapper[4843]: I0318 12:56:20.034995 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:56:20 crc kubenswrapper[4843]: I0318 12:56:20.035531 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:56:25 crc kubenswrapper[4843]: E0318 12:56:25.381592 4843 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17aa29d2_9988_4fa7_86b4_e62e6f879817.slice/crio-conmon-768a0f45982b86419a1acebc633f59c2271cb2d43a9a24f0d7e76558cec5fc4a.scope\": RecentStats: unable to find data in memory cache]" Mar 18 12:56:25 crc kubenswrapper[4843]: I0318 12:56:25.649340 4843 generic.go:334] "Generic (PLEG): container finished" 
podID="17aa29d2-9988-4fa7-86b4-e62e6f879817" containerID="768a0f45982b86419a1acebc633f59c2271cb2d43a9a24f0d7e76558cec5fc4a" exitCode=2 Mar 18 12:56:25 crc kubenswrapper[4843]: I0318 12:56:25.649395 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" event={"ID":"17aa29d2-9988-4fa7-86b4-e62e6f879817","Type":"ContainerDied","Data":"768a0f45982b86419a1acebc633f59c2271cb2d43a9a24f0d7e76558cec5fc4a"} Mar 18 12:56:27 crc kubenswrapper[4843]: I0318 12:56:27.039409 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" Mar 18 12:56:27 crc kubenswrapper[4843]: I0318 12:56:27.129364 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9vrq\" (UniqueName: \"kubernetes.io/projected/17aa29d2-9988-4fa7-86b4-e62e6f879817-kube-api-access-k9vrq\") pod \"17aa29d2-9988-4fa7-86b4-e62e6f879817\" (UID: \"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " Mar 18 12:56:27 crc kubenswrapper[4843]: I0318 12:56:27.129501 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-inventory\") pod \"17aa29d2-9988-4fa7-86b4-e62e6f879817\" (UID: \"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " Mar 18 12:56:27 crc kubenswrapper[4843]: I0318 12:56:27.129631 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-libvirt-secret-0\") pod \"17aa29d2-9988-4fa7-86b4-e62e6f879817\" (UID: \"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " Mar 18 12:56:27 crc kubenswrapper[4843]: I0318 12:56:27.129710 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-ssh-key-openstack-edpm-ipam\") pod \"17aa29d2-9988-4fa7-86b4-e62e6f879817\" (UID: \"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " Mar 18 12:56:27 crc kubenswrapper[4843]: I0318 12:56:27.129806 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-libvirt-combined-ca-bundle\") pod \"17aa29d2-9988-4fa7-86b4-e62e6f879817\" (UID: \"17aa29d2-9988-4fa7-86b4-e62e6f879817\") " Mar 18 12:56:27 crc kubenswrapper[4843]: I0318 12:56:27.134880 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "17aa29d2-9988-4fa7-86b4-e62e6f879817" (UID: "17aa29d2-9988-4fa7-86b4-e62e6f879817"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:56:27 crc kubenswrapper[4843]: I0318 12:56:27.135508 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17aa29d2-9988-4fa7-86b4-e62e6f879817-kube-api-access-k9vrq" (OuterVolumeSpecName: "kube-api-access-k9vrq") pod "17aa29d2-9988-4fa7-86b4-e62e6f879817" (UID: "17aa29d2-9988-4fa7-86b4-e62e6f879817"). InnerVolumeSpecName "kube-api-access-k9vrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:56:27 crc kubenswrapper[4843]: I0318 12:56:27.157891 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-inventory" (OuterVolumeSpecName: "inventory") pod "17aa29d2-9988-4fa7-86b4-e62e6f879817" (UID: "17aa29d2-9988-4fa7-86b4-e62e6f879817"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:56:27 crc kubenswrapper[4843]: I0318 12:56:27.166776 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "17aa29d2-9988-4fa7-86b4-e62e6f879817" (UID: "17aa29d2-9988-4fa7-86b4-e62e6f879817"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:56:27 crc kubenswrapper[4843]: I0318 12:56:27.191216 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "17aa29d2-9988-4fa7-86b4-e62e6f879817" (UID: "17aa29d2-9988-4fa7-86b4-e62e6f879817"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:56:27 crc kubenswrapper[4843]: I0318 12:56:27.233264 4843 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:56:27 crc kubenswrapper[4843]: I0318 12:56:27.233301 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:56:27 crc kubenswrapper[4843]: I0318 12:56:27.233315 4843 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:56:27 crc kubenswrapper[4843]: I0318 12:56:27.233324 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9vrq\" (UniqueName: 
\"kubernetes.io/projected/17aa29d2-9988-4fa7-86b4-e62e6f879817-kube-api-access-k9vrq\") on node \"crc\" DevicePath \"\"" Mar 18 12:56:27 crc kubenswrapper[4843]: I0318 12:56:27.233334 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17aa29d2-9988-4fa7-86b4-e62e6f879817-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:56:27 crc kubenswrapper[4843]: I0318 12:56:27.666276 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" event={"ID":"17aa29d2-9988-4fa7-86b4-e62e6f879817","Type":"ContainerDied","Data":"57658d77766721e5ac6ff0e4f6bc9bea33d1e6a77f7b6cf408c4821c157ac6b6"} Mar 18 12:56:27 crc kubenswrapper[4843]: I0318 12:56:27.666317 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57658d77766721e5ac6ff0e4f6bc9bea33d1e6a77f7b6cf408c4821c157ac6b6" Mar 18 12:56:27 crc kubenswrapper[4843]: I0318 12:56:27.666333 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftv47" Mar 18 12:56:50 crc kubenswrapper[4843]: I0318 12:56:50.035409 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:56:50 crc kubenswrapper[4843]: I0318 12:56:50.036006 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.033343 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q"] Mar 18 12:57:05 crc kubenswrapper[4843]: E0318 12:57:05.034434 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17aa29d2-9988-4fa7-86b4-e62e6f879817" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.034457 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="17aa29d2-9988-4fa7-86b4-e62e6f879817" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 12:57:05 crc kubenswrapper[4843]: E0318 12:57:05.034490 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b44e3a-b093-415f-97ce-cd77f27c01cb" containerName="oc" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.034497 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b44e3a-b093-415f-97ce-cd77f27c01cb" containerName="oc" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.034790 4843 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="54b44e3a-b093-415f-97ce-cd77f27c01cb" containerName="oc" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.034812 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="17aa29d2-9988-4fa7-86b4-e62e6f879817" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.035688 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.039417 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.044804 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.045038 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.045218 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.045461 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.045956 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q"] Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.111887 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kng5q\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.111937 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzrhk\" (UniqueName: \"kubernetes.io/projected/98e7a8b0-08df-4140-94f7-135db9497789-kube-api-access-xzrhk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kng5q\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.112155 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kng5q\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.112315 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kng5q\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.112537 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kng5q\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.214927 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzrhk\" (UniqueName: \"kubernetes.io/projected/98e7a8b0-08df-4140-94f7-135db9497789-kube-api-access-xzrhk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kng5q\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.215018 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kng5q\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.215183 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kng5q\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.215291 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kng5q\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.215472 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-libvirt-combined-ca-bundle\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-kng5q\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.222199 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kng5q\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.223250 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kng5q\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.223551 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kng5q\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.225446 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kng5q\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.235560 4843 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xzrhk\" (UniqueName: \"kubernetes.io/projected/98e7a8b0-08df-4140-94f7-135db9497789-kube-api-access-xzrhk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kng5q\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.366083 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.421094 4843 scope.go:117] "RemoveContainer" containerID="47bf1a38d829c1cf2ebae9958013f33b7c55f6d3563c72a06132d0d1a973466a" Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.928970 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q"] Mar 18 12:57:05 crc kubenswrapper[4843]: I0318 12:57:05.932028 4843 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:57:06 crc kubenswrapper[4843]: I0318 12:57:06.106938 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" event={"ID":"98e7a8b0-08df-4140-94f7-135db9497789","Type":"ContainerStarted","Data":"902c3ea75472e408580c46f3bb215a6faee3c929bbc31d8912d7bda7ba7ee03b"} Mar 18 12:57:07 crc kubenswrapper[4843]: I0318 12:57:07.117017 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" event={"ID":"98e7a8b0-08df-4140-94f7-135db9497789","Type":"ContainerStarted","Data":"1d41df65223b7092cd19009ed3b09170723bceca41ee4af18fa44a076a3f54a7"} Mar 18 12:57:07 crc kubenswrapper[4843]: I0318 12:57:07.145467 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" podStartSLOduration=1.8981446279999998 
podStartE2EDuration="2.145424185s" podCreationTimestamp="2026-03-18 12:57:05 +0000 UTC" firstStartedPulling="2026-03-18 12:57:05.931717331 +0000 UTC m=+2859.647542855" lastFinishedPulling="2026-03-18 12:57:06.178996888 +0000 UTC m=+2859.894822412" observedRunningTime="2026-03-18 12:57:07.139194138 +0000 UTC m=+2860.855019702" watchObservedRunningTime="2026-03-18 12:57:07.145424185 +0000 UTC m=+2860.861249719" Mar 18 12:57:20 crc kubenswrapper[4843]: I0318 12:57:20.036336 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:57:20 crc kubenswrapper[4843]: I0318 12:57:20.037048 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:57:20 crc kubenswrapper[4843]: I0318 12:57:20.037148 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 12:57:20 crc kubenswrapper[4843]: I0318 12:57:20.038413 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8039e005f14b8978061b52f5bdd6ef705088f9029a2fdca0fced087b8ccadae4"} pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:57:20 crc kubenswrapper[4843]: I0318 12:57:20.038548 4843 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" containerID="cri-o://8039e005f14b8978061b52f5bdd6ef705088f9029a2fdca0fced087b8ccadae4" gracePeriod=600
Mar 18 12:57:20 crc kubenswrapper[4843]: I0318 12:57:20.253366 4843 generic.go:334] "Generic (PLEG): container finished" podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="8039e005f14b8978061b52f5bdd6ef705088f9029a2fdca0fced087b8ccadae4" exitCode=0
Mar 18 12:57:20 crc kubenswrapper[4843]: I0318 12:57:20.253448 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"8039e005f14b8978061b52f5bdd6ef705088f9029a2fdca0fced087b8ccadae4"}
Mar 18 12:57:20 crc kubenswrapper[4843]: I0318 12:57:20.253716 4843 scope.go:117] "RemoveContainer" containerID="ced19e3ca06c31c240a28fc743b0854cda4bc6b7333009f17571fd2123f54276"
Mar 18 12:57:21 crc kubenswrapper[4843]: I0318 12:57:21.271991 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9"}
Mar 18 12:58:00 crc kubenswrapper[4843]: I0318 12:58:00.156824 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563978-9cs6p"]
Mar 18 12:58:00 crc kubenswrapper[4843]: I0318 12:58:00.158759 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563978-9cs6p"
Mar 18 12:58:00 crc kubenswrapper[4843]: I0318 12:58:00.161415 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 12:58:00 crc kubenswrapper[4843]: I0318 12:58:00.161544 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 12:58:00 crc kubenswrapper[4843]: I0318 12:58:00.161668 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk"
Mar 18 12:58:00 crc kubenswrapper[4843]: I0318 12:58:00.170732 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563978-9cs6p"]
Mar 18 12:58:00 crc kubenswrapper[4843]: I0318 12:58:00.218662 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk9nm\" (UniqueName: \"kubernetes.io/projected/b692b03e-5e65-4067-ac66-d0a9d5becb13-kube-api-access-vk9nm\") pod \"auto-csr-approver-29563978-9cs6p\" (UID: \"b692b03e-5e65-4067-ac66-d0a9d5becb13\") " pod="openshift-infra/auto-csr-approver-29563978-9cs6p"
Mar 18 12:58:00 crc kubenswrapper[4843]: I0318 12:58:00.321096 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk9nm\" (UniqueName: \"kubernetes.io/projected/b692b03e-5e65-4067-ac66-d0a9d5becb13-kube-api-access-vk9nm\") pod \"auto-csr-approver-29563978-9cs6p\" (UID: \"b692b03e-5e65-4067-ac66-d0a9d5becb13\") " pod="openshift-infra/auto-csr-approver-29563978-9cs6p"
Mar 18 12:58:00 crc kubenswrapper[4843]: I0318 12:58:00.342045 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk9nm\" (UniqueName: \"kubernetes.io/projected/b692b03e-5e65-4067-ac66-d0a9d5becb13-kube-api-access-vk9nm\") pod \"auto-csr-approver-29563978-9cs6p\" (UID: \"b692b03e-5e65-4067-ac66-d0a9d5becb13\") " pod="openshift-infra/auto-csr-approver-29563978-9cs6p"
Mar 18 12:58:00 crc kubenswrapper[4843]: I0318 12:58:00.510742 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563978-9cs6p"
Mar 18 12:58:00 crc kubenswrapper[4843]: I0318 12:58:00.983469 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563978-9cs6p"]
Mar 18 12:58:01 crc kubenswrapper[4843]: I0318 12:58:01.726238 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563978-9cs6p" event={"ID":"b692b03e-5e65-4067-ac66-d0a9d5becb13","Type":"ContainerStarted","Data":"1c81ab6143cb7e37ceb478f92450219058b02b1a7b4686659dd7dd9185ab5f86"}
Mar 18 12:58:02 crc kubenswrapper[4843]: I0318 12:58:02.736840 4843 generic.go:334] "Generic (PLEG): container finished" podID="b692b03e-5e65-4067-ac66-d0a9d5becb13" containerID="78b6318933ff8ee3b2494c40057ab6a73b56732944b5d87b198124495c25897f" exitCode=0
Mar 18 12:58:02 crc kubenswrapper[4843]: I0318 12:58:02.736908 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563978-9cs6p" event={"ID":"b692b03e-5e65-4067-ac66-d0a9d5becb13","Type":"ContainerDied","Data":"78b6318933ff8ee3b2494c40057ab6a73b56732944b5d87b198124495c25897f"}
Mar 18 12:58:04 crc kubenswrapper[4843]: I0318 12:58:04.079371 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563978-9cs6p"
Mar 18 12:58:04 crc kubenswrapper[4843]: I0318 12:58:04.199767 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk9nm\" (UniqueName: \"kubernetes.io/projected/b692b03e-5e65-4067-ac66-d0a9d5becb13-kube-api-access-vk9nm\") pod \"b692b03e-5e65-4067-ac66-d0a9d5becb13\" (UID: \"b692b03e-5e65-4067-ac66-d0a9d5becb13\") "
Mar 18 12:58:04 crc kubenswrapper[4843]: I0318 12:58:04.205597 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b692b03e-5e65-4067-ac66-d0a9d5becb13-kube-api-access-vk9nm" (OuterVolumeSpecName: "kube-api-access-vk9nm") pod "b692b03e-5e65-4067-ac66-d0a9d5becb13" (UID: "b692b03e-5e65-4067-ac66-d0a9d5becb13"). InnerVolumeSpecName "kube-api-access-vk9nm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:58:04 crc kubenswrapper[4843]: I0318 12:58:04.303063 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk9nm\" (UniqueName: \"kubernetes.io/projected/b692b03e-5e65-4067-ac66-d0a9d5becb13-kube-api-access-vk9nm\") on node \"crc\" DevicePath \"\""
Mar 18 12:58:04 crc kubenswrapper[4843]: I0318 12:58:04.756916 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563978-9cs6p" event={"ID":"b692b03e-5e65-4067-ac66-d0a9d5becb13","Type":"ContainerDied","Data":"1c81ab6143cb7e37ceb478f92450219058b02b1a7b4686659dd7dd9185ab5f86"}
Mar 18 12:58:04 crc kubenswrapper[4843]: I0318 12:58:04.756973 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c81ab6143cb7e37ceb478f92450219058b02b1a7b4686659dd7dd9185ab5f86"
Mar 18 12:58:04 crc kubenswrapper[4843]: I0318 12:58:04.756993 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563978-9cs6p"
Mar 18 12:58:05 crc kubenswrapper[4843]: I0318 12:58:05.159611 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563972-2hl2b"]
Mar 18 12:58:05 crc kubenswrapper[4843]: I0318 12:58:05.167899 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563972-2hl2b"]
Mar 18 12:58:06 crc kubenswrapper[4843]: I0318 12:58:06.995806 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c952cae7-b1db-4558-9084-3294be4a4d57" path="/var/lib/kubelet/pods/c952cae7-b1db-4558-9084-3294be4a4d57/volumes"
Mar 18 12:58:41 crc kubenswrapper[4843]: I0318 12:58:41.313609 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7jwrt"]
Mar 18 12:58:41 crc kubenswrapper[4843]: E0318 12:58:41.317298 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b692b03e-5e65-4067-ac66-d0a9d5becb13" containerName="oc"
Mar 18 12:58:41 crc kubenswrapper[4843]: I0318 12:58:41.317320 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="b692b03e-5e65-4067-ac66-d0a9d5becb13" containerName="oc"
Mar 18 12:58:41 crc kubenswrapper[4843]: I0318 12:58:41.317586 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="b692b03e-5e65-4067-ac66-d0a9d5becb13" containerName="oc"
Mar 18 12:58:41 crc kubenswrapper[4843]: I0318 12:58:41.319584 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7jwrt"
Mar 18 12:58:41 crc kubenswrapper[4843]: I0318 12:58:41.333096 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7jwrt"]
Mar 18 12:58:41 crc kubenswrapper[4843]: I0318 12:58:41.366037 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqs9n\" (UniqueName: \"kubernetes.io/projected/c9d05f5c-6230-4c04-8e29-802dbb1eddd4-kube-api-access-mqs9n\") pod \"community-operators-7jwrt\" (UID: \"c9d05f5c-6230-4c04-8e29-802dbb1eddd4\") " pod="openshift-marketplace/community-operators-7jwrt"
Mar 18 12:58:41 crc kubenswrapper[4843]: I0318 12:58:41.366303 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d05f5c-6230-4c04-8e29-802dbb1eddd4-utilities\") pod \"community-operators-7jwrt\" (UID: \"c9d05f5c-6230-4c04-8e29-802dbb1eddd4\") " pod="openshift-marketplace/community-operators-7jwrt"
Mar 18 12:58:41 crc kubenswrapper[4843]: I0318 12:58:41.367151 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9d05f5c-6230-4c04-8e29-802dbb1eddd4-catalog-content\") pod \"community-operators-7jwrt\" (UID: \"c9d05f5c-6230-4c04-8e29-802dbb1eddd4\") " pod="openshift-marketplace/community-operators-7jwrt"
Mar 18 12:58:41 crc kubenswrapper[4843]: I0318 12:58:41.468341 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqs9n\" (UniqueName: \"kubernetes.io/projected/c9d05f5c-6230-4c04-8e29-802dbb1eddd4-kube-api-access-mqs9n\") pod \"community-operators-7jwrt\" (UID: \"c9d05f5c-6230-4c04-8e29-802dbb1eddd4\") " pod="openshift-marketplace/community-operators-7jwrt"
Mar 18 12:58:41 crc kubenswrapper[4843]: I0318 12:58:41.468801 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d05f5c-6230-4c04-8e29-802dbb1eddd4-utilities\") pod \"community-operators-7jwrt\" (UID: \"c9d05f5c-6230-4c04-8e29-802dbb1eddd4\") " pod="openshift-marketplace/community-operators-7jwrt"
Mar 18 12:58:41 crc kubenswrapper[4843]: I0318 12:58:41.469022 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9d05f5c-6230-4c04-8e29-802dbb1eddd4-catalog-content\") pod \"community-operators-7jwrt\" (UID: \"c9d05f5c-6230-4c04-8e29-802dbb1eddd4\") " pod="openshift-marketplace/community-operators-7jwrt"
Mar 18 12:58:41 crc kubenswrapper[4843]: I0318 12:58:41.469445 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d05f5c-6230-4c04-8e29-802dbb1eddd4-utilities\") pod \"community-operators-7jwrt\" (UID: \"c9d05f5c-6230-4c04-8e29-802dbb1eddd4\") " pod="openshift-marketplace/community-operators-7jwrt"
Mar 18 12:58:41 crc kubenswrapper[4843]: I0318 12:58:41.469630 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9d05f5c-6230-4c04-8e29-802dbb1eddd4-catalog-content\") pod \"community-operators-7jwrt\" (UID: \"c9d05f5c-6230-4c04-8e29-802dbb1eddd4\") " pod="openshift-marketplace/community-operators-7jwrt"
Mar 18 12:58:41 crc kubenswrapper[4843]: I0318 12:58:41.491731 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqs9n\" (UniqueName: \"kubernetes.io/projected/c9d05f5c-6230-4c04-8e29-802dbb1eddd4-kube-api-access-mqs9n\") pod \"community-operators-7jwrt\" (UID: \"c9d05f5c-6230-4c04-8e29-802dbb1eddd4\") " pod="openshift-marketplace/community-operators-7jwrt"
Mar 18 12:58:41 crc kubenswrapper[4843]: I0318 12:58:41.653598 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7jwrt"
Mar 18 12:58:42 crc kubenswrapper[4843]: I0318 12:58:42.153176 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7jwrt"]
Mar 18 12:58:42 crc kubenswrapper[4843]: I0318 12:58:42.153952 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jwrt" event={"ID":"c9d05f5c-6230-4c04-8e29-802dbb1eddd4","Type":"ContainerStarted","Data":"caafe1768e62688c0a8d0e5b2ec5e9a9003066be8ba3afc7bfc04902590920fb"}
Mar 18 12:58:43 crc kubenswrapper[4843]: I0318 12:58:43.164305 4843 generic.go:334] "Generic (PLEG): container finished" podID="c9d05f5c-6230-4c04-8e29-802dbb1eddd4" containerID="eaadaf6ba4bd587975c91c47dc60b92dbc7bb4522d0b7255f1b1a231ed634ae7" exitCode=0
Mar 18 12:58:43 crc kubenswrapper[4843]: I0318 12:58:43.164357 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jwrt" event={"ID":"c9d05f5c-6230-4c04-8e29-802dbb1eddd4","Type":"ContainerDied","Data":"eaadaf6ba4bd587975c91c47dc60b92dbc7bb4522d0b7255f1b1a231ed634ae7"}
Mar 18 12:58:45 crc kubenswrapper[4843]: I0318 12:58:45.188993 4843 generic.go:334] "Generic (PLEG): container finished" podID="c9d05f5c-6230-4c04-8e29-802dbb1eddd4" containerID="82f839a84f12c0ce658c69e214e76f6d1c6db1fcd1fc153bfebc6862bf3cddc9" exitCode=0
Mar 18 12:58:45 crc kubenswrapper[4843]: I0318 12:58:45.189244 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jwrt" event={"ID":"c9d05f5c-6230-4c04-8e29-802dbb1eddd4","Type":"ContainerDied","Data":"82f839a84f12c0ce658c69e214e76f6d1c6db1fcd1fc153bfebc6862bf3cddc9"}
Mar 18 12:58:46 crc kubenswrapper[4843]: I0318 12:58:46.203945 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jwrt" event={"ID":"c9d05f5c-6230-4c04-8e29-802dbb1eddd4","Type":"ContainerStarted","Data":"91d730c0fe70aed035d57b7ab0f508a6e27f7929e655fc291d43302000dba8a2"}
Mar 18 12:58:46 crc kubenswrapper[4843]: I0318 12:58:46.234208 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7jwrt" podStartSLOduration=2.661338813 podStartE2EDuration="5.234187191s" podCreationTimestamp="2026-03-18 12:58:41 +0000 UTC" firstStartedPulling="2026-03-18 12:58:43.167203441 +0000 UTC m=+2956.883028965" lastFinishedPulling="2026-03-18 12:58:45.740051809 +0000 UTC m=+2959.455877343" observedRunningTime="2026-03-18 12:58:46.228295614 +0000 UTC m=+2959.944121138" watchObservedRunningTime="2026-03-18 12:58:46.234187191 +0000 UTC m=+2959.950012715"
Mar 18 12:58:48 crc kubenswrapper[4843]: I0318 12:58:48.223280 4843 generic.go:334] "Generic (PLEG): container finished" podID="98e7a8b0-08df-4140-94f7-135db9497789" containerID="1d41df65223b7092cd19009ed3b09170723bceca41ee4af18fa44a076a3f54a7" exitCode=2
Mar 18 12:58:48 crc kubenswrapper[4843]: I0318 12:58:48.223501 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" event={"ID":"98e7a8b0-08df-4140-94f7-135db9497789","Type":"ContainerDied","Data":"1d41df65223b7092cd19009ed3b09170723bceca41ee4af18fa44a076a3f54a7"}
Mar 18 12:58:49 crc kubenswrapper[4843]: I0318 12:58:49.771669 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q"
Mar 18 12:58:49 crc kubenswrapper[4843]: I0318 12:58:49.864318 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-ssh-key-openstack-edpm-ipam\") pod \"98e7a8b0-08df-4140-94f7-135db9497789\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") "
Mar 18 12:58:49 crc kubenswrapper[4843]: I0318 12:58:49.864422 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzrhk\" (UniqueName: \"kubernetes.io/projected/98e7a8b0-08df-4140-94f7-135db9497789-kube-api-access-xzrhk\") pod \"98e7a8b0-08df-4140-94f7-135db9497789\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") "
Mar 18 12:58:49 crc kubenswrapper[4843]: I0318 12:58:49.864514 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-libvirt-secret-0\") pod \"98e7a8b0-08df-4140-94f7-135db9497789\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") "
Mar 18 12:58:49 crc kubenswrapper[4843]: I0318 12:58:49.864627 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-inventory\") pod \"98e7a8b0-08df-4140-94f7-135db9497789\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") "
Mar 18 12:58:49 crc kubenswrapper[4843]: I0318 12:58:49.864717 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-libvirt-combined-ca-bundle\") pod \"98e7a8b0-08df-4140-94f7-135db9497789\" (UID: \"98e7a8b0-08df-4140-94f7-135db9497789\") "
Mar 18 12:58:49 crc kubenswrapper[4843]: I0318 12:58:49.871707 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98e7a8b0-08df-4140-94f7-135db9497789-kube-api-access-xzrhk" (OuterVolumeSpecName: "kube-api-access-xzrhk") pod "98e7a8b0-08df-4140-94f7-135db9497789" (UID: "98e7a8b0-08df-4140-94f7-135db9497789"). InnerVolumeSpecName "kube-api-access-xzrhk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:58:49 crc kubenswrapper[4843]: I0318 12:58:49.871776 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "98e7a8b0-08df-4140-94f7-135db9497789" (UID: "98e7a8b0-08df-4140-94f7-135db9497789"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:58:49 crc kubenswrapper[4843]: I0318 12:58:49.893221 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "98e7a8b0-08df-4140-94f7-135db9497789" (UID: "98e7a8b0-08df-4140-94f7-135db9497789"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:58:49 crc kubenswrapper[4843]: I0318 12:58:49.905009 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "98e7a8b0-08df-4140-94f7-135db9497789" (UID: "98e7a8b0-08df-4140-94f7-135db9497789"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:58:49 crc kubenswrapper[4843]: I0318 12:58:49.910449 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-inventory" (OuterVolumeSpecName: "inventory") pod "98e7a8b0-08df-4140-94f7-135db9497789" (UID: "98e7a8b0-08df-4140-94f7-135db9497789"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:58:49 crc kubenswrapper[4843]: I0318 12:58:49.966702 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 12:58:49 crc kubenswrapper[4843]: I0318 12:58:49.966742 4843 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:58:49 crc kubenswrapper[4843]: I0318 12:58:49.966757 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 12:58:49 crc kubenswrapper[4843]: I0318 12:58:49.966791 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzrhk\" (UniqueName: \"kubernetes.io/projected/98e7a8b0-08df-4140-94f7-135db9497789-kube-api-access-xzrhk\") on node \"crc\" DevicePath \"\""
Mar 18 12:58:49 crc kubenswrapper[4843]: I0318 12:58:49.966804 4843 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/98e7a8b0-08df-4140-94f7-135db9497789-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Mar 18 12:58:50 crc kubenswrapper[4843]: I0318 12:58:50.250226 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q" event={"ID":"98e7a8b0-08df-4140-94f7-135db9497789","Type":"ContainerDied","Data":"902c3ea75472e408580c46f3bb215a6faee3c929bbc31d8912d7bda7ba7ee03b"}
Mar 18 12:58:50 crc kubenswrapper[4843]: I0318 12:58:50.250285 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="902c3ea75472e408580c46f3bb215a6faee3c929bbc31d8912d7bda7ba7ee03b"
Mar 18 12:58:50 crc kubenswrapper[4843]: I0318 12:58:50.250395 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kng5q"
Mar 18 12:58:51 crc kubenswrapper[4843]: I0318 12:58:51.654108 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7jwrt"
Mar 18 12:58:51 crc kubenswrapper[4843]: I0318 12:58:51.657677 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7jwrt"
Mar 18 12:58:51 crc kubenswrapper[4843]: I0318 12:58:51.727762 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7jwrt"
Mar 18 12:58:52 crc kubenswrapper[4843]: I0318 12:58:52.323896 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7jwrt"
Mar 18 12:58:52 crc kubenswrapper[4843]: I0318 12:58:52.393835 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7jwrt"]
Mar 18 12:58:54 crc kubenswrapper[4843]: I0318 12:58:54.284953 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7jwrt" podUID="c9d05f5c-6230-4c04-8e29-802dbb1eddd4" containerName="registry-server" containerID="cri-o://91d730c0fe70aed035d57b7ab0f508a6e27f7929e655fc291d43302000dba8a2" gracePeriod=2
Mar 18 12:58:54 crc kubenswrapper[4843]: I0318 12:58:54.849504 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7jwrt"
Mar 18 12:58:54 crc kubenswrapper[4843]: I0318 12:58:54.891306 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9d05f5c-6230-4c04-8e29-802dbb1eddd4-catalog-content\") pod \"c9d05f5c-6230-4c04-8e29-802dbb1eddd4\" (UID: \"c9d05f5c-6230-4c04-8e29-802dbb1eddd4\") "
Mar 18 12:58:54 crc kubenswrapper[4843]: I0318 12:58:54.891500 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d05f5c-6230-4c04-8e29-802dbb1eddd4-utilities\") pod \"c9d05f5c-6230-4c04-8e29-802dbb1eddd4\" (UID: \"c9d05f5c-6230-4c04-8e29-802dbb1eddd4\") "
Mar 18 12:58:54 crc kubenswrapper[4843]: I0318 12:58:54.891536 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqs9n\" (UniqueName: \"kubernetes.io/projected/c9d05f5c-6230-4c04-8e29-802dbb1eddd4-kube-api-access-mqs9n\") pod \"c9d05f5c-6230-4c04-8e29-802dbb1eddd4\" (UID: \"c9d05f5c-6230-4c04-8e29-802dbb1eddd4\") "
Mar 18 12:58:54 crc kubenswrapper[4843]: I0318 12:58:54.892547 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9d05f5c-6230-4c04-8e29-802dbb1eddd4-utilities" (OuterVolumeSpecName: "utilities") pod "c9d05f5c-6230-4c04-8e29-802dbb1eddd4" (UID: "c9d05f5c-6230-4c04-8e29-802dbb1eddd4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:58:54 crc kubenswrapper[4843]: I0318 12:58:54.903171 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9d05f5c-6230-4c04-8e29-802dbb1eddd4-kube-api-access-mqs9n" (OuterVolumeSpecName: "kube-api-access-mqs9n") pod "c9d05f5c-6230-4c04-8e29-802dbb1eddd4" (UID: "c9d05f5c-6230-4c04-8e29-802dbb1eddd4"). InnerVolumeSpecName "kube-api-access-mqs9n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:58:54 crc kubenswrapper[4843]: I0318 12:58:54.954401 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9d05f5c-6230-4c04-8e29-802dbb1eddd4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9d05f5c-6230-4c04-8e29-802dbb1eddd4" (UID: "c9d05f5c-6230-4c04-8e29-802dbb1eddd4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:58:54 crc kubenswrapper[4843]: I0318 12:58:54.993792 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d05f5c-6230-4c04-8e29-802dbb1eddd4-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 12:58:54 crc kubenswrapper[4843]: I0318 12:58:54.994083 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqs9n\" (UniqueName: \"kubernetes.io/projected/c9d05f5c-6230-4c04-8e29-802dbb1eddd4-kube-api-access-mqs9n\") on node \"crc\" DevicePath \"\""
Mar 18 12:58:54 crc kubenswrapper[4843]: I0318 12:58:54.994097 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9d05f5c-6230-4c04-8e29-802dbb1eddd4-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 12:58:55 crc kubenswrapper[4843]: I0318 12:58:55.304690 4843 generic.go:334] "Generic (PLEG): container finished" podID="c9d05f5c-6230-4c04-8e29-802dbb1eddd4" containerID="91d730c0fe70aed035d57b7ab0f508a6e27f7929e655fc291d43302000dba8a2" exitCode=0
Mar 18 12:58:55 crc kubenswrapper[4843]: I0318 12:58:55.304755 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jwrt" event={"ID":"c9d05f5c-6230-4c04-8e29-802dbb1eddd4","Type":"ContainerDied","Data":"91d730c0fe70aed035d57b7ab0f508a6e27f7929e655fc291d43302000dba8a2"}
Mar 18 12:58:55 crc kubenswrapper[4843]: I0318 12:58:55.304778 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7jwrt"
Mar 18 12:58:55 crc kubenswrapper[4843]: I0318 12:58:55.304807 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jwrt" event={"ID":"c9d05f5c-6230-4c04-8e29-802dbb1eddd4","Type":"ContainerDied","Data":"caafe1768e62688c0a8d0e5b2ec5e9a9003066be8ba3afc7bfc04902590920fb"}
Mar 18 12:58:55 crc kubenswrapper[4843]: I0318 12:58:55.304831 4843 scope.go:117] "RemoveContainer" containerID="91d730c0fe70aed035d57b7ab0f508a6e27f7929e655fc291d43302000dba8a2"
Mar 18 12:58:55 crc kubenswrapper[4843]: I0318 12:58:55.342495 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7jwrt"]
Mar 18 12:58:55 crc kubenswrapper[4843]: I0318 12:58:55.344127 4843 scope.go:117] "RemoveContainer" containerID="82f839a84f12c0ce658c69e214e76f6d1c6db1fcd1fc153bfebc6862bf3cddc9"
Mar 18 12:58:55 crc kubenswrapper[4843]: I0318 12:58:55.353239 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7jwrt"]
Mar 18 12:58:55 crc kubenswrapper[4843]: I0318 12:58:55.391989 4843 scope.go:117] "RemoveContainer" containerID="eaadaf6ba4bd587975c91c47dc60b92dbc7bb4522d0b7255f1b1a231ed634ae7"
Mar 18 12:58:55 crc kubenswrapper[4843]: I0318 12:58:55.448917 4843 scope.go:117] "RemoveContainer" containerID="91d730c0fe70aed035d57b7ab0f508a6e27f7929e655fc291d43302000dba8a2"
Mar 18 12:58:55 crc kubenswrapper[4843]: E0318 12:58:55.449780 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91d730c0fe70aed035d57b7ab0f508a6e27f7929e655fc291d43302000dba8a2\": container with ID starting with 91d730c0fe70aed035d57b7ab0f508a6e27f7929e655fc291d43302000dba8a2 not found: ID does not exist" containerID="91d730c0fe70aed035d57b7ab0f508a6e27f7929e655fc291d43302000dba8a2"
Mar 18 12:58:55 crc kubenswrapper[4843]: I0318 12:58:55.449821 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91d730c0fe70aed035d57b7ab0f508a6e27f7929e655fc291d43302000dba8a2"} err="failed to get container status \"91d730c0fe70aed035d57b7ab0f508a6e27f7929e655fc291d43302000dba8a2\": rpc error: code = NotFound desc = could not find container \"91d730c0fe70aed035d57b7ab0f508a6e27f7929e655fc291d43302000dba8a2\": container with ID starting with 91d730c0fe70aed035d57b7ab0f508a6e27f7929e655fc291d43302000dba8a2 not found: ID does not exist"
Mar 18 12:58:55 crc kubenswrapper[4843]: I0318 12:58:55.449851 4843 scope.go:117] "RemoveContainer" containerID="82f839a84f12c0ce658c69e214e76f6d1c6db1fcd1fc153bfebc6862bf3cddc9"
Mar 18 12:58:55 crc kubenswrapper[4843]: E0318 12:58:55.450400 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82f839a84f12c0ce658c69e214e76f6d1c6db1fcd1fc153bfebc6862bf3cddc9\": container with ID starting with 82f839a84f12c0ce658c69e214e76f6d1c6db1fcd1fc153bfebc6862bf3cddc9 not found: ID does not exist" containerID="82f839a84f12c0ce658c69e214e76f6d1c6db1fcd1fc153bfebc6862bf3cddc9"
Mar 18 12:58:55 crc kubenswrapper[4843]: I0318 12:58:55.450447 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f839a84f12c0ce658c69e214e76f6d1c6db1fcd1fc153bfebc6862bf3cddc9"} err="failed to get container status \"82f839a84f12c0ce658c69e214e76f6d1c6db1fcd1fc153bfebc6862bf3cddc9\": rpc error: code = NotFound desc = could not find container \"82f839a84f12c0ce658c69e214e76f6d1c6db1fcd1fc153bfebc6862bf3cddc9\": container with ID starting with 82f839a84f12c0ce658c69e214e76f6d1c6db1fcd1fc153bfebc6862bf3cddc9 not found: ID does not exist"
Mar 18 12:58:55 crc kubenswrapper[4843]: I0318 12:58:55.450475 4843 scope.go:117] "RemoveContainer" containerID="eaadaf6ba4bd587975c91c47dc60b92dbc7bb4522d0b7255f1b1a231ed634ae7"
Mar 18 12:58:55 crc kubenswrapper[4843]: E0318 12:58:55.450847 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaadaf6ba4bd587975c91c47dc60b92dbc7bb4522d0b7255f1b1a231ed634ae7\": container with ID starting with eaadaf6ba4bd587975c91c47dc60b92dbc7bb4522d0b7255f1b1a231ed634ae7 not found: ID does not exist" containerID="eaadaf6ba4bd587975c91c47dc60b92dbc7bb4522d0b7255f1b1a231ed634ae7"
Mar 18 12:58:55 crc kubenswrapper[4843]: I0318 12:58:55.450904 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaadaf6ba4bd587975c91c47dc60b92dbc7bb4522d0b7255f1b1a231ed634ae7"} err="failed to get container status \"eaadaf6ba4bd587975c91c47dc60b92dbc7bb4522d0b7255f1b1a231ed634ae7\": rpc error: code = NotFound desc = could not find container \"eaadaf6ba4bd587975c91c47dc60b92dbc7bb4522d0b7255f1b1a231ed634ae7\": container with ID starting with eaadaf6ba4bd587975c91c47dc60b92dbc7bb4522d0b7255f1b1a231ed634ae7 not found: ID does not exist"
Mar 18 12:58:56 crc kubenswrapper[4843]: I0318 12:58:56.999392 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9d05f5c-6230-4c04-8e29-802dbb1eddd4" path="/var/lib/kubelet/pods/c9d05f5c-6230-4c04-8e29-802dbb1eddd4/volumes"
Mar 18 12:59:05 crc kubenswrapper[4843]: I0318 12:59:05.610798 4843 scope.go:117] "RemoveContainer" containerID="2be2299fce2da948e3385cbc0877e14046b370b7491ac52cc5bc156998f560c0"
Mar 18 12:59:20 crc kubenswrapper[4843]: I0318 12:59:20.034559 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:59:20 crc kubenswrapper[4843]: I0318 12:59:20.035041 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:59:50 crc kubenswrapper[4843]: I0318 12:59:50.035110 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:59:50 crc kubenswrapper[4843]: I0318 12:59:50.035817 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.242634 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v"]
Mar 18 13:00:00 crc kubenswrapper[4843]: E0318 13:00:00.243983 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d05f5c-6230-4c04-8e29-802dbb1eddd4" containerName="extract-content"
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.244003 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d05f5c-6230-4c04-8e29-802dbb1eddd4" containerName="extract-content"
Mar 18 13:00:00 crc kubenswrapper[4843]: E0318 13:00:00.244030 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d05f5c-6230-4c04-8e29-802dbb1eddd4" containerName="extract-utilities"
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.244043 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d05f5c-6230-4c04-8e29-802dbb1eddd4" containerName="extract-utilities"
Mar 18 13:00:00 crc kubenswrapper[4843]: E0318 13:00:00.244070 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e7a8b0-08df-4140-94f7-135db9497789" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.244080 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e7a8b0-08df-4140-94f7-135db9497789" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 18 13:00:00 crc kubenswrapper[4843]: E0318 13:00:00.244091 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d05f5c-6230-4c04-8e29-802dbb1eddd4" containerName="registry-server"
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.244098 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d05f5c-6230-4c04-8e29-802dbb1eddd4" containerName="registry-server"
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.244406 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e7a8b0-08df-4140-94f7-135db9497789" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.244435 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9d05f5c-6230-4c04-8e29-802dbb1eddd4" containerName="registry-server"
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.245466 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v"
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.247952 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.248335 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.251608 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563980-vmzsh"]
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.253299 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563980-vmzsh"
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.255356 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk"
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.255519 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.265782 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v"]
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.265862 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563980-vmzsh"]
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.268153 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.738953 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cdc91d39-cf27-435d-90c1-d596d09c4b8f-secret-volume\") pod \"collect-profiles-29563980-k245v\" (UID: \"cdc91d39-cf27-435d-90c1-d596d09c4b8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v"
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.739188 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cdc91d39-cf27-435d-90c1-d596d09c4b8f-config-volume\") pod \"collect-profiles-29563980-k245v\" (UID: \"cdc91d39-cf27-435d-90c1-d596d09c4b8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v"
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.739301 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2c6s\" (UniqueName: \"kubernetes.io/projected/cdc91d39-cf27-435d-90c1-d596d09c4b8f-kube-api-access-b2c6s\") pod \"collect-profiles-29563980-k245v\" (UID: \"cdc91d39-cf27-435d-90c1-d596d09c4b8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v"
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.840602 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cdc91d39-cf27-435d-90c1-d596d09c4b8f-secret-volume\") pod \"collect-profiles-29563980-k245v\" (UID: \"cdc91d39-cf27-435d-90c1-d596d09c4b8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v"
Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.840783 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cdc91d39-cf27-435d-90c1-d596d09c4b8f-config-volume\") pod \"collect-profiles-29563980-k245v\" (UID: \"cdc91d39-cf27-435d-90c1-d596d09c4b8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v"
Mar 18 13:00:00 crc
kubenswrapper[4843]: I0318 13:00:00.840823 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx2vs\" (UniqueName: \"kubernetes.io/projected/1a766d2c-966a-4feb-9add-d31158c28c32-kube-api-access-cx2vs\") pod \"auto-csr-approver-29563980-vmzsh\" (UID: \"1a766d2c-966a-4feb-9add-d31158c28c32\") " pod="openshift-infra/auto-csr-approver-29563980-vmzsh" Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.840883 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2c6s\" (UniqueName: \"kubernetes.io/projected/cdc91d39-cf27-435d-90c1-d596d09c4b8f-kube-api-access-b2c6s\") pod \"collect-profiles-29563980-k245v\" (UID: \"cdc91d39-cf27-435d-90c1-d596d09c4b8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v" Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.844508 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cdc91d39-cf27-435d-90c1-d596d09c4b8f-config-volume\") pod \"collect-profiles-29563980-k245v\" (UID: \"cdc91d39-cf27-435d-90c1-d596d09c4b8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v" Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.847636 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cdc91d39-cf27-435d-90c1-d596d09c4b8f-secret-volume\") pod \"collect-profiles-29563980-k245v\" (UID: \"cdc91d39-cf27-435d-90c1-d596d09c4b8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v" Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.859160 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2c6s\" (UniqueName: \"kubernetes.io/projected/cdc91d39-cf27-435d-90c1-d596d09c4b8f-kube-api-access-b2c6s\") pod \"collect-profiles-29563980-k245v\" (UID: 
\"cdc91d39-cf27-435d-90c1-d596d09c4b8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v" Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.944367 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx2vs\" (UniqueName: \"kubernetes.io/projected/1a766d2c-966a-4feb-9add-d31158c28c32-kube-api-access-cx2vs\") pod \"auto-csr-approver-29563980-vmzsh\" (UID: \"1a766d2c-966a-4feb-9add-d31158c28c32\") " pod="openshift-infra/auto-csr-approver-29563980-vmzsh" Mar 18 13:00:00 crc kubenswrapper[4843]: I0318 13:00:00.970304 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx2vs\" (UniqueName: \"kubernetes.io/projected/1a766d2c-966a-4feb-9add-d31158c28c32-kube-api-access-cx2vs\") pod \"auto-csr-approver-29563980-vmzsh\" (UID: \"1a766d2c-966a-4feb-9add-d31158c28c32\") " pod="openshift-infra/auto-csr-approver-29563980-vmzsh" Mar 18 13:00:01 crc kubenswrapper[4843]: I0318 13:00:01.092883 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v" Mar 18 13:00:01 crc kubenswrapper[4843]: I0318 13:00:01.120039 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563980-vmzsh" Mar 18 13:00:01 crc kubenswrapper[4843]: I0318 13:00:01.783742 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563980-vmzsh"] Mar 18 13:00:01 crc kubenswrapper[4843]: I0318 13:00:01.854111 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v"] Mar 18 13:00:02 crc kubenswrapper[4843]: I0318 13:00:02.125056 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563980-vmzsh" event={"ID":"1a766d2c-966a-4feb-9add-d31158c28c32","Type":"ContainerStarted","Data":"c945e821b83af35249e4a032d5fe2151aeaa81e172737be10037286c5cc7af24"} Mar 18 13:00:02 crc kubenswrapper[4843]: I0318 13:00:02.126125 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v" event={"ID":"cdc91d39-cf27-435d-90c1-d596d09c4b8f","Type":"ContainerStarted","Data":"b1bb08fd9d6df546f4a280a54166ae2de9a79987064a5aa31ceb1bbd86a75fd3"} Mar 18 13:00:03 crc kubenswrapper[4843]: I0318 13:00:03.139390 4843 generic.go:334] "Generic (PLEG): container finished" podID="cdc91d39-cf27-435d-90c1-d596d09c4b8f" containerID="d5716208d8be35b9b5a5aa760a90299337a00584119e900460b934d4951d4f75" exitCode=0 Mar 18 13:00:03 crc kubenswrapper[4843]: I0318 13:00:03.139521 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v" event={"ID":"cdc91d39-cf27-435d-90c1-d596d09c4b8f","Type":"ContainerDied","Data":"d5716208d8be35b9b5a5aa760a90299337a00584119e900460b934d4951d4f75"} Mar 18 13:00:04 crc kubenswrapper[4843]: I0318 13:00:04.456292 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v" Mar 18 13:00:04 crc kubenswrapper[4843]: I0318 13:00:04.527501 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cdc91d39-cf27-435d-90c1-d596d09c4b8f-config-volume\") pod \"cdc91d39-cf27-435d-90c1-d596d09c4b8f\" (UID: \"cdc91d39-cf27-435d-90c1-d596d09c4b8f\") " Mar 18 13:00:04 crc kubenswrapper[4843]: I0318 13:00:04.527676 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cdc91d39-cf27-435d-90c1-d596d09c4b8f-secret-volume\") pod \"cdc91d39-cf27-435d-90c1-d596d09c4b8f\" (UID: \"cdc91d39-cf27-435d-90c1-d596d09c4b8f\") " Mar 18 13:00:04 crc kubenswrapper[4843]: I0318 13:00:04.527849 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2c6s\" (UniqueName: \"kubernetes.io/projected/cdc91d39-cf27-435d-90c1-d596d09c4b8f-kube-api-access-b2c6s\") pod \"cdc91d39-cf27-435d-90c1-d596d09c4b8f\" (UID: \"cdc91d39-cf27-435d-90c1-d596d09c4b8f\") " Mar 18 13:00:04 crc kubenswrapper[4843]: I0318 13:00:04.528588 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdc91d39-cf27-435d-90c1-d596d09c4b8f-config-volume" (OuterVolumeSpecName: "config-volume") pod "cdc91d39-cf27-435d-90c1-d596d09c4b8f" (UID: "cdc91d39-cf27-435d-90c1-d596d09c4b8f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:00:04 crc kubenswrapper[4843]: I0318 13:00:04.534375 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdc91d39-cf27-435d-90c1-d596d09c4b8f-kube-api-access-b2c6s" (OuterVolumeSpecName: "kube-api-access-b2c6s") pod "cdc91d39-cf27-435d-90c1-d596d09c4b8f" (UID: "cdc91d39-cf27-435d-90c1-d596d09c4b8f"). 
InnerVolumeSpecName "kube-api-access-b2c6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:00:04 crc kubenswrapper[4843]: I0318 13:00:04.534459 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdc91d39-cf27-435d-90c1-d596d09c4b8f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cdc91d39-cf27-435d-90c1-d596d09c4b8f" (UID: "cdc91d39-cf27-435d-90c1-d596d09c4b8f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:00:04 crc kubenswrapper[4843]: I0318 13:00:04.630447 4843 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cdc91d39-cf27-435d-90c1-d596d09c4b8f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:00:04 crc kubenswrapper[4843]: I0318 13:00:04.630684 4843 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cdc91d39-cf27-435d-90c1-d596d09c4b8f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:00:04 crc kubenswrapper[4843]: I0318 13:00:04.630751 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2c6s\" (UniqueName: \"kubernetes.io/projected/cdc91d39-cf27-435d-90c1-d596d09c4b8f-kube-api-access-b2c6s\") on node \"crc\" DevicePath \"\"" Mar 18 13:00:05 crc kubenswrapper[4843]: I0318 13:00:05.159371 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v" event={"ID":"cdc91d39-cf27-435d-90c1-d596d09c4b8f","Type":"ContainerDied","Data":"b1bb08fd9d6df546f4a280a54166ae2de9a79987064a5aa31ceb1bbd86a75fd3"} Mar 18 13:00:05 crc kubenswrapper[4843]: I0318 13:00:05.159690 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1bb08fd9d6df546f4a280a54166ae2de9a79987064a5aa31ceb1bbd86a75fd3" Mar 18 13:00:05 crc kubenswrapper[4843]: I0318 13:00:05.159460 4843 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v" Mar 18 13:00:05 crc kubenswrapper[4843]: I0318 13:00:05.544530 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh"] Mar 18 13:00:05 crc kubenswrapper[4843]: I0318 13:00:05.553430 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563935-r8bbh"] Mar 18 13:00:06 crc kubenswrapper[4843]: I0318 13:00:06.171017 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563980-vmzsh" event={"ID":"1a766d2c-966a-4feb-9add-d31158c28c32","Type":"ContainerStarted","Data":"4ec5712a16e945fe8b581d10403ccad1034be78daec015ca0dbf0631a3c341cc"} Mar 18 13:00:06 crc kubenswrapper[4843]: I0318 13:00:06.193251 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563980-vmzsh" podStartSLOduration=2.112298839 podStartE2EDuration="6.193181602s" podCreationTimestamp="2026-03-18 13:00:00 +0000 UTC" firstStartedPulling="2026-03-18 13:00:01.79084819 +0000 UTC m=+3035.506673714" lastFinishedPulling="2026-03-18 13:00:05.871730953 +0000 UTC m=+3039.587556477" observedRunningTime="2026-03-18 13:00:06.185883585 +0000 UTC m=+3039.901709179" watchObservedRunningTime="2026-03-18 13:00:06.193181602 +0000 UTC m=+3039.909007156" Mar 18 13:00:07 crc kubenswrapper[4843]: I0318 13:00:07.004878 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05b4d009-a68d-4287-b15d-b0470d62d486" path="/var/lib/kubelet/pods/05b4d009-a68d-4287-b15d-b0470d62d486/volumes" Mar 18 13:00:07 crc kubenswrapper[4843]: I0318 13:00:07.186489 4843 generic.go:334] "Generic (PLEG): container finished" podID="1a766d2c-966a-4feb-9add-d31158c28c32" containerID="4ec5712a16e945fe8b581d10403ccad1034be78daec015ca0dbf0631a3c341cc" exitCode=0 Mar 18 13:00:07 
crc kubenswrapper[4843]: I0318 13:00:07.186553 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563980-vmzsh" event={"ID":"1a766d2c-966a-4feb-9add-d31158c28c32","Type":"ContainerDied","Data":"4ec5712a16e945fe8b581d10403ccad1034be78daec015ca0dbf0631a3c341cc"} Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.030626 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk"] Mar 18 13:00:08 crc kubenswrapper[4843]: E0318 13:00:08.031207 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc91d39-cf27-435d-90c1-d596d09c4b8f" containerName="collect-profiles" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.031224 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc91d39-cf27-435d-90c1-d596d09c4b8f" containerName="collect-profiles" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.031420 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdc91d39-cf27-435d-90c1-d596d09c4b8f" containerName="collect-profiles" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.032145 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.038884 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.039049 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.039250 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.039794 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.041752 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.063242 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk"] Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.104611 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgkdm\" (UniqueName: \"kubernetes.io/projected/b908e440-89c5-462b-bab0-861853b924d3-kube-api-access-rgkdm\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk\" (UID: \"b908e440-89c5-462b-bab0-861853b924d3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.104771 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk\" (UID: 
\"b908e440-89c5-462b-bab0-861853b924d3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.104803 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk\" (UID: \"b908e440-89c5-462b-bab0-861853b924d3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.104843 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk\" (UID: \"b908e440-89c5-462b-bab0-861853b924d3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.104914 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk\" (UID: \"b908e440-89c5-462b-bab0-861853b924d3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.206370 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk\" (UID: \"b908e440-89c5-462b-bab0-861853b924d3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.206692 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk\" (UID: \"b908e440-89c5-462b-bab0-861853b924d3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.206930 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk\" (UID: \"b908e440-89c5-462b-bab0-861853b924d3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.207186 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk\" (UID: \"b908e440-89c5-462b-bab0-861853b924d3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.207449 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgkdm\" (UniqueName: \"kubernetes.io/projected/b908e440-89c5-462b-bab0-861853b924d3-kube-api-access-rgkdm\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk\" (UID: \"b908e440-89c5-462b-bab0-861853b924d3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.212321 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-ssh-key-openstack-edpm-ipam\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk\" (UID: \"b908e440-89c5-462b-bab0-861853b924d3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.213866 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk\" (UID: \"b908e440-89c5-462b-bab0-861853b924d3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.219487 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk\" (UID: \"b908e440-89c5-462b-bab0-861853b924d3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.226745 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk\" (UID: \"b908e440-89c5-462b-bab0-861853b924d3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.238038 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgkdm\" (UniqueName: \"kubernetes.io/projected/b908e440-89c5-462b-bab0-861853b924d3-kube-api-access-rgkdm\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk\" (UID: \"b908e440-89c5-462b-bab0-861853b924d3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.350316 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.480500 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563980-vmzsh" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.515877 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx2vs\" (UniqueName: \"kubernetes.io/projected/1a766d2c-966a-4feb-9add-d31158c28c32-kube-api-access-cx2vs\") pod \"1a766d2c-966a-4feb-9add-d31158c28c32\" (UID: \"1a766d2c-966a-4feb-9add-d31158c28c32\") " Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.521063 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a766d2c-966a-4feb-9add-d31158c28c32-kube-api-access-cx2vs" (OuterVolumeSpecName: "kube-api-access-cx2vs") pod "1a766d2c-966a-4feb-9add-d31158c28c32" (UID: "1a766d2c-966a-4feb-9add-d31158c28c32"). InnerVolumeSpecName "kube-api-access-cx2vs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.617417 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx2vs\" (UniqueName: \"kubernetes.io/projected/1a766d2c-966a-4feb-9add-d31158c28c32-kube-api-access-cx2vs\") on node \"crc\" DevicePath \"\"" Mar 18 13:00:08 crc kubenswrapper[4843]: I0318 13:00:08.902909 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk"] Mar 18 13:00:10 crc kubenswrapper[4843]: I0318 13:00:10.078624 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-wdzw9" podUID="95fc7104-316c-4699-98b1-3ff394a0c609" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.58:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:00:10 crc kubenswrapper[4843]: I0318 13:00:10.177166 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563980-vmzsh" event={"ID":"1a766d2c-966a-4feb-9add-d31158c28c32","Type":"ContainerDied","Data":"c945e821b83af35249e4a032d5fe2151aeaa81e172737be10037286c5cc7af24"} Mar 18 13:00:10 crc kubenswrapper[4843]: I0318 13:00:10.177270 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c945e821b83af35249e4a032d5fe2151aeaa81e172737be10037286c5cc7af24" Mar 18 13:00:10 crc kubenswrapper[4843]: I0318 13:00:10.177407 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563980-vmzsh" Mar 18 13:00:10 crc kubenswrapper[4843]: I0318 13:00:10.186582 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" event={"ID":"b908e440-89c5-462b-bab0-861853b924d3","Type":"ContainerStarted","Data":"542e442dc92fb5357dec33cba75a3e13899253a7a71a7aad0534e7f2ac6fafb1"} Mar 18 13:00:10 crc kubenswrapper[4843]: I0318 13:00:10.190584 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563974-6wbq2"] Mar 18 13:00:10 crc kubenswrapper[4843]: I0318 13:00:10.202959 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563974-6wbq2"] Mar 18 13:00:11 crc kubenswrapper[4843]: I0318 13:00:11.007228 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="727dae08-13f0-4310-806c-969324946ac6" path="/var/lib/kubelet/pods/727dae08-13f0-4310-806c-969324946ac6/volumes" Mar 18 13:00:12 crc kubenswrapper[4843]: I0318 13:00:12.387571 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" event={"ID":"b908e440-89c5-462b-bab0-861853b924d3","Type":"ContainerStarted","Data":"ec33a40c7f007c20e2a0658769a79ba5f05810fdf8bf6a7b9048a67e6d8cf143"} Mar 18 13:00:12 crc kubenswrapper[4843]: I0318 13:00:12.421534 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" podStartSLOduration=2.353964878 podStartE2EDuration="4.421505797s" podCreationTimestamp="2026-03-18 13:00:08 +0000 UTC" firstStartedPulling="2026-03-18 13:00:08.896581999 +0000 UTC m=+3042.612407523" lastFinishedPulling="2026-03-18 13:00:10.964122918 +0000 UTC m=+3044.679948442" observedRunningTime="2026-03-18 13:00:12.417058071 +0000 UTC m=+3046.132883595" watchObservedRunningTime="2026-03-18 13:00:12.421505797 +0000 UTC m=+3046.137331321" 
Mar 18 13:00:20 crc kubenswrapper[4843]: I0318 13:00:20.035633 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:00:20 crc kubenswrapper[4843]: I0318 13:00:20.036339 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:00:20 crc kubenswrapper[4843]: I0318 13:00:20.036394 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq"
Mar 18 13:00:20 crc kubenswrapper[4843]: I0318 13:00:20.037267 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9"} pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 13:00:20 crc kubenswrapper[4843]: I0318 13:00:20.037363 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" containerID="cri-o://dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9" gracePeriod=600
Mar 18 13:00:20 crc kubenswrapper[4843]: E0318 13:00:20.157223 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:00:20 crc kubenswrapper[4843]: I0318 13:00:20.676610 4843 generic.go:334] "Generic (PLEG): container finished" podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9" exitCode=0
Mar 18 13:00:20 crc kubenswrapper[4843]: I0318 13:00:20.676674 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9"}
Mar 18 13:00:20 crc kubenswrapper[4843]: I0318 13:00:20.676744 4843 scope.go:117] "RemoveContainer" containerID="8039e005f14b8978061b52f5bdd6ef705088f9029a2fdca0fced087b8ccadae4"
Mar 18 13:00:20 crc kubenswrapper[4843]: I0318 13:00:20.677636 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9"
Mar 18 13:00:20 crc kubenswrapper[4843]: E0318 13:00:20.678258 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:00:27 crc kubenswrapper[4843]: I0318 13:00:27.784014 4843 patch_prober.go:28] interesting pod/route-controller-manager-66dcd55869-xczn6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 13:00:27 crc kubenswrapper[4843]: I0318 13:00:27.784541 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-66dcd55869-xczn6" podUID="8e3ee223-0231-40f0-911b-fe651e958b6e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 13:00:31 crc kubenswrapper[4843]: I0318 13:00:31.983532 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9"
Mar 18 13:00:31 crc kubenswrapper[4843]: E0318 13:00:31.984327 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:00:47 crc kubenswrapper[4843]: I0318 13:00:47.002816 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9"
Mar 18 13:00:47 crc kubenswrapper[4843]: E0318 13:00:47.003933 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:01:00 crc kubenswrapper[4843]: I0318 13:01:00.220275 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29563981-r8ktq"]
Mar 18 13:01:00 crc kubenswrapper[4843]: E0318 13:01:00.221263 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a766d2c-966a-4feb-9add-d31158c28c32" containerName="oc"
Mar 18 13:01:00 crc kubenswrapper[4843]: I0318 13:01:00.221277 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a766d2c-966a-4feb-9add-d31158c28c32" containerName="oc"
Mar 18 13:01:00 crc kubenswrapper[4843]: I0318 13:01:00.221478 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a766d2c-966a-4feb-9add-d31158c28c32" containerName="oc"
Mar 18 13:01:00 crc kubenswrapper[4843]: I0318 13:01:00.222189 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29563981-r8ktq"
Mar 18 13:01:00 crc kubenswrapper[4843]: I0318 13:01:00.229259 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29563981-r8ktq"]
Mar 18 13:01:00 crc kubenswrapper[4843]: I0318 13:01:00.316519 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce4c6c09-97bc-4b96-9014-279327ba52b6-config-data\") pod \"keystone-cron-29563981-r8ktq\" (UID: \"ce4c6c09-97bc-4b96-9014-279327ba52b6\") " pod="openstack/keystone-cron-29563981-r8ktq"
Mar 18 13:01:00 crc kubenswrapper[4843]: I0318 13:01:00.316923 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce4c6c09-97bc-4b96-9014-279327ba52b6-fernet-keys\") pod \"keystone-cron-29563981-r8ktq\" (UID: \"ce4c6c09-97bc-4b96-9014-279327ba52b6\") " pod="openstack/keystone-cron-29563981-r8ktq"
Mar 18 13:01:00 crc kubenswrapper[4843]: I0318 13:01:00.317113 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4c6c09-97bc-4b96-9014-279327ba52b6-combined-ca-bundle\") pod \"keystone-cron-29563981-r8ktq\" (UID: \"ce4c6c09-97bc-4b96-9014-279327ba52b6\") " pod="openstack/keystone-cron-29563981-r8ktq"
Mar 18 13:01:00 crc kubenswrapper[4843]: I0318 13:01:00.317214 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c84sf\" (UniqueName: \"kubernetes.io/projected/ce4c6c09-97bc-4b96-9014-279327ba52b6-kube-api-access-c84sf\") pod \"keystone-cron-29563981-r8ktq\" (UID: \"ce4c6c09-97bc-4b96-9014-279327ba52b6\") " pod="openstack/keystone-cron-29563981-r8ktq"
Mar 18 13:01:00 crc kubenswrapper[4843]: I0318 13:01:00.418960 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce4c6c09-97bc-4b96-9014-279327ba52b6-config-data\") pod \"keystone-cron-29563981-r8ktq\" (UID: \"ce4c6c09-97bc-4b96-9014-279327ba52b6\") " pod="openstack/keystone-cron-29563981-r8ktq"
Mar 18 13:01:00 crc kubenswrapper[4843]: I0318 13:01:00.419078 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce4c6c09-97bc-4b96-9014-279327ba52b6-fernet-keys\") pod \"keystone-cron-29563981-r8ktq\" (UID: \"ce4c6c09-97bc-4b96-9014-279327ba52b6\") " pod="openstack/keystone-cron-29563981-r8ktq"
Mar 18 13:01:00 crc kubenswrapper[4843]: I0318 13:01:00.419118 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4c6c09-97bc-4b96-9014-279327ba52b6-combined-ca-bundle\") pod \"keystone-cron-29563981-r8ktq\" (UID: \"ce4c6c09-97bc-4b96-9014-279327ba52b6\") " pod="openstack/keystone-cron-29563981-r8ktq"
Mar 18 13:01:00 crc kubenswrapper[4843]: I0318 13:01:00.419148 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c84sf\" (UniqueName: \"kubernetes.io/projected/ce4c6c09-97bc-4b96-9014-279327ba52b6-kube-api-access-c84sf\") pod \"keystone-cron-29563981-r8ktq\" (UID: \"ce4c6c09-97bc-4b96-9014-279327ba52b6\") " pod="openstack/keystone-cron-29563981-r8ktq"
Mar 18 13:01:00 crc kubenswrapper[4843]: I0318 13:01:00.425555 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4c6c09-97bc-4b96-9014-279327ba52b6-combined-ca-bundle\") pod \"keystone-cron-29563981-r8ktq\" (UID: \"ce4c6c09-97bc-4b96-9014-279327ba52b6\") " pod="openstack/keystone-cron-29563981-r8ktq"
Mar 18 13:01:00 crc kubenswrapper[4843]: I0318 13:01:00.425744 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce4c6c09-97bc-4b96-9014-279327ba52b6-fernet-keys\") pod \"keystone-cron-29563981-r8ktq\" (UID: \"ce4c6c09-97bc-4b96-9014-279327ba52b6\") " pod="openstack/keystone-cron-29563981-r8ktq"
Mar 18 13:01:00 crc kubenswrapper[4843]: I0318 13:01:00.425761 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce4c6c09-97bc-4b96-9014-279327ba52b6-config-data\") pod \"keystone-cron-29563981-r8ktq\" (UID: \"ce4c6c09-97bc-4b96-9014-279327ba52b6\") " pod="openstack/keystone-cron-29563981-r8ktq"
Mar 18 13:01:00 crc kubenswrapper[4843]: I0318 13:01:00.437769 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c84sf\" (UniqueName: \"kubernetes.io/projected/ce4c6c09-97bc-4b96-9014-279327ba52b6-kube-api-access-c84sf\") pod \"keystone-cron-29563981-r8ktq\" (UID: \"ce4c6c09-97bc-4b96-9014-279327ba52b6\") " pod="openstack/keystone-cron-29563981-r8ktq"
Mar 18 13:01:00 crc kubenswrapper[4843]: I0318 13:01:00.553180 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29563981-r8ktq"
Mar 18 13:01:00 crc kubenswrapper[4843]: I0318 13:01:00.985114 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9"
Mar 18 13:01:00 crc kubenswrapper[4843]: E0318 13:01:00.985708 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:01:01 crc kubenswrapper[4843]: I0318 13:01:01.015202 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29563981-r8ktq"]
Mar 18 13:01:01 crc kubenswrapper[4843]: I0318 13:01:01.147567 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563981-r8ktq" event={"ID":"ce4c6c09-97bc-4b96-9014-279327ba52b6","Type":"ContainerStarted","Data":"3bf5040a33cf48bf83fc5101821836e8139d1a310ad620214190df651639b54e"}
Mar 18 13:01:02 crc kubenswrapper[4843]: I0318 13:01:02.166680 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563981-r8ktq" event={"ID":"ce4c6c09-97bc-4b96-9014-279327ba52b6","Type":"ContainerStarted","Data":"4a906a7d83f75e2b7bcb06691b9c78334be13f25395038928df344657b5ed670"}
Mar 18 13:01:02 crc kubenswrapper[4843]: I0318 13:01:02.195537 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29563981-r8ktq" podStartSLOduration=2.195508868 podStartE2EDuration="2.195508868s" podCreationTimestamp="2026-03-18 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:01:02.189980812 +0000 UTC m=+3095.905806346" watchObservedRunningTime="2026-03-18 13:01:02.195508868 +0000 UTC m=+3095.911334402"
Mar 18 13:01:04 crc kubenswrapper[4843]: I0318 13:01:04.201199 4843 generic.go:334] "Generic (PLEG): container finished" podID="ce4c6c09-97bc-4b96-9014-279327ba52b6" containerID="4a906a7d83f75e2b7bcb06691b9c78334be13f25395038928df344657b5ed670" exitCode=0
Mar 18 13:01:04 crc kubenswrapper[4843]: I0318 13:01:04.201295 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563981-r8ktq" event={"ID":"ce4c6c09-97bc-4b96-9014-279327ba52b6","Type":"ContainerDied","Data":"4a906a7d83f75e2b7bcb06691b9c78334be13f25395038928df344657b5ed670"}
Mar 18 13:01:05 crc kubenswrapper[4843]: I0318 13:01:05.661258 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29563981-r8ktq"
Mar 18 13:01:05 crc kubenswrapper[4843]: I0318 13:01:05.744190 4843 scope.go:117] "RemoveContainer" containerID="e4c1b5761d82579e78041f8594d2db667f5aafffb7db8bbfbe0c88e163d75b0a"
Mar 18 13:01:05 crc kubenswrapper[4843]: I0318 13:01:05.746137 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4c6c09-97bc-4b96-9014-279327ba52b6-combined-ca-bundle\") pod \"ce4c6c09-97bc-4b96-9014-279327ba52b6\" (UID: \"ce4c6c09-97bc-4b96-9014-279327ba52b6\") "
Mar 18 13:01:05 crc kubenswrapper[4843]: I0318 13:01:05.746322 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce4c6c09-97bc-4b96-9014-279327ba52b6-config-data\") pod \"ce4c6c09-97bc-4b96-9014-279327ba52b6\" (UID: \"ce4c6c09-97bc-4b96-9014-279327ba52b6\") "
Mar 18 13:01:05 crc kubenswrapper[4843]: I0318 13:01:05.746394 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c84sf\" (UniqueName: \"kubernetes.io/projected/ce4c6c09-97bc-4b96-9014-279327ba52b6-kube-api-access-c84sf\") pod \"ce4c6c09-97bc-4b96-9014-279327ba52b6\" (UID: \"ce4c6c09-97bc-4b96-9014-279327ba52b6\") "
Mar 18 13:01:05 crc kubenswrapper[4843]: I0318 13:01:05.746435 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce4c6c09-97bc-4b96-9014-279327ba52b6-fernet-keys\") pod \"ce4c6c09-97bc-4b96-9014-279327ba52b6\" (UID: \"ce4c6c09-97bc-4b96-9014-279327ba52b6\") "
Mar 18 13:01:05 crc kubenswrapper[4843]: I0318 13:01:05.756685 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce4c6c09-97bc-4b96-9014-279327ba52b6-kube-api-access-c84sf" (OuterVolumeSpecName: "kube-api-access-c84sf") pod "ce4c6c09-97bc-4b96-9014-279327ba52b6" (UID: "ce4c6c09-97bc-4b96-9014-279327ba52b6"). InnerVolumeSpecName "kube-api-access-c84sf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:01:05 crc kubenswrapper[4843]: I0318 13:01:05.763952 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4c6c09-97bc-4b96-9014-279327ba52b6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ce4c6c09-97bc-4b96-9014-279327ba52b6" (UID: "ce4c6c09-97bc-4b96-9014-279327ba52b6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:01:05 crc kubenswrapper[4843]: I0318 13:01:05.785244 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4c6c09-97bc-4b96-9014-279327ba52b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce4c6c09-97bc-4b96-9014-279327ba52b6" (UID: "ce4c6c09-97bc-4b96-9014-279327ba52b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:01:05 crc kubenswrapper[4843]: I0318 13:01:05.826409 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4c6c09-97bc-4b96-9014-279327ba52b6-config-data" (OuterVolumeSpecName: "config-data") pod "ce4c6c09-97bc-4b96-9014-279327ba52b6" (UID: "ce4c6c09-97bc-4b96-9014-279327ba52b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:01:05 crc kubenswrapper[4843]: I0318 13:01:05.849946 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c84sf\" (UniqueName: \"kubernetes.io/projected/ce4c6c09-97bc-4b96-9014-279327ba52b6-kube-api-access-c84sf\") on node \"crc\" DevicePath \"\""
Mar 18 13:01:05 crc kubenswrapper[4843]: I0318 13:01:05.849975 4843 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce4c6c09-97bc-4b96-9014-279327ba52b6-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 18 13:01:05 crc kubenswrapper[4843]: I0318 13:01:05.850023 4843 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4c6c09-97bc-4b96-9014-279327ba52b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 13:01:05 crc kubenswrapper[4843]: I0318 13:01:05.850034 4843 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce4c6c09-97bc-4b96-9014-279327ba52b6-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 13:01:05 crc kubenswrapper[4843]: I0318 13:01:05.935715 4843 scope.go:117] "RemoveContainer" containerID="a2685add51d20aaddef4cde554d7370356bcd8fc29d993b19abb949b40f50eb5"
Mar 18 13:01:06 crc kubenswrapper[4843]: I0318 13:01:06.227363 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563981-r8ktq" event={"ID":"ce4c6c09-97bc-4b96-9014-279327ba52b6","Type":"ContainerDied","Data":"3bf5040a33cf48bf83fc5101821836e8139d1a310ad620214190df651639b54e"}
Mar 18 13:01:06 crc kubenswrapper[4843]: I0318 13:01:06.227409 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bf5040a33cf48bf83fc5101821836e8139d1a310ad620214190df651639b54e"
Mar 18 13:01:06 crc kubenswrapper[4843]: I0318 13:01:06.227502 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29563981-r8ktq"
Mar 18 13:01:15 crc kubenswrapper[4843]: I0318 13:01:15.983821 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9"
Mar 18 13:01:15 crc kubenswrapper[4843]: E0318 13:01:15.984557 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:01:29 crc kubenswrapper[4843]: I0318 13:01:29.984999 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9"
Mar 18 13:01:29 crc kubenswrapper[4843]: E0318 13:01:29.986188 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:01:42 crc kubenswrapper[4843]: I0318 13:01:42.984454 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9"
Mar 18 13:01:42 crc kubenswrapper[4843]: E0318 13:01:42.985277 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:01:54 crc kubenswrapper[4843]: I0318 13:01:54.983751 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9"
Mar 18 13:01:54 crc kubenswrapper[4843]: E0318 13:01:54.985954 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:02:00 crc kubenswrapper[4843]: I0318 13:02:00.163186 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563982-8gn4k"]
Mar 18 13:02:00 crc kubenswrapper[4843]: E0318 13:02:00.165272 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4c6c09-97bc-4b96-9014-279327ba52b6" containerName="keystone-cron"
Mar 18 13:02:00 crc kubenswrapper[4843]: I0318 13:02:00.165406 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4c6c09-97bc-4b96-9014-279327ba52b6" containerName="keystone-cron"
Mar 18 13:02:00 crc kubenswrapper[4843]: I0318 13:02:00.165750 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce4c6c09-97bc-4b96-9014-279327ba52b6" containerName="keystone-cron"
Mar 18 13:02:00 crc kubenswrapper[4843]: I0318 13:02:00.166752 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563982-8gn4k"
Mar 18 13:02:00 crc kubenswrapper[4843]: I0318 13:02:00.169843 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk"
Mar 18 13:02:00 crc kubenswrapper[4843]: I0318 13:02:00.169878 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:02:00 crc kubenswrapper[4843]: I0318 13:02:00.170847 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:02:00 crc kubenswrapper[4843]: I0318 13:02:00.172777 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563982-8gn4k"]
Mar 18 13:02:00 crc kubenswrapper[4843]: I0318 13:02:00.201193 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qgkr\" (UniqueName: \"kubernetes.io/projected/7f12ca1e-dde7-4136-9f77-ef1d5a3ec966-kube-api-access-5qgkr\") pod \"auto-csr-approver-29563982-8gn4k\" (UID: \"7f12ca1e-dde7-4136-9f77-ef1d5a3ec966\") " pod="openshift-infra/auto-csr-approver-29563982-8gn4k"
Mar 18 13:02:00 crc kubenswrapper[4843]: I0318 13:02:00.302410 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qgkr\" (UniqueName: \"kubernetes.io/projected/7f12ca1e-dde7-4136-9f77-ef1d5a3ec966-kube-api-access-5qgkr\") pod \"auto-csr-approver-29563982-8gn4k\" (UID: \"7f12ca1e-dde7-4136-9f77-ef1d5a3ec966\") " pod="openshift-infra/auto-csr-approver-29563982-8gn4k"
Mar 18 13:02:00 crc kubenswrapper[4843]: I0318 13:02:00.332781 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qgkr\" (UniqueName: \"kubernetes.io/projected/7f12ca1e-dde7-4136-9f77-ef1d5a3ec966-kube-api-access-5qgkr\") pod \"auto-csr-approver-29563982-8gn4k\" (UID: \"7f12ca1e-dde7-4136-9f77-ef1d5a3ec966\") " pod="openshift-infra/auto-csr-approver-29563982-8gn4k"
Mar 18 13:02:00 crc kubenswrapper[4843]: I0318 13:02:00.494805 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563982-8gn4k"
Mar 18 13:02:01 crc kubenswrapper[4843]: I0318 13:02:01.062970 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563982-8gn4k"]
Mar 18 13:02:01 crc kubenswrapper[4843]: I0318 13:02:01.857825 4843 generic.go:334] "Generic (PLEG): container finished" podID="b908e440-89c5-462b-bab0-861853b924d3" containerID="ec33a40c7f007c20e2a0658769a79ba5f05810fdf8bf6a7b9048a67e6d8cf143" exitCode=2
Mar 18 13:02:01 crc kubenswrapper[4843]: I0318 13:02:01.857895 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" event={"ID":"b908e440-89c5-462b-bab0-861853b924d3","Type":"ContainerDied","Data":"ec33a40c7f007c20e2a0658769a79ba5f05810fdf8bf6a7b9048a67e6d8cf143"}
Mar 18 13:02:01 crc kubenswrapper[4843]: I0318 13:02:01.859306 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563982-8gn4k" event={"ID":"7f12ca1e-dde7-4136-9f77-ef1d5a3ec966","Type":"ContainerStarted","Data":"20f669b23be4a1b4bb20846f6b6adf5150dd1b2ba65c8806b89609980a236723"}
Mar 18 13:02:02 crc kubenswrapper[4843]: I0318 13:02:02.868697 4843 generic.go:334] "Generic (PLEG): container finished" podID="7f12ca1e-dde7-4136-9f77-ef1d5a3ec966" containerID="e2b559b6cd7dd1f3399c539fe9ba23ca7c5d3328ebaa68c19f5c557fec755510" exitCode=0
Mar 18 13:02:02 crc kubenswrapper[4843]: I0318 13:02:02.868834 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563982-8gn4k" event={"ID":"7f12ca1e-dde7-4136-9f77-ef1d5a3ec966","Type":"ContainerDied","Data":"e2b559b6cd7dd1f3399c539fe9ba23ca7c5d3328ebaa68c19f5c557fec755510"}
Mar 18 13:02:03 crc kubenswrapper[4843]: I0318 13:02:03.292721 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk"
Mar 18 13:02:03 crc kubenswrapper[4843]: I0318 13:02:03.369292 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-inventory\") pod \"b908e440-89c5-462b-bab0-861853b924d3\" (UID: \"b908e440-89c5-462b-bab0-861853b924d3\") "
Mar 18 13:02:03 crc kubenswrapper[4843]: I0318 13:02:03.369397 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-ssh-key-openstack-edpm-ipam\") pod \"b908e440-89c5-462b-bab0-861853b924d3\" (UID: \"b908e440-89c5-462b-bab0-861853b924d3\") "
Mar 18 13:02:03 crc kubenswrapper[4843]: I0318 13:02:03.369505 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-libvirt-combined-ca-bundle\") pod \"b908e440-89c5-462b-bab0-861853b924d3\" (UID: \"b908e440-89c5-462b-bab0-861853b924d3\") "
Mar 18 13:02:03 crc kubenswrapper[4843]: I0318 13:02:03.369551 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgkdm\" (UniqueName: \"kubernetes.io/projected/b908e440-89c5-462b-bab0-861853b924d3-kube-api-access-rgkdm\") pod \"b908e440-89c5-462b-bab0-861853b924d3\" (UID: \"b908e440-89c5-462b-bab0-861853b924d3\") "
Mar 18 13:02:03 crc kubenswrapper[4843]: I0318 13:02:03.370471 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-libvirt-secret-0\") pod \"b908e440-89c5-462b-bab0-861853b924d3\" (UID: \"b908e440-89c5-462b-bab0-861853b924d3\") "
Mar 18 13:02:03 crc kubenswrapper[4843]: I0318 13:02:03.376225 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b908e440-89c5-462b-bab0-861853b924d3-kube-api-access-rgkdm" (OuterVolumeSpecName: "kube-api-access-rgkdm") pod "b908e440-89c5-462b-bab0-861853b924d3" (UID: "b908e440-89c5-462b-bab0-861853b924d3"). InnerVolumeSpecName "kube-api-access-rgkdm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:02:03 crc kubenswrapper[4843]: I0318 13:02:03.376258 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b908e440-89c5-462b-bab0-861853b924d3" (UID: "b908e440-89c5-462b-bab0-861853b924d3"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:02:03 crc kubenswrapper[4843]: I0318 13:02:03.398483 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b908e440-89c5-462b-bab0-861853b924d3" (UID: "b908e440-89c5-462b-bab0-861853b924d3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:02:03 crc kubenswrapper[4843]: I0318 13:02:03.401832 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-inventory" (OuterVolumeSpecName: "inventory") pod "b908e440-89c5-462b-bab0-861853b924d3" (UID: "b908e440-89c5-462b-bab0-861853b924d3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:02:03 crc kubenswrapper[4843]: I0318 13:02:03.408282 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "b908e440-89c5-462b-bab0-861853b924d3" (UID: "b908e440-89c5-462b-bab0-861853b924d3"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:02:03 crc kubenswrapper[4843]: I0318 13:02:03.472742 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 13:02:03 crc kubenswrapper[4843]: I0318 13:02:03.472770 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 13:02:03 crc kubenswrapper[4843]: I0318 13:02:03.472783 4843 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 13:02:03 crc kubenswrapper[4843]: I0318 13:02:03.472792 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgkdm\" (UniqueName: \"kubernetes.io/projected/b908e440-89c5-462b-bab0-861853b924d3-kube-api-access-rgkdm\") on node \"crc\" DevicePath \"\""
Mar 18 13:02:03 crc kubenswrapper[4843]: I0318 13:02:03.472802 4843 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b908e440-89c5-462b-bab0-861853b924d3-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Mar 18 13:02:03 crc kubenswrapper[4843]: I0318 13:02:03.879183 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk"
Mar 18 13:02:03 crc kubenswrapper[4843]: I0318 13:02:03.879288 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk" event={"ID":"b908e440-89c5-462b-bab0-861853b924d3","Type":"ContainerDied","Data":"542e442dc92fb5357dec33cba75a3e13899253a7a71a7aad0534e7f2ac6fafb1"}
Mar 18 13:02:03 crc kubenswrapper[4843]: I0318 13:02:03.879540 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="542e442dc92fb5357dec33cba75a3e13899253a7a71a7aad0534e7f2ac6fafb1"
Mar 18 13:02:04 crc kubenswrapper[4843]: I0318 13:02:04.152879 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563982-8gn4k"
Mar 18 13:02:04 crc kubenswrapper[4843]: I0318 13:02:04.184914 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qgkr\" (UniqueName: \"kubernetes.io/projected/7f12ca1e-dde7-4136-9f77-ef1d5a3ec966-kube-api-access-5qgkr\") pod \"7f12ca1e-dde7-4136-9f77-ef1d5a3ec966\" (UID: \"7f12ca1e-dde7-4136-9f77-ef1d5a3ec966\") "
Mar 18 13:02:04 crc kubenswrapper[4843]: I0318 13:02:04.189986 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f12ca1e-dde7-4136-9f77-ef1d5a3ec966-kube-api-access-5qgkr" (OuterVolumeSpecName: "kube-api-access-5qgkr") pod "7f12ca1e-dde7-4136-9f77-ef1d5a3ec966" (UID: "7f12ca1e-dde7-4136-9f77-ef1d5a3ec966"). InnerVolumeSpecName "kube-api-access-5qgkr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:02:04 crc kubenswrapper[4843]: I0318 13:02:04.289358 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qgkr\" (UniqueName: \"kubernetes.io/projected/7f12ca1e-dde7-4136-9f77-ef1d5a3ec966-kube-api-access-5qgkr\") on node \"crc\" DevicePath \"\""
Mar 18 13:02:04 crc kubenswrapper[4843]: I0318 13:02:04.891733 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563982-8gn4k" event={"ID":"7f12ca1e-dde7-4136-9f77-ef1d5a3ec966","Type":"ContainerDied","Data":"20f669b23be4a1b4bb20846f6b6adf5150dd1b2ba65c8806b89609980a236723"}
Mar 18 13:02:04 crc kubenswrapper[4843]: I0318 13:02:04.892871 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20f669b23be4a1b4bb20846f6b6adf5150dd1b2ba65c8806b89609980a236723"
Mar 18 13:02:04 crc kubenswrapper[4843]: I0318 13:02:04.891970 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563982-8gn4k"
Mar 18 13:02:05 crc kubenswrapper[4843]: I0318 13:02:05.282736 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563976-jp28w"]
Mar 18 13:02:05 crc kubenswrapper[4843]: I0318 13:02:05.295527 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563976-jp28w"]
Mar 18 13:02:06 crc kubenswrapper[4843]: I0318 13:02:06.998634 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54b44e3a-b093-415f-97ce-cd77f27c01cb" path="/var/lib/kubelet/pods/54b44e3a-b093-415f-97ce-cd77f27c01cb/volumes"
Mar 18 13:02:07 crc kubenswrapper[4843]: I0318 13:02:07.987054 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9"
Mar 18 13:02:07 crc kubenswrapper[4843]: E0318 13:02:07.987939 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:02:19 crc kubenswrapper[4843]: I0318 13:02:19.984142 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9"
Mar 18 13:02:19 crc kubenswrapper[4843]: E0318 13:02:19.984902 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:02:33 crc kubenswrapper[4843]: I0318 13:02:33.984712 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9"
Mar 18 13:02:33 crc kubenswrapper[4843]: E0318 13:02:33.986064 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:02:47 crc kubenswrapper[4843]: I0318 13:02:47.984772 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9"
Mar 18 13:02:47 crc kubenswrapper[4843]: E0318 13:02:47.986502 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:03:00 crc kubenswrapper[4843]: I0318 13:03:00.705145 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g9sht"]
Mar 18 13:03:00 crc kubenswrapper[4843]: E0318 13:03:00.706538 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b908e440-89c5-462b-bab0-861853b924d3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 18 13:03:00 crc kubenswrapper[4843]: I0318 13:03:00.706561 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="b908e440-89c5-462b-bab0-861853b924d3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 18 13:03:00 crc kubenswrapper[4843]: E0318 13:03:00.706600 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f12ca1e-dde7-4136-9f77-ef1d5a3ec966" containerName="oc"
Mar 18 13:03:00 crc kubenswrapper[4843]: I0318 13:03:00.706612 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f12ca1e-dde7-4136-9f77-ef1d5a3ec966" containerName="oc"
Mar 18 13:03:00 crc kubenswrapper[4843]: I0318 13:03:00.706958 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f12ca1e-dde7-4136-9f77-ef1d5a3ec966" containerName="oc"
Mar 18 13:03:00 crc kubenswrapper[4843]: I0318 13:03:00.707011 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="b908e440-89c5-462b-bab0-861853b924d3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 18 13:03:00 crc kubenswrapper[4843]: I0318 13:03:00.710608 4843 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-g9sht" Mar 18 13:03:00 crc kubenswrapper[4843]: I0318 13:03:00.716347 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g9sht"] Mar 18 13:03:00 crc kubenswrapper[4843]: I0318 13:03:00.828075 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69grj\" (UniqueName: \"kubernetes.io/projected/95ba0e16-16d8-49c5-9cbc-a79546da3207-kube-api-access-69grj\") pod \"certified-operators-g9sht\" (UID: \"95ba0e16-16d8-49c5-9cbc-a79546da3207\") " pod="openshift-marketplace/certified-operators-g9sht" Mar 18 13:03:00 crc kubenswrapper[4843]: I0318 13:03:00.828404 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ba0e16-16d8-49c5-9cbc-a79546da3207-catalog-content\") pod \"certified-operators-g9sht\" (UID: \"95ba0e16-16d8-49c5-9cbc-a79546da3207\") " pod="openshift-marketplace/certified-operators-g9sht" Mar 18 13:03:00 crc kubenswrapper[4843]: I0318 13:03:00.828671 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ba0e16-16d8-49c5-9cbc-a79546da3207-utilities\") pod \"certified-operators-g9sht\" (UID: \"95ba0e16-16d8-49c5-9cbc-a79546da3207\") " pod="openshift-marketplace/certified-operators-g9sht" Mar 18 13:03:00 crc kubenswrapper[4843]: I0318 13:03:00.930453 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69grj\" (UniqueName: \"kubernetes.io/projected/95ba0e16-16d8-49c5-9cbc-a79546da3207-kube-api-access-69grj\") pod \"certified-operators-g9sht\" (UID: \"95ba0e16-16d8-49c5-9cbc-a79546da3207\") " pod="openshift-marketplace/certified-operators-g9sht" Mar 18 13:03:00 crc kubenswrapper[4843]: I0318 13:03:00.930912 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ba0e16-16d8-49c5-9cbc-a79546da3207-catalog-content\") pod \"certified-operators-g9sht\" (UID: \"95ba0e16-16d8-49c5-9cbc-a79546da3207\") " pod="openshift-marketplace/certified-operators-g9sht" Mar 18 13:03:00 crc kubenswrapper[4843]: I0318 13:03:00.931006 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ba0e16-16d8-49c5-9cbc-a79546da3207-utilities\") pod \"certified-operators-g9sht\" (UID: \"95ba0e16-16d8-49c5-9cbc-a79546da3207\") " pod="openshift-marketplace/certified-operators-g9sht" Mar 18 13:03:00 crc kubenswrapper[4843]: I0318 13:03:00.931515 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ba0e16-16d8-49c5-9cbc-a79546da3207-utilities\") pod \"certified-operators-g9sht\" (UID: \"95ba0e16-16d8-49c5-9cbc-a79546da3207\") " pod="openshift-marketplace/certified-operators-g9sht" Mar 18 13:03:00 crc kubenswrapper[4843]: I0318 13:03:00.931781 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ba0e16-16d8-49c5-9cbc-a79546da3207-catalog-content\") pod \"certified-operators-g9sht\" (UID: \"95ba0e16-16d8-49c5-9cbc-a79546da3207\") " pod="openshift-marketplace/certified-operators-g9sht" Mar 18 13:03:00 crc kubenswrapper[4843]: I0318 13:03:00.949616 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69grj\" (UniqueName: \"kubernetes.io/projected/95ba0e16-16d8-49c5-9cbc-a79546da3207-kube-api-access-69grj\") pod \"certified-operators-g9sht\" (UID: \"95ba0e16-16d8-49c5-9cbc-a79546da3207\") " pod="openshift-marketplace/certified-operators-g9sht" Mar 18 13:03:01 crc kubenswrapper[4843]: I0318 13:03:01.087019 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g9sht" Mar 18 13:03:01 crc kubenswrapper[4843]: I0318 13:03:01.575082 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g9sht"] Mar 18 13:03:01 crc kubenswrapper[4843]: W0318 13:03:01.586877 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95ba0e16_16d8_49c5_9cbc_a79546da3207.slice/crio-f052b7ebc8a595808d1389745e4ae8292faf608851ab6f3e48b57995fff3d295 WatchSource:0}: Error finding container f052b7ebc8a595808d1389745e4ae8292faf608851ab6f3e48b57995fff3d295: Status 404 returned error can't find the container with id f052b7ebc8a595808d1389745e4ae8292faf608851ab6f3e48b57995fff3d295 Mar 18 13:03:01 crc kubenswrapper[4843]: I0318 13:03:01.784289 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9sht" event={"ID":"95ba0e16-16d8-49c5-9cbc-a79546da3207","Type":"ContainerStarted","Data":"f052b7ebc8a595808d1389745e4ae8292faf608851ab6f3e48b57995fff3d295"} Mar 18 13:03:01 crc kubenswrapper[4843]: I0318 13:03:01.984269 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9" Mar 18 13:03:01 crc kubenswrapper[4843]: E0318 13:03:01.984729 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:03:02 crc kubenswrapper[4843]: I0318 13:03:02.804718 4843 generic.go:334] "Generic (PLEG): container finished" podID="95ba0e16-16d8-49c5-9cbc-a79546da3207" 
containerID="074f690c65097c489df55f174003ce9f76ff3db696224ffd3ab11ef41e1f8685" exitCode=0 Mar 18 13:03:02 crc kubenswrapper[4843]: I0318 13:03:02.804811 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9sht" event={"ID":"95ba0e16-16d8-49c5-9cbc-a79546da3207","Type":"ContainerDied","Data":"074f690c65097c489df55f174003ce9f76ff3db696224ffd3ab11ef41e1f8685"} Mar 18 13:03:02 crc kubenswrapper[4843]: I0318 13:03:02.809231 4843 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:03:04 crc kubenswrapper[4843]: I0318 13:03:04.834614 4843 generic.go:334] "Generic (PLEG): container finished" podID="95ba0e16-16d8-49c5-9cbc-a79546da3207" containerID="30e743c72fb3f32981a22239381e9913b0fe03625360048f928c5719d7c4a731" exitCode=0 Mar 18 13:03:04 crc kubenswrapper[4843]: I0318 13:03:04.834735 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9sht" event={"ID":"95ba0e16-16d8-49c5-9cbc-a79546da3207","Type":"ContainerDied","Data":"30e743c72fb3f32981a22239381e9913b0fe03625360048f928c5719d7c4a731"} Mar 18 13:03:05 crc kubenswrapper[4843]: I0318 13:03:05.849212 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9sht" event={"ID":"95ba0e16-16d8-49c5-9cbc-a79546da3207","Type":"ContainerStarted","Data":"b322b83f8796dcc9f553ce003577827babdd56a0366359892e3a21029f39acbb"} Mar 18 13:03:05 crc kubenswrapper[4843]: I0318 13:03:05.875779 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g9sht" podStartSLOduration=3.442157645 podStartE2EDuration="5.875729108s" podCreationTimestamp="2026-03-18 13:03:00 +0000 UTC" firstStartedPulling="2026-03-18 13:03:02.808805897 +0000 UTC m=+3216.524631431" lastFinishedPulling="2026-03-18 13:03:05.24237733 +0000 UTC m=+3218.958202894" observedRunningTime="2026-03-18 13:03:05.874677348 
+0000 UTC m=+3219.590502902" watchObservedRunningTime="2026-03-18 13:03:05.875729108 +0000 UTC m=+3219.591554632" Mar 18 13:03:06 crc kubenswrapper[4843]: I0318 13:03:06.020411 4843 scope.go:117] "RemoveContainer" containerID="3b5ee2d0b4d6d6b98ba504f73c2baa53c919a6086a9e06e55710ca697d11b8d8" Mar 18 13:03:11 crc kubenswrapper[4843]: I0318 13:03:11.087974 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g9sht" Mar 18 13:03:11 crc kubenswrapper[4843]: I0318 13:03:11.088363 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g9sht" Mar 18 13:03:11 crc kubenswrapper[4843]: I0318 13:03:11.150633 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g9sht" Mar 18 13:03:11 crc kubenswrapper[4843]: I0318 13:03:11.962502 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g9sht" Mar 18 13:03:12 crc kubenswrapper[4843]: I0318 13:03:12.877847 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g9sht"] Mar 18 13:03:13 crc kubenswrapper[4843]: I0318 13:03:13.937099 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g9sht" podUID="95ba0e16-16d8-49c5-9cbc-a79546da3207" containerName="registry-server" containerID="cri-o://b322b83f8796dcc9f553ce003577827babdd56a0366359892e3a21029f39acbb" gracePeriod=2 Mar 18 13:03:13 crc kubenswrapper[4843]: I0318 13:03:13.984542 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9" Mar 18 13:03:13 crc kubenswrapper[4843]: E0318 13:03:13.984820 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:03:14 crc kubenswrapper[4843]: I0318 13:03:14.400094 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g9sht" Mar 18 13:03:14 crc kubenswrapper[4843]: I0318 13:03:14.449142 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ba0e16-16d8-49c5-9cbc-a79546da3207-catalog-content\") pod \"95ba0e16-16d8-49c5-9cbc-a79546da3207\" (UID: \"95ba0e16-16d8-49c5-9cbc-a79546da3207\") " Mar 18 13:03:14 crc kubenswrapper[4843]: I0318 13:03:14.449324 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69grj\" (UniqueName: \"kubernetes.io/projected/95ba0e16-16d8-49c5-9cbc-a79546da3207-kube-api-access-69grj\") pod \"95ba0e16-16d8-49c5-9cbc-a79546da3207\" (UID: \"95ba0e16-16d8-49c5-9cbc-a79546da3207\") " Mar 18 13:03:14 crc kubenswrapper[4843]: I0318 13:03:14.449363 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ba0e16-16d8-49c5-9cbc-a79546da3207-utilities\") pod \"95ba0e16-16d8-49c5-9cbc-a79546da3207\" (UID: \"95ba0e16-16d8-49c5-9cbc-a79546da3207\") " Mar 18 13:03:14 crc kubenswrapper[4843]: I0318 13:03:14.450274 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95ba0e16-16d8-49c5-9cbc-a79546da3207-utilities" (OuterVolumeSpecName: "utilities") pod "95ba0e16-16d8-49c5-9cbc-a79546da3207" (UID: "95ba0e16-16d8-49c5-9cbc-a79546da3207"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:03:14 crc kubenswrapper[4843]: I0318 13:03:14.460216 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95ba0e16-16d8-49c5-9cbc-a79546da3207-kube-api-access-69grj" (OuterVolumeSpecName: "kube-api-access-69grj") pod "95ba0e16-16d8-49c5-9cbc-a79546da3207" (UID: "95ba0e16-16d8-49c5-9cbc-a79546da3207"). InnerVolumeSpecName "kube-api-access-69grj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:03:14 crc kubenswrapper[4843]: I0318 13:03:14.524751 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95ba0e16-16d8-49c5-9cbc-a79546da3207-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95ba0e16-16d8-49c5-9cbc-a79546da3207" (UID: "95ba0e16-16d8-49c5-9cbc-a79546da3207"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:03:14 crc kubenswrapper[4843]: I0318 13:03:14.552252 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69grj\" (UniqueName: \"kubernetes.io/projected/95ba0e16-16d8-49c5-9cbc-a79546da3207-kube-api-access-69grj\") on node \"crc\" DevicePath \"\"" Mar 18 13:03:14 crc kubenswrapper[4843]: I0318 13:03:14.552284 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ba0e16-16d8-49c5-9cbc-a79546da3207-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:03:14 crc kubenswrapper[4843]: I0318 13:03:14.552295 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ba0e16-16d8-49c5-9cbc-a79546da3207-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:03:14 crc kubenswrapper[4843]: I0318 13:03:14.950300 4843 generic.go:334] "Generic (PLEG): container finished" podID="95ba0e16-16d8-49c5-9cbc-a79546da3207" 
containerID="b322b83f8796dcc9f553ce003577827babdd56a0366359892e3a21029f39acbb" exitCode=0 Mar 18 13:03:14 crc kubenswrapper[4843]: I0318 13:03:14.950343 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9sht" event={"ID":"95ba0e16-16d8-49c5-9cbc-a79546da3207","Type":"ContainerDied","Data":"b322b83f8796dcc9f553ce003577827babdd56a0366359892e3a21029f39acbb"} Mar 18 13:03:14 crc kubenswrapper[4843]: I0318 13:03:14.950399 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9sht" event={"ID":"95ba0e16-16d8-49c5-9cbc-a79546da3207","Type":"ContainerDied","Data":"f052b7ebc8a595808d1389745e4ae8292faf608851ab6f3e48b57995fff3d295"} Mar 18 13:03:14 crc kubenswrapper[4843]: I0318 13:03:14.950426 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g9sht" Mar 18 13:03:14 crc kubenswrapper[4843]: I0318 13:03:14.950438 4843 scope.go:117] "RemoveContainer" containerID="b322b83f8796dcc9f553ce003577827babdd56a0366359892e3a21029f39acbb" Mar 18 13:03:14 crc kubenswrapper[4843]: I0318 13:03:14.975828 4843 scope.go:117] "RemoveContainer" containerID="30e743c72fb3f32981a22239381e9913b0fe03625360048f928c5719d7c4a731" Mar 18 13:03:15 crc kubenswrapper[4843]: I0318 13:03:15.011431 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g9sht"] Mar 18 13:03:15 crc kubenswrapper[4843]: I0318 13:03:15.012232 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g9sht"] Mar 18 13:03:15 crc kubenswrapper[4843]: I0318 13:03:15.028246 4843 scope.go:117] "RemoveContainer" containerID="074f690c65097c489df55f174003ce9f76ff3db696224ffd3ab11ef41e1f8685" Mar 18 13:03:15 crc kubenswrapper[4843]: I0318 13:03:15.049246 4843 scope.go:117] "RemoveContainer" containerID="b322b83f8796dcc9f553ce003577827babdd56a0366359892e3a21029f39acbb" Mar 18 
13:03:15 crc kubenswrapper[4843]: E0318 13:03:15.049838 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b322b83f8796dcc9f553ce003577827babdd56a0366359892e3a21029f39acbb\": container with ID starting with b322b83f8796dcc9f553ce003577827babdd56a0366359892e3a21029f39acbb not found: ID does not exist" containerID="b322b83f8796dcc9f553ce003577827babdd56a0366359892e3a21029f39acbb" Mar 18 13:03:15 crc kubenswrapper[4843]: I0318 13:03:15.049899 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b322b83f8796dcc9f553ce003577827babdd56a0366359892e3a21029f39acbb"} err="failed to get container status \"b322b83f8796dcc9f553ce003577827babdd56a0366359892e3a21029f39acbb\": rpc error: code = NotFound desc = could not find container \"b322b83f8796dcc9f553ce003577827babdd56a0366359892e3a21029f39acbb\": container with ID starting with b322b83f8796dcc9f553ce003577827babdd56a0366359892e3a21029f39acbb not found: ID does not exist" Mar 18 13:03:15 crc kubenswrapper[4843]: I0318 13:03:15.049937 4843 scope.go:117] "RemoveContainer" containerID="30e743c72fb3f32981a22239381e9913b0fe03625360048f928c5719d7c4a731" Mar 18 13:03:15 crc kubenswrapper[4843]: E0318 13:03:15.050419 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30e743c72fb3f32981a22239381e9913b0fe03625360048f928c5719d7c4a731\": container with ID starting with 30e743c72fb3f32981a22239381e9913b0fe03625360048f928c5719d7c4a731 not found: ID does not exist" containerID="30e743c72fb3f32981a22239381e9913b0fe03625360048f928c5719d7c4a731" Mar 18 13:03:15 crc kubenswrapper[4843]: I0318 13:03:15.050455 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30e743c72fb3f32981a22239381e9913b0fe03625360048f928c5719d7c4a731"} err="failed to get container status 
\"30e743c72fb3f32981a22239381e9913b0fe03625360048f928c5719d7c4a731\": rpc error: code = NotFound desc = could not find container \"30e743c72fb3f32981a22239381e9913b0fe03625360048f928c5719d7c4a731\": container with ID starting with 30e743c72fb3f32981a22239381e9913b0fe03625360048f928c5719d7c4a731 not found: ID does not exist" Mar 18 13:03:15 crc kubenswrapper[4843]: I0318 13:03:15.050473 4843 scope.go:117] "RemoveContainer" containerID="074f690c65097c489df55f174003ce9f76ff3db696224ffd3ab11ef41e1f8685" Mar 18 13:03:15 crc kubenswrapper[4843]: E0318 13:03:15.050800 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"074f690c65097c489df55f174003ce9f76ff3db696224ffd3ab11ef41e1f8685\": container with ID starting with 074f690c65097c489df55f174003ce9f76ff3db696224ffd3ab11ef41e1f8685 not found: ID does not exist" containerID="074f690c65097c489df55f174003ce9f76ff3db696224ffd3ab11ef41e1f8685" Mar 18 13:03:15 crc kubenswrapper[4843]: I0318 13:03:15.050824 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"074f690c65097c489df55f174003ce9f76ff3db696224ffd3ab11ef41e1f8685"} err="failed to get container status \"074f690c65097c489df55f174003ce9f76ff3db696224ffd3ab11ef41e1f8685\": rpc error: code = NotFound desc = could not find container \"074f690c65097c489df55f174003ce9f76ff3db696224ffd3ab11ef41e1f8685\": container with ID starting with 074f690c65097c489df55f174003ce9f76ff3db696224ffd3ab11ef41e1f8685 not found: ID does not exist" Mar 18 13:03:16 crc kubenswrapper[4843]: I0318 13:03:16.997123 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95ba0e16-16d8-49c5-9cbc-a79546da3207" path="/var/lib/kubelet/pods/95ba0e16-16d8-49c5-9cbc-a79546da3207/volumes" Mar 18 13:03:27 crc kubenswrapper[4843]: I0318 13:03:27.002549 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9" Mar 18 
13:03:27 crc kubenswrapper[4843]: E0318 13:03:27.004397 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:03:37 crc kubenswrapper[4843]: I0318 13:03:37.984508 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9" Mar 18 13:03:37 crc kubenswrapper[4843]: E0318 13:03:37.985257 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:03:51 crc kubenswrapper[4843]: I0318 13:03:51.435473 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9" Mar 18 13:03:51 crc kubenswrapper[4843]: E0318 13:03:51.437155 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:04:00 crc kubenswrapper[4843]: I0318 13:04:00.161141 4843 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29563984-875jw"] Mar 18 13:04:00 crc kubenswrapper[4843]: E0318 13:04:00.162429 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ba0e16-16d8-49c5-9cbc-a79546da3207" containerName="extract-utilities" Mar 18 13:04:00 crc kubenswrapper[4843]: I0318 13:04:00.162453 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ba0e16-16d8-49c5-9cbc-a79546da3207" containerName="extract-utilities" Mar 18 13:04:00 crc kubenswrapper[4843]: E0318 13:04:00.162503 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ba0e16-16d8-49c5-9cbc-a79546da3207" containerName="registry-server" Mar 18 13:04:00 crc kubenswrapper[4843]: I0318 13:04:00.162519 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ba0e16-16d8-49c5-9cbc-a79546da3207" containerName="registry-server" Mar 18 13:04:00 crc kubenswrapper[4843]: E0318 13:04:00.162538 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ba0e16-16d8-49c5-9cbc-a79546da3207" containerName="extract-content" Mar 18 13:04:00 crc kubenswrapper[4843]: I0318 13:04:00.162551 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ba0e16-16d8-49c5-9cbc-a79546da3207" containerName="extract-content" Mar 18 13:04:00 crc kubenswrapper[4843]: I0318 13:04:00.162972 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="95ba0e16-16d8-49c5-9cbc-a79546da3207" containerName="registry-server" Mar 18 13:04:00 crc kubenswrapper[4843]: I0318 13:04:00.164140 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563984-875jw" Mar 18 13:04:00 crc kubenswrapper[4843]: I0318 13:04:00.168917 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 13:04:00 crc kubenswrapper[4843]: I0318 13:04:00.169013 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:04:00 crc kubenswrapper[4843]: I0318 13:04:00.168918 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:04:00 crc kubenswrapper[4843]: I0318 13:04:00.175204 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563984-875jw"] Mar 18 13:04:00 crc kubenswrapper[4843]: I0318 13:04:00.310850 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt7nv\" (UniqueName: \"kubernetes.io/projected/1b58fe38-600a-472e-ab5f-0f8b51b3a3a5-kube-api-access-kt7nv\") pod \"auto-csr-approver-29563984-875jw\" (UID: \"1b58fe38-600a-472e-ab5f-0f8b51b3a3a5\") " pod="openshift-infra/auto-csr-approver-29563984-875jw" Mar 18 13:04:00 crc kubenswrapper[4843]: I0318 13:04:00.413084 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt7nv\" (UniqueName: \"kubernetes.io/projected/1b58fe38-600a-472e-ab5f-0f8b51b3a3a5-kube-api-access-kt7nv\") pod \"auto-csr-approver-29563984-875jw\" (UID: \"1b58fe38-600a-472e-ab5f-0f8b51b3a3a5\") " pod="openshift-infra/auto-csr-approver-29563984-875jw" Mar 18 13:04:00 crc kubenswrapper[4843]: I0318 13:04:00.432212 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt7nv\" (UniqueName: \"kubernetes.io/projected/1b58fe38-600a-472e-ab5f-0f8b51b3a3a5-kube-api-access-kt7nv\") pod \"auto-csr-approver-29563984-875jw\" (UID: \"1b58fe38-600a-472e-ab5f-0f8b51b3a3a5\") " 
pod="openshift-infra/auto-csr-approver-29563984-875jw" Mar 18 13:04:00 crc kubenswrapper[4843]: I0318 13:04:00.484773 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563984-875jw" Mar 18 13:04:01 crc kubenswrapper[4843]: I0318 13:04:01.032188 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563984-875jw"] Mar 18 13:04:01 crc kubenswrapper[4843]: I0318 13:04:01.543197 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563984-875jw" event={"ID":"1b58fe38-600a-472e-ab5f-0f8b51b3a3a5","Type":"ContainerStarted","Data":"a5394109f9c5ad53e60da410a26512c924b67ab8627b386cf9e7c1ad40bc64ac"} Mar 18 13:04:01 crc kubenswrapper[4843]: I0318 13:04:01.850867 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vjx2r"] Mar 18 13:04:01 crc kubenswrapper[4843]: I0318 13:04:01.852879 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vjx2r" Mar 18 13:04:01 crc kubenswrapper[4843]: I0318 13:04:01.862365 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vjx2r"] Mar 18 13:04:01 crc kubenswrapper[4843]: I0318 13:04:01.868225 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81a47b5-5098-429c-a2de-f42447e77c23-utilities\") pod \"redhat-operators-vjx2r\" (UID: \"f81a47b5-5098-429c-a2de-f42447e77c23\") " pod="openshift-marketplace/redhat-operators-vjx2r" Mar 18 13:04:01 crc kubenswrapper[4843]: I0318 13:04:01.868300 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81a47b5-5098-429c-a2de-f42447e77c23-catalog-content\") pod \"redhat-operators-vjx2r\" (UID: \"f81a47b5-5098-429c-a2de-f42447e77c23\") " pod="openshift-marketplace/redhat-operators-vjx2r" Mar 18 13:04:01 crc kubenswrapper[4843]: I0318 13:04:01.868400 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpx9n\" (UniqueName: \"kubernetes.io/projected/f81a47b5-5098-429c-a2de-f42447e77c23-kube-api-access-tpx9n\") pod \"redhat-operators-vjx2r\" (UID: \"f81a47b5-5098-429c-a2de-f42447e77c23\") " pod="openshift-marketplace/redhat-operators-vjx2r" Mar 18 13:04:01 crc kubenswrapper[4843]: I0318 13:04:01.970240 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81a47b5-5098-429c-a2de-f42447e77c23-utilities\") pod \"redhat-operators-vjx2r\" (UID: \"f81a47b5-5098-429c-a2de-f42447e77c23\") " pod="openshift-marketplace/redhat-operators-vjx2r" Mar 18 13:04:01 crc kubenswrapper[4843]: I0318 13:04:01.970325 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81a47b5-5098-429c-a2de-f42447e77c23-catalog-content\") pod \"redhat-operators-vjx2r\" (UID: \"f81a47b5-5098-429c-a2de-f42447e77c23\") " pod="openshift-marketplace/redhat-operators-vjx2r" Mar 18 13:04:01 crc kubenswrapper[4843]: I0318 13:04:01.970415 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpx9n\" (UniqueName: \"kubernetes.io/projected/f81a47b5-5098-429c-a2de-f42447e77c23-kube-api-access-tpx9n\") pod \"redhat-operators-vjx2r\" (UID: \"f81a47b5-5098-429c-a2de-f42447e77c23\") " pod="openshift-marketplace/redhat-operators-vjx2r" Mar 18 13:04:01 crc kubenswrapper[4843]: I0318 13:04:01.973390 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81a47b5-5098-429c-a2de-f42447e77c23-utilities\") pod \"redhat-operators-vjx2r\" (UID: \"f81a47b5-5098-429c-a2de-f42447e77c23\") " pod="openshift-marketplace/redhat-operators-vjx2r" Mar 18 13:04:01 crc kubenswrapper[4843]: I0318 13:04:01.980061 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81a47b5-5098-429c-a2de-f42447e77c23-catalog-content\") pod \"redhat-operators-vjx2r\" (UID: \"f81a47b5-5098-429c-a2de-f42447e77c23\") " pod="openshift-marketplace/redhat-operators-vjx2r" Mar 18 13:04:02 crc kubenswrapper[4843]: I0318 13:04:02.000709 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpx9n\" (UniqueName: \"kubernetes.io/projected/f81a47b5-5098-429c-a2de-f42447e77c23-kube-api-access-tpx9n\") pod \"redhat-operators-vjx2r\" (UID: \"f81a47b5-5098-429c-a2de-f42447e77c23\") " pod="openshift-marketplace/redhat-operators-vjx2r" Mar 18 13:04:02 crc kubenswrapper[4843]: I0318 13:04:02.184763 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vjx2r" Mar 18 13:04:02 crc kubenswrapper[4843]: W0318 13:04:02.877403 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf81a47b5_5098_429c_a2de_f42447e77c23.slice/crio-d9033687fe56917440d2a1c0b0f55ce1bb79ec1e0f7c85e6291cff5fe7cef7cf WatchSource:0}: Error finding container d9033687fe56917440d2a1c0b0f55ce1bb79ec1e0f7c85e6291cff5fe7cef7cf: Status 404 returned error can't find the container with id d9033687fe56917440d2a1c0b0f55ce1bb79ec1e0f7c85e6291cff5fe7cef7cf Mar 18 13:04:02 crc kubenswrapper[4843]: I0318 13:04:02.884266 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vjx2r"] Mar 18 13:04:03 crc kubenswrapper[4843]: I0318 13:04:03.563944 4843 generic.go:334] "Generic (PLEG): container finished" podID="f81a47b5-5098-429c-a2de-f42447e77c23" containerID="6cd300be469859aadcef251cff2310e9de76132456472280703a0a1044b4d81b" exitCode=0 Mar 18 13:04:03 crc kubenswrapper[4843]: I0318 13:04:03.564062 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjx2r" event={"ID":"f81a47b5-5098-429c-a2de-f42447e77c23","Type":"ContainerDied","Data":"6cd300be469859aadcef251cff2310e9de76132456472280703a0a1044b4d81b"} Mar 18 13:04:03 crc kubenswrapper[4843]: I0318 13:04:03.564337 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjx2r" event={"ID":"f81a47b5-5098-429c-a2de-f42447e77c23","Type":"ContainerStarted","Data":"d9033687fe56917440d2a1c0b0f55ce1bb79ec1e0f7c85e6291cff5fe7cef7cf"} Mar 18 13:04:03 crc kubenswrapper[4843]: I0318 13:04:03.566843 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563984-875jw" 
event={"ID":"1b58fe38-600a-472e-ab5f-0f8b51b3a3a5","Type":"ContainerStarted","Data":"58d94e500f803a7e36acc706fc5333fe51ab5071fe26244cbd05974dbc96e965"} Mar 18 13:04:03 crc kubenswrapper[4843]: I0318 13:04:03.605095 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563984-875jw" podStartSLOduration=1.379911169 podStartE2EDuration="3.605075264s" podCreationTimestamp="2026-03-18 13:04:00 +0000 UTC" firstStartedPulling="2026-03-18 13:04:01.039283536 +0000 UTC m=+3274.755109100" lastFinishedPulling="2026-03-18 13:04:03.264447671 +0000 UTC m=+3276.980273195" observedRunningTime="2026-03-18 13:04:03.601093231 +0000 UTC m=+3277.316918755" watchObservedRunningTime="2026-03-18 13:04:03.605075264 +0000 UTC m=+3277.320900788" Mar 18 13:04:04 crc kubenswrapper[4843]: I0318 13:04:04.581160 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjx2r" event={"ID":"f81a47b5-5098-429c-a2de-f42447e77c23","Type":"ContainerStarted","Data":"d1f04a88fac1f9bec798c87bbdb16d367a1c933b861ad51a27c900bcb003f4ec"} Mar 18 13:04:04 crc kubenswrapper[4843]: I0318 13:04:04.582918 4843 generic.go:334] "Generic (PLEG): container finished" podID="1b58fe38-600a-472e-ab5f-0f8b51b3a3a5" containerID="58d94e500f803a7e36acc706fc5333fe51ab5071fe26244cbd05974dbc96e965" exitCode=0 Mar 18 13:04:04 crc kubenswrapper[4843]: I0318 13:04:04.583017 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563984-875jw" event={"ID":"1b58fe38-600a-472e-ab5f-0f8b51b3a3a5","Type":"ContainerDied","Data":"58d94e500f803a7e36acc706fc5333fe51ab5071fe26244cbd05974dbc96e965"} Mar 18 13:04:04 crc kubenswrapper[4843]: I0318 13:04:04.984099 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9" Mar 18 13:04:04 crc kubenswrapper[4843]: E0318 13:04:04.984484 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:04:05 crc kubenswrapper[4843]: I0318 13:04:05.915165 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563984-875jw" Mar 18 13:04:06 crc kubenswrapper[4843]: I0318 13:04:06.085544 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt7nv\" (UniqueName: \"kubernetes.io/projected/1b58fe38-600a-472e-ab5f-0f8b51b3a3a5-kube-api-access-kt7nv\") pod \"1b58fe38-600a-472e-ab5f-0f8b51b3a3a5\" (UID: \"1b58fe38-600a-472e-ab5f-0f8b51b3a3a5\") " Mar 18 13:04:06 crc kubenswrapper[4843]: I0318 13:04:06.094838 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b58fe38-600a-472e-ab5f-0f8b51b3a3a5-kube-api-access-kt7nv" (OuterVolumeSpecName: "kube-api-access-kt7nv") pod "1b58fe38-600a-472e-ab5f-0f8b51b3a3a5" (UID: "1b58fe38-600a-472e-ab5f-0f8b51b3a3a5"). InnerVolumeSpecName "kube-api-access-kt7nv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:06 crc kubenswrapper[4843]: I0318 13:04:06.189416 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt7nv\" (UniqueName: \"kubernetes.io/projected/1b58fe38-600a-472e-ab5f-0f8b51b3a3a5-kube-api-access-kt7nv\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:06 crc kubenswrapper[4843]: I0318 13:04:06.602533 4843 generic.go:334] "Generic (PLEG): container finished" podID="f81a47b5-5098-429c-a2de-f42447e77c23" containerID="d1f04a88fac1f9bec798c87bbdb16d367a1c933b861ad51a27c900bcb003f4ec" exitCode=0 Mar 18 13:04:06 crc kubenswrapper[4843]: I0318 13:04:06.602623 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjx2r" event={"ID":"f81a47b5-5098-429c-a2de-f42447e77c23","Type":"ContainerDied","Data":"d1f04a88fac1f9bec798c87bbdb16d367a1c933b861ad51a27c900bcb003f4ec"} Mar 18 13:04:06 crc kubenswrapper[4843]: I0318 13:04:06.604621 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563984-875jw" event={"ID":"1b58fe38-600a-472e-ab5f-0f8b51b3a3a5","Type":"ContainerDied","Data":"a5394109f9c5ad53e60da410a26512c924b67ab8627b386cf9e7c1ad40bc64ac"} Mar 18 13:04:06 crc kubenswrapper[4843]: I0318 13:04:06.604704 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5394109f9c5ad53e60da410a26512c924b67ab8627b386cf9e7c1ad40bc64ac" Mar 18 13:04:06 crc kubenswrapper[4843]: I0318 13:04:06.604666 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563984-875jw" Mar 18 13:04:07 crc kubenswrapper[4843]: I0318 13:04:07.006922 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563978-9cs6p"] Mar 18 13:04:07 crc kubenswrapper[4843]: I0318 13:04:07.006974 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563978-9cs6p"] Mar 18 13:04:07 crc kubenswrapper[4843]: I0318 13:04:07.615598 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjx2r" event={"ID":"f81a47b5-5098-429c-a2de-f42447e77c23","Type":"ContainerStarted","Data":"c9f43f37a76fe3a1fc4fc5d190fdc2bc7c855c88484a2c833d885dee2610d4fd"} Mar 18 13:04:07 crc kubenswrapper[4843]: I0318 13:04:07.637856 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vjx2r" podStartSLOduration=3.15230384 podStartE2EDuration="6.637836702s" podCreationTimestamp="2026-03-18 13:04:01 +0000 UTC" firstStartedPulling="2026-03-18 13:04:03.565965665 +0000 UTC m=+3277.281791189" lastFinishedPulling="2026-03-18 13:04:07.051498527 +0000 UTC m=+3280.767324051" observedRunningTime="2026-03-18 13:04:07.630607818 +0000 UTC m=+3281.346433352" watchObservedRunningTime="2026-03-18 13:04:07.637836702 +0000 UTC m=+3281.353662226" Mar 18 13:04:08 crc kubenswrapper[4843]: I0318 13:04:08.996752 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b692b03e-5e65-4067-ac66-d0a9d5becb13" path="/var/lib/kubelet/pods/b692b03e-5e65-4067-ac66-d0a9d5becb13/volumes" Mar 18 13:04:12 crc kubenswrapper[4843]: I0318 13:04:12.185536 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vjx2r" Mar 18 13:04:12 crc kubenswrapper[4843]: I0318 13:04:12.185886 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-vjx2r" Mar 18 13:04:13 crc kubenswrapper[4843]: I0318 13:04:13.244084 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vjx2r" podUID="f81a47b5-5098-429c-a2de-f42447e77c23" containerName="registry-server" probeResult="failure" output=< Mar 18 13:04:13 crc kubenswrapper[4843]: timeout: failed to connect service ":50051" within 1s Mar 18 13:04:13 crc kubenswrapper[4843]: > Mar 18 13:04:17 crc kubenswrapper[4843]: I0318 13:04:17.015303 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9" Mar 18 13:04:17 crc kubenswrapper[4843]: E0318 13:04:17.018229 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:04:22 crc kubenswrapper[4843]: I0318 13:04:22.269994 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vjx2r" Mar 18 13:04:22 crc kubenswrapper[4843]: I0318 13:04:22.333045 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vjx2r" Mar 18 13:04:22 crc kubenswrapper[4843]: I0318 13:04:22.519173 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vjx2r"] Mar 18 13:04:23 crc kubenswrapper[4843]: I0318 13:04:23.788957 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vjx2r" podUID="f81a47b5-5098-429c-a2de-f42447e77c23" containerName="registry-server" 
containerID="cri-o://c9f43f37a76fe3a1fc4fc5d190fdc2bc7c855c88484a2c833d885dee2610d4fd" gracePeriod=2 Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.251533 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vjx2r" Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.296107 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81a47b5-5098-429c-a2de-f42447e77c23-catalog-content\") pod \"f81a47b5-5098-429c-a2de-f42447e77c23\" (UID: \"f81a47b5-5098-429c-a2de-f42447e77c23\") " Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.296332 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpx9n\" (UniqueName: \"kubernetes.io/projected/f81a47b5-5098-429c-a2de-f42447e77c23-kube-api-access-tpx9n\") pod \"f81a47b5-5098-429c-a2de-f42447e77c23\" (UID: \"f81a47b5-5098-429c-a2de-f42447e77c23\") " Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.297522 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81a47b5-5098-429c-a2de-f42447e77c23-utilities\") pod \"f81a47b5-5098-429c-a2de-f42447e77c23\" (UID: \"f81a47b5-5098-429c-a2de-f42447e77c23\") " Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.298499 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f81a47b5-5098-429c-a2de-f42447e77c23-utilities" (OuterVolumeSpecName: "utilities") pod "f81a47b5-5098-429c-a2de-f42447e77c23" (UID: "f81a47b5-5098-429c-a2de-f42447e77c23"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.299883 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81a47b5-5098-429c-a2de-f42447e77c23-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.303759 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f81a47b5-5098-429c-a2de-f42447e77c23-kube-api-access-tpx9n" (OuterVolumeSpecName: "kube-api-access-tpx9n") pod "f81a47b5-5098-429c-a2de-f42447e77c23" (UID: "f81a47b5-5098-429c-a2de-f42447e77c23"). InnerVolumeSpecName "kube-api-access-tpx9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.402228 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpx9n\" (UniqueName: \"kubernetes.io/projected/f81a47b5-5098-429c-a2de-f42447e77c23-kube-api-access-tpx9n\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.423561 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f81a47b5-5098-429c-a2de-f42447e77c23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f81a47b5-5098-429c-a2de-f42447e77c23" (UID: "f81a47b5-5098-429c-a2de-f42447e77c23"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.503958 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81a47b5-5098-429c-a2de-f42447e77c23-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.805179 4843 generic.go:334] "Generic (PLEG): container finished" podID="f81a47b5-5098-429c-a2de-f42447e77c23" containerID="c9f43f37a76fe3a1fc4fc5d190fdc2bc7c855c88484a2c833d885dee2610d4fd" exitCode=0 Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.805240 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjx2r" event={"ID":"f81a47b5-5098-429c-a2de-f42447e77c23","Type":"ContainerDied","Data":"c9f43f37a76fe3a1fc4fc5d190fdc2bc7c855c88484a2c833d885dee2610d4fd"} Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.805257 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vjx2r" Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.805284 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjx2r" event={"ID":"f81a47b5-5098-429c-a2de-f42447e77c23","Type":"ContainerDied","Data":"d9033687fe56917440d2a1c0b0f55ce1bb79ec1e0f7c85e6291cff5fe7cef7cf"} Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.805307 4843 scope.go:117] "RemoveContainer" containerID="c9f43f37a76fe3a1fc4fc5d190fdc2bc7c855c88484a2c833d885dee2610d4fd" Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.851022 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vjx2r"] Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.854834 4843 scope.go:117] "RemoveContainer" containerID="d1f04a88fac1f9bec798c87bbdb16d367a1c933b861ad51a27c900bcb003f4ec" Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.860964 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vjx2r"] Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.892520 4843 scope.go:117] "RemoveContainer" containerID="6cd300be469859aadcef251cff2310e9de76132456472280703a0a1044b4d81b" Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.937814 4843 scope.go:117] "RemoveContainer" containerID="c9f43f37a76fe3a1fc4fc5d190fdc2bc7c855c88484a2c833d885dee2610d4fd" Mar 18 13:04:24 crc kubenswrapper[4843]: E0318 13:04:24.938743 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9f43f37a76fe3a1fc4fc5d190fdc2bc7c855c88484a2c833d885dee2610d4fd\": container with ID starting with c9f43f37a76fe3a1fc4fc5d190fdc2bc7c855c88484a2c833d885dee2610d4fd not found: ID does not exist" containerID="c9f43f37a76fe3a1fc4fc5d190fdc2bc7c855c88484a2c833d885dee2610d4fd" Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.938789 4843 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9f43f37a76fe3a1fc4fc5d190fdc2bc7c855c88484a2c833d885dee2610d4fd"} err="failed to get container status \"c9f43f37a76fe3a1fc4fc5d190fdc2bc7c855c88484a2c833d885dee2610d4fd\": rpc error: code = NotFound desc = could not find container \"c9f43f37a76fe3a1fc4fc5d190fdc2bc7c855c88484a2c833d885dee2610d4fd\": container with ID starting with c9f43f37a76fe3a1fc4fc5d190fdc2bc7c855c88484a2c833d885dee2610d4fd not found: ID does not exist" Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.938825 4843 scope.go:117] "RemoveContainer" containerID="d1f04a88fac1f9bec798c87bbdb16d367a1c933b861ad51a27c900bcb003f4ec" Mar 18 13:04:24 crc kubenswrapper[4843]: E0318 13:04:24.939506 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1f04a88fac1f9bec798c87bbdb16d367a1c933b861ad51a27c900bcb003f4ec\": container with ID starting with d1f04a88fac1f9bec798c87bbdb16d367a1c933b861ad51a27c900bcb003f4ec not found: ID does not exist" containerID="d1f04a88fac1f9bec798c87bbdb16d367a1c933b861ad51a27c900bcb003f4ec" Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.939559 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1f04a88fac1f9bec798c87bbdb16d367a1c933b861ad51a27c900bcb003f4ec"} err="failed to get container status \"d1f04a88fac1f9bec798c87bbdb16d367a1c933b861ad51a27c900bcb003f4ec\": rpc error: code = NotFound desc = could not find container \"d1f04a88fac1f9bec798c87bbdb16d367a1c933b861ad51a27c900bcb003f4ec\": container with ID starting with d1f04a88fac1f9bec798c87bbdb16d367a1c933b861ad51a27c900bcb003f4ec not found: ID does not exist" Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.939593 4843 scope.go:117] "RemoveContainer" containerID="6cd300be469859aadcef251cff2310e9de76132456472280703a0a1044b4d81b" Mar 18 13:04:24 crc kubenswrapper[4843]: E0318 
13:04:24.940135 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cd300be469859aadcef251cff2310e9de76132456472280703a0a1044b4d81b\": container with ID starting with 6cd300be469859aadcef251cff2310e9de76132456472280703a0a1044b4d81b not found: ID does not exist" containerID="6cd300be469859aadcef251cff2310e9de76132456472280703a0a1044b4d81b" Mar 18 13:04:24 crc kubenswrapper[4843]: I0318 13:04:24.940168 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cd300be469859aadcef251cff2310e9de76132456472280703a0a1044b4d81b"} err="failed to get container status \"6cd300be469859aadcef251cff2310e9de76132456472280703a0a1044b4d81b\": rpc error: code = NotFound desc = could not find container \"6cd300be469859aadcef251cff2310e9de76132456472280703a0a1044b4d81b\": container with ID starting with 6cd300be469859aadcef251cff2310e9de76132456472280703a0a1044b4d81b not found: ID does not exist" Mar 18 13:04:25 crc kubenswrapper[4843]: I0318 13:04:25.000275 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f81a47b5-5098-429c-a2de-f42447e77c23" path="/var/lib/kubelet/pods/f81a47b5-5098-429c-a2de-f42447e77c23/volumes" Mar 18 13:04:29 crc kubenswrapper[4843]: I0318 13:04:29.984406 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9" Mar 18 13:04:29 crc kubenswrapper[4843]: E0318 13:04:29.985178 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.089705 
4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p"] Mar 18 13:04:40 crc kubenswrapper[4843]: E0318 13:04:40.090742 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81a47b5-5098-429c-a2de-f42447e77c23" containerName="registry-server" Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.090758 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81a47b5-5098-429c-a2de-f42447e77c23" containerName="registry-server" Mar 18 13:04:40 crc kubenswrapper[4843]: E0318 13:04:40.090780 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b58fe38-600a-472e-ab5f-0f8b51b3a3a5" containerName="oc" Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.090789 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b58fe38-600a-472e-ab5f-0f8b51b3a3a5" containerName="oc" Mar 18 13:04:40 crc kubenswrapper[4843]: E0318 13:04:40.090814 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81a47b5-5098-429c-a2de-f42447e77c23" containerName="extract-utilities" Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.090822 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81a47b5-5098-429c-a2de-f42447e77c23" containerName="extract-utilities" Mar 18 13:04:40 crc kubenswrapper[4843]: E0318 13:04:40.090849 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81a47b5-5098-429c-a2de-f42447e77c23" containerName="extract-content" Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.090857 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81a47b5-5098-429c-a2de-f42447e77c23" containerName="extract-content" Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.091117 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b58fe38-600a-472e-ab5f-0f8b51b3a3a5" containerName="oc" Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.091136 4843 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f81a47b5-5098-429c-a2de-f42447e77c23" containerName="registry-server" Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.092021 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p" Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.096408 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.097194 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.097276 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.097484 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2" Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.101866 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.109870 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p"] Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.175310 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p" Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.175394 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p" Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.175458 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2h6h\" (UniqueName: \"kubernetes.io/projected/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-kube-api-access-q2h6h\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p" Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.175838 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p" Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.175961 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p" Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.278109 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p\" (UID: 
\"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p"
Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.278307 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p"
Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.279584 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p"
Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.279799 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p"
Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.280561 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2h6h\" (UniqueName: \"kubernetes.io/projected/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-kube-api-access-q2h6h\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p"
Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.286184 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p"
Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.287093 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p"
Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.287423 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p"
Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.287689 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p"
Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.317049 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2h6h\" (UniqueName: \"kubernetes.io/projected/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-kube-api-access-q2h6h\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p"
Mar 18 13:04:40 crc kubenswrapper[4843]: I0318 13:04:40.423553 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p"
Mar 18 13:04:41 crc kubenswrapper[4843]: I0318 13:04:41.003636 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p"]
Mar 18 13:04:42 crc kubenswrapper[4843]: I0318 13:04:42.008708 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p" event={"ID":"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7","Type":"ContainerStarted","Data":"7e0b835458d9f57fb0c9eca0e05aff1b7df5f37d485f7e0a96a29ac1a17090bc"}
Mar 18 13:04:42 crc kubenswrapper[4843]: I0318 13:04:42.008951 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p" event={"ID":"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7","Type":"ContainerStarted","Data":"8b203be3e0946c2dec9a1540e66ab952ce97960279e7fc29593131c028193782"}
Mar 18 13:04:42 crc kubenswrapper[4843]: I0318 13:04:42.046846 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p" podStartSLOduration=1.83635544 podStartE2EDuration="2.046817092s" podCreationTimestamp="2026-03-18 13:04:40 +0000 UTC" firstStartedPulling="2026-03-18 13:04:41.01045226 +0000 UTC m=+3314.726277784" lastFinishedPulling="2026-03-18 13:04:41.220913912 +0000 UTC m=+3314.936739436" observedRunningTime="2026-03-18 13:04:42.036926072 +0000 UTC m=+3315.752751646" watchObservedRunningTime="2026-03-18 13:04:42.046817092 +0000 UTC m=+3315.762642626"
Mar 18 13:04:43 crc kubenswrapper[4843]: I0318 13:04:43.985761 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9"
Mar 18 13:04:43 crc kubenswrapper[4843]: E0318 13:04:43.986422 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:04:55 crc kubenswrapper[4843]: I0318 13:04:55.984906 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9"
Mar 18 13:04:55 crc kubenswrapper[4843]: E0318 13:04:55.986417 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:05:06 crc kubenswrapper[4843]: I0318 13:05:06.144021 4843 scope.go:117] "RemoveContainer" containerID="78b6318933ff8ee3b2494c40057ab6a73b56732944b5d87b198124495c25897f"
Mar 18 13:05:07 crc kubenswrapper[4843]: I0318 13:05:07.983771 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9"
Mar 18 13:05:07 crc kubenswrapper[4843]: E0318 13:05:07.984219 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:05:23 crc kubenswrapper[4843]: I0318 13:05:22.984406 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9"
Mar 18 13:05:24 crc kubenswrapper[4843]: I0318 13:05:24.768798 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"3891d3af7e234a225653322ace5f5d4313d80295e37a05c82ac85445c877fa76"}
Mar 18 13:06:00 crc kubenswrapper[4843]: I0318 13:06:00.173531 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563986-kxvjt"]
Mar 18 13:06:00 crc kubenswrapper[4843]: I0318 13:06:00.175923 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563986-kxvjt"
Mar 18 13:06:00 crc kubenswrapper[4843]: I0318 13:06:00.178379 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk"
Mar 18 13:06:00 crc kubenswrapper[4843]: I0318 13:06:00.178928 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:06:00 crc kubenswrapper[4843]: I0318 13:06:00.179925 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:06:00 crc kubenswrapper[4843]: I0318 13:06:00.182453 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563986-kxvjt"]
Mar 18 13:06:00 crc kubenswrapper[4843]: I0318 13:06:00.227857 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2ltm\" (UniqueName: \"kubernetes.io/projected/d7dc4737-cbc6-4a45-b535-d0d903aac1dc-kube-api-access-d2ltm\") pod \"auto-csr-approver-29563986-kxvjt\" (UID: \"d7dc4737-cbc6-4a45-b535-d0d903aac1dc\") " pod="openshift-infra/auto-csr-approver-29563986-kxvjt"
Mar 18 13:06:00 crc kubenswrapper[4843]: I0318 13:06:00.329779 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2ltm\" (UniqueName: \"kubernetes.io/projected/d7dc4737-cbc6-4a45-b535-d0d903aac1dc-kube-api-access-d2ltm\") pod \"auto-csr-approver-29563986-kxvjt\" (UID: \"d7dc4737-cbc6-4a45-b535-d0d903aac1dc\") " pod="openshift-infra/auto-csr-approver-29563986-kxvjt"
Mar 18 13:06:00 crc kubenswrapper[4843]: I0318 13:06:00.365748 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2ltm\" (UniqueName: \"kubernetes.io/projected/d7dc4737-cbc6-4a45-b535-d0d903aac1dc-kube-api-access-d2ltm\") pod \"auto-csr-approver-29563986-kxvjt\" (UID: \"d7dc4737-cbc6-4a45-b535-d0d903aac1dc\") " pod="openshift-infra/auto-csr-approver-29563986-kxvjt"
Mar 18 13:06:00 crc kubenswrapper[4843]: I0318 13:06:00.507009 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563986-kxvjt"
Mar 18 13:06:00 crc kubenswrapper[4843]: I0318 13:06:00.979638 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563986-kxvjt"]
Mar 18 13:06:01 crc kubenswrapper[4843]: I0318 13:06:01.311600 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563986-kxvjt" event={"ID":"d7dc4737-cbc6-4a45-b535-d0d903aac1dc","Type":"ContainerStarted","Data":"add235b90209887229f22145768458b9eda84c5889e92501157a849234f1dedf"}
Mar 18 13:06:03 crc kubenswrapper[4843]: I0318 13:06:03.335893 4843 generic.go:334] "Generic (PLEG): container finished" podID="d7dc4737-cbc6-4a45-b535-d0d903aac1dc" containerID="f31bf6e605e9ee5e3936ab9288281410525412a395e3a81622bf0f55de605bde" exitCode=0
Mar 18 13:06:03 crc kubenswrapper[4843]: I0318 13:06:03.335994 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563986-kxvjt" event={"ID":"d7dc4737-cbc6-4a45-b535-d0d903aac1dc","Type":"ContainerDied","Data":"f31bf6e605e9ee5e3936ab9288281410525412a395e3a81622bf0f55de605bde"}
Mar 18 13:06:04 crc kubenswrapper[4843]: I0318 13:06:04.776916 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563986-kxvjt"
Mar 18 13:06:04 crc kubenswrapper[4843]: I0318 13:06:04.974023 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2ltm\" (UniqueName: \"kubernetes.io/projected/d7dc4737-cbc6-4a45-b535-d0d903aac1dc-kube-api-access-d2ltm\") pod \"d7dc4737-cbc6-4a45-b535-d0d903aac1dc\" (UID: \"d7dc4737-cbc6-4a45-b535-d0d903aac1dc\") "
Mar 18 13:06:04 crc kubenswrapper[4843]: I0318 13:06:04.980863 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7dc4737-cbc6-4a45-b535-d0d903aac1dc-kube-api-access-d2ltm" (OuterVolumeSpecName: "kube-api-access-d2ltm") pod "d7dc4737-cbc6-4a45-b535-d0d903aac1dc" (UID: "d7dc4737-cbc6-4a45-b535-d0d903aac1dc"). InnerVolumeSpecName "kube-api-access-d2ltm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:06:05 crc kubenswrapper[4843]: I0318 13:06:05.076975 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2ltm\" (UniqueName: \"kubernetes.io/projected/d7dc4737-cbc6-4a45-b535-d0d903aac1dc-kube-api-access-d2ltm\") on node \"crc\" DevicePath \"\""
Mar 18 13:06:05 crc kubenswrapper[4843]: I0318 13:06:05.362080 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563986-kxvjt" event={"ID":"d7dc4737-cbc6-4a45-b535-d0d903aac1dc","Type":"ContainerDied","Data":"add235b90209887229f22145768458b9eda84c5889e92501157a849234f1dedf"}
Mar 18 13:06:05 crc kubenswrapper[4843]: I0318 13:06:05.362136 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="add235b90209887229f22145768458b9eda84c5889e92501157a849234f1dedf"
Mar 18 13:06:05 crc kubenswrapper[4843]: I0318 13:06:05.362178 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563986-kxvjt"
Mar 18 13:06:05 crc kubenswrapper[4843]: I0318 13:06:05.860090 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563980-vmzsh"]
Mar 18 13:06:05 crc kubenswrapper[4843]: I0318 13:06:05.868529 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563980-vmzsh"]
Mar 18 13:06:06 crc kubenswrapper[4843]: I0318 13:06:06.999149 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a766d2c-966a-4feb-9add-d31158c28c32" path="/var/lib/kubelet/pods/1a766d2c-966a-4feb-9add-d31158c28c32/volumes"
Mar 18 13:06:26 crc kubenswrapper[4843]: I0318 13:06:26.716985 4843 generic.go:334] "Generic (PLEG): container finished" podID="d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7" containerID="7e0b835458d9f57fb0c9eca0e05aff1b7df5f37d485f7e0a96a29ac1a17090bc" exitCode=2
Mar 18 13:06:26 crc kubenswrapper[4843]: I0318 13:06:26.717277 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p" event={"ID":"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7","Type":"ContainerDied","Data":"7e0b835458d9f57fb0c9eca0e05aff1b7df5f37d485f7e0a96a29ac1a17090bc"}
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.289974 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p"
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.440417 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2h6h\" (UniqueName: \"kubernetes.io/projected/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-kube-api-access-q2h6h\") pod \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") "
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.440619 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-inventory\") pod \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") "
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.440673 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-libvirt-combined-ca-bundle\") pod \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") "
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.440815 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-libvirt-secret-0\") pod \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") "
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.440914 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-ssh-key-openstack-edpm-ipam\") pod \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") "
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.448739 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-kube-api-access-q2h6h" (OuterVolumeSpecName: "kube-api-access-q2h6h") pod "d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7" (UID: "d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7"). InnerVolumeSpecName "kube-api-access-q2h6h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.449835 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7" (UID: "d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.538558 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7" (UID: "d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.541456 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7" (UID: "d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.542415 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-inventory" (OuterVolumeSpecName: "inventory") pod "d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7" (UID: "d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.542481 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-libvirt-secret-0\") pod \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\" (UID: \"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7\") "
Mar 18 13:06:28 crc kubenswrapper[4843]: W0318 13:06:28.542629 4843 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7/volumes/kubernetes.io~secret/libvirt-secret-0
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.542669 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7" (UID: "d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.543188 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.543212 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2h6h\" (UniqueName: \"kubernetes.io/projected/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-kube-api-access-q2h6h\") on node \"crc\" DevicePath \"\""
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.543225 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.543235 4843 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.543245 4843 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.741054 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p" event={"ID":"d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7","Type":"ContainerDied","Data":"8b203be3e0946c2dec9a1540e66ab952ce97960279e7fc29593131c028193782"}
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.741106 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b203be3e0946c2dec9a1540e66ab952ce97960279e7fc29593131c028193782"
Mar 18 13:06:28 crc kubenswrapper[4843]: I0318 13:06:28.741199 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p"
Mar 18 13:07:06 crc kubenswrapper[4843]: I0318 13:07:06.276455 4843 scope.go:117] "RemoveContainer" containerID="4ec5712a16e945fe8b581d10403ccad1034be78daec015ca0dbf0631a3c341cc"
Mar 18 13:07:50 crc kubenswrapper[4843]: I0318 13:07:50.035041 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:07:50 crc kubenswrapper[4843]: I0318 13:07:50.036306 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:08:00 crc kubenswrapper[4843]: I0318 13:08:00.146073 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563988-rkhbk"]
Mar 18 13:08:00 crc kubenswrapper[4843]: E0318 13:08:00.147250 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7dc4737-cbc6-4a45-b535-d0d903aac1dc" containerName="oc"
Mar 18 13:08:00 crc kubenswrapper[4843]: I0318 13:08:00.147271 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7dc4737-cbc6-4a45-b535-d0d903aac1dc" containerName="oc"
Mar 18 13:08:00 crc kubenswrapper[4843]: E0318 13:08:00.147288 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 18 13:08:00 crc kubenswrapper[4843]: I0318 13:08:00.147297 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 18 13:08:00 crc kubenswrapper[4843]: I0318 13:08:00.147543 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7dc4737-cbc6-4a45-b535-d0d903aac1dc" containerName="oc"
Mar 18 13:08:00 crc kubenswrapper[4843]: I0318 13:08:00.147571 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 18 13:08:00 crc kubenswrapper[4843]: I0318 13:08:00.148464 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563988-rkhbk"
Mar 18 13:08:00 crc kubenswrapper[4843]: I0318 13:08:00.150950 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk"
Mar 18 13:08:00 crc kubenswrapper[4843]: I0318 13:08:00.151032 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:08:00 crc kubenswrapper[4843]: I0318 13:08:00.151776 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:08:00 crc kubenswrapper[4843]: I0318 13:08:00.160488 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563988-rkhbk"]
Mar 18 13:08:00 crc kubenswrapper[4843]: I0318 13:08:00.249984 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn2w8\" (UniqueName: \"kubernetes.io/projected/833ea7f3-18ec-4f3e-9e88-e475af555255-kube-api-access-fn2w8\") pod \"auto-csr-approver-29563988-rkhbk\" (UID: \"833ea7f3-18ec-4f3e-9e88-e475af555255\") " pod="openshift-infra/auto-csr-approver-29563988-rkhbk"
Mar 18 13:08:00 crc kubenswrapper[4843]: I0318 13:08:00.352401 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn2w8\" (UniqueName: \"kubernetes.io/projected/833ea7f3-18ec-4f3e-9e88-e475af555255-kube-api-access-fn2w8\") pod \"auto-csr-approver-29563988-rkhbk\" (UID: \"833ea7f3-18ec-4f3e-9e88-e475af555255\") " pod="openshift-infra/auto-csr-approver-29563988-rkhbk"
Mar 18 13:08:00 crc kubenswrapper[4843]: I0318 13:08:00.379755 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn2w8\" (UniqueName: \"kubernetes.io/projected/833ea7f3-18ec-4f3e-9e88-e475af555255-kube-api-access-fn2w8\") pod \"auto-csr-approver-29563988-rkhbk\" (UID: \"833ea7f3-18ec-4f3e-9e88-e475af555255\") " pod="openshift-infra/auto-csr-approver-29563988-rkhbk"
Mar 18 13:08:00 crc kubenswrapper[4843]: I0318 13:08:00.467317 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563988-rkhbk"
Mar 18 13:08:01 crc kubenswrapper[4843]: I0318 13:08:01.091704 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563988-rkhbk"]
Mar 18 13:08:02 crc kubenswrapper[4843]: I0318 13:08:02.072399 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563988-rkhbk" event={"ID":"833ea7f3-18ec-4f3e-9e88-e475af555255","Type":"ContainerStarted","Data":"e2cd4cb37d278ee03a97fb5e775386e35a6740e43384e3f2b00efff13dab938a"}
Mar 18 13:08:09 crc kubenswrapper[4843]: I0318 13:08:09.137138 4843 generic.go:334] "Generic (PLEG): container finished" podID="833ea7f3-18ec-4f3e-9e88-e475af555255" containerID="3137cac00d9e53a4189e83a0ab9d1875153bd124063390792391eb790f5b78ad" exitCode=0
Mar 18 13:08:09 crc kubenswrapper[4843]: I0318 13:08:09.137507 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563988-rkhbk" event={"ID":"833ea7f3-18ec-4f3e-9e88-e475af555255","Type":"ContainerDied","Data":"3137cac00d9e53a4189e83a0ab9d1875153bd124063390792391eb790f5b78ad"}
Mar 18 13:08:10 crc kubenswrapper[4843]: I0318 13:08:10.473910 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563988-rkhbk"
Mar 18 13:08:10 crc kubenswrapper[4843]: I0318 13:08:10.592674 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn2w8\" (UniqueName: \"kubernetes.io/projected/833ea7f3-18ec-4f3e-9e88-e475af555255-kube-api-access-fn2w8\") pod \"833ea7f3-18ec-4f3e-9e88-e475af555255\" (UID: \"833ea7f3-18ec-4f3e-9e88-e475af555255\") "
Mar 18 13:08:10 crc kubenswrapper[4843]: I0318 13:08:10.598531 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/833ea7f3-18ec-4f3e-9e88-e475af555255-kube-api-access-fn2w8" (OuterVolumeSpecName: "kube-api-access-fn2w8") pod "833ea7f3-18ec-4f3e-9e88-e475af555255" (UID: "833ea7f3-18ec-4f3e-9e88-e475af555255"). InnerVolumeSpecName "kube-api-access-fn2w8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:08:10 crc kubenswrapper[4843]: I0318 13:08:10.695008 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn2w8\" (UniqueName: \"kubernetes.io/projected/833ea7f3-18ec-4f3e-9e88-e475af555255-kube-api-access-fn2w8\") on node \"crc\" DevicePath \"\""
Mar 18 13:08:11 crc kubenswrapper[4843]: I0318 13:08:11.160333 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563988-rkhbk" event={"ID":"833ea7f3-18ec-4f3e-9e88-e475af555255","Type":"ContainerDied","Data":"e2cd4cb37d278ee03a97fb5e775386e35a6740e43384e3f2b00efff13dab938a"}
Mar 18 13:08:11 crc kubenswrapper[4843]: I0318 13:08:11.160400 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563988-rkhbk"
Mar 18 13:08:11 crc kubenswrapper[4843]: I0318 13:08:11.160408 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2cd4cb37d278ee03a97fb5e775386e35a6740e43384e3f2b00efff13dab938a"
Mar 18 13:08:11 crc kubenswrapper[4843]: I0318 13:08:11.546185 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563982-8gn4k"]
Mar 18 13:08:11 crc kubenswrapper[4843]: I0318 13:08:11.557174 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563982-8gn4k"]
Mar 18 13:08:12 crc kubenswrapper[4843]: I0318 13:08:12.995811 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f12ca1e-dde7-4136-9f77-ef1d5a3ec966" path="/var/lib/kubelet/pods/7f12ca1e-dde7-4136-9f77-ef1d5a3ec966/volumes"
Mar 18 13:08:20 crc kubenswrapper[4843]: I0318 13:08:20.035424 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:08:20 crc kubenswrapper[4843]: I0318 13:08:20.036152 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:08:39 crc kubenswrapper[4843]: I0318 13:08:39.402881 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-87nnw"]
Mar 18 13:08:39 crc kubenswrapper[4843]: E0318 13:08:39.404782 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="833ea7f3-18ec-4f3e-9e88-e475af555255" containerName="oc"
Mar 18 13:08:39 crc kubenswrapper[4843]: I0318 13:08:39.404806 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="833ea7f3-18ec-4f3e-9e88-e475af555255" containerName="oc"
Mar 18 13:08:39 crc kubenswrapper[4843]: I0318 13:08:39.405489 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="833ea7f3-18ec-4f3e-9e88-e475af555255" containerName="oc"
Mar 18 13:08:39 crc kubenswrapper[4843]: I0318 13:08:39.409269 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-87nnw"
Mar 18 13:08:39 crc kubenswrapper[4843]: I0318 13:08:39.417516 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-87nnw"]
Mar 18 13:08:39 crc kubenswrapper[4843]: I0318 13:08:39.604603 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab4305ab-666a-4d8c-8d07-3f79fb6bce8c-utilities\") pod \"redhat-marketplace-87nnw\" (UID: \"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c\") " pod="openshift-marketplace/redhat-marketplace-87nnw"
Mar 18 13:08:39 crc kubenswrapper[4843]: I0318 13:08:39.604683 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab4305ab-666a-4d8c-8d07-3f79fb6bce8c-catalog-content\") pod \"redhat-marketplace-87nnw\" (UID: \"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c\") " pod="openshift-marketplace/redhat-marketplace-87nnw"
Mar 18 13:08:39 crc kubenswrapper[4843]: I0318 13:08:39.605256 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj4kh\" (UniqueName: \"kubernetes.io/projected/ab4305ab-666a-4d8c-8d07-3f79fb6bce8c-kube-api-access-vj4kh\") pod \"redhat-marketplace-87nnw\" (UID: \"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c\") " pod="openshift-marketplace/redhat-marketplace-87nnw"
Mar 18 13:08:39 crc kubenswrapper[4843]: I0318 13:08:39.707379 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab4305ab-666a-4d8c-8d07-3f79fb6bce8c-utilities\") pod \"redhat-marketplace-87nnw\" (UID: \"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c\") " pod="openshift-marketplace/redhat-marketplace-87nnw"
Mar 18 13:08:39 crc kubenswrapper[4843]: I0318 13:08:39.707433 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab4305ab-666a-4d8c-8d07-3f79fb6bce8c-catalog-content\") pod \"redhat-marketplace-87nnw\" (UID: \"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c\") " pod="openshift-marketplace/redhat-marketplace-87nnw"
Mar 18 13:08:39 crc kubenswrapper[4843]: I0318 13:08:39.707564 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj4kh\" (UniqueName: \"kubernetes.io/projected/ab4305ab-666a-4d8c-8d07-3f79fb6bce8c-kube-api-access-vj4kh\") pod \"redhat-marketplace-87nnw\" (UID: \"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c\") " pod="openshift-marketplace/redhat-marketplace-87nnw"
Mar 18 13:08:39 crc kubenswrapper[4843]: I0318 13:08:39.708198 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab4305ab-666a-4d8c-8d07-3f79fb6bce8c-utilities\") pod \"redhat-marketplace-87nnw\" (UID: \"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c\") " pod="openshift-marketplace/redhat-marketplace-87nnw"
Mar 18 13:08:39 crc kubenswrapper[4843]: I0318 13:08:39.708254 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab4305ab-666a-4d8c-8d07-3f79fb6bce8c-catalog-content\") pod \"redhat-marketplace-87nnw\" (UID: \"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c\") " pod="openshift-marketplace/redhat-marketplace-87nnw"
Mar 18 13:08:39 crc kubenswrapper[4843]: I0318 13:08:39.740812 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj4kh\" (UniqueName: \"kubernetes.io/projected/ab4305ab-666a-4d8c-8d07-3f79fb6bce8c-kube-api-access-vj4kh\") pod \"redhat-marketplace-87nnw\" (UID: \"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c\") " pod="openshift-marketplace/redhat-marketplace-87nnw"
Mar 18 13:08:39 crc kubenswrapper[4843]: I0318 13:08:39.749585 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-87nnw"
Mar 18 13:08:40 crc kubenswrapper[4843]: I0318 13:08:40.285105 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-87nnw"]
Mar 18 13:08:40 crc kubenswrapper[4843]: I0318 13:08:40.788943 4843 generic.go:334] "Generic (PLEG): container finished" podID="ab4305ab-666a-4d8c-8d07-3f79fb6bce8c" containerID="8f7b7bfb5c08528a392816f0eb26dabcc0c80be1153de0a388e32e38c8911c9a" exitCode=0
Mar 18 13:08:40 crc kubenswrapper[4843]: I0318 13:08:40.788988 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-87nnw" event={"ID":"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c","Type":"ContainerDied","Data":"8f7b7bfb5c08528a392816f0eb26dabcc0c80be1153de0a388e32e38c8911c9a"}
Mar 18 13:08:40 crc kubenswrapper[4843]: I0318 13:08:40.789017 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-87nnw" event={"ID":"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c","Type":"ContainerStarted","Data":"3c89a7a5831810aae1ee5ada0b182b0359f543d70e8f1132cab09e12ea62a057"}
Mar 18 13:08:40 crc kubenswrapper[4843]: I0318 13:08:40.791575 4843 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 13:08:41 crc kubenswrapper[4843]: I0318 13:08:41.800239 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-87nnw" event={"ID":"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c","Type":"ContainerStarted","Data":"55438eba93e0b071123d5d10dba7cca69dfec3b1f6ad60056b3a81dd4f5b46a0"}
Mar 18 13:08:42 crc kubenswrapper[4843]: I0318 13:08:42.811885 4843 generic.go:334] "Generic (PLEG): container finished" podID="ab4305ab-666a-4d8c-8d07-3f79fb6bce8c" containerID="55438eba93e0b071123d5d10dba7cca69dfec3b1f6ad60056b3a81dd4f5b46a0" exitCode=0
Mar 18 13:08:42 crc kubenswrapper[4843]: I0318 13:08:42.812078 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-87nnw" event={"ID":"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c","Type":"ContainerDied","Data":"55438eba93e0b071123d5d10dba7cca69dfec3b1f6ad60056b3a81dd4f5b46a0"}
Mar 18 13:08:43 crc kubenswrapper[4843]: I0318 13:08:43.844207 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-87nnw" event={"ID":"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c","Type":"ContainerStarted","Data":"43c0cdcd91c7236bc7af827b6338303b58b4ed8bf7e3b864f94c97f8b182c75f"}
Mar 18 13:08:43 crc kubenswrapper[4843]: I0318 13:08:43.873680 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-87nnw" podStartSLOduration=2.39159313 podStartE2EDuration="4.87359441s" podCreationTimestamp="2026-03-18 13:08:39 +0000 UTC" firstStartedPulling="2026-03-18 13:08:40.791257061 +0000 UTC m=+3554.507082585" lastFinishedPulling="2026-03-18 13:08:43.273258341 +0000 UTC m=+3556.989083865" observedRunningTime="2026-03-18 13:08:43.862720642 +0000 UTC m=+3557.578546166" watchObservedRunningTime="2026-03-18 13:08:43.87359441 +0000 UTC m=+3557.589419934"
Mar 18 13:08:49 crc kubenswrapper[4843]: I0318 13:08:49.750787 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-87nnw"
Mar 18 13:08:49 crc kubenswrapper[4843]: I0318 13:08:49.751098 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness"
status="" pod="openshift-marketplace/redhat-marketplace-87nnw" Mar 18 13:08:49 crc kubenswrapper[4843]: I0318 13:08:49.802862 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-87nnw" Mar 18 13:08:49 crc kubenswrapper[4843]: I0318 13:08:49.948333 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-87nnw" Mar 18 13:08:50 crc kubenswrapper[4843]: I0318 13:08:50.035423 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:08:50 crc kubenswrapper[4843]: I0318 13:08:50.035512 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:08:50 crc kubenswrapper[4843]: I0318 13:08:50.035564 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 13:08:50 crc kubenswrapper[4843]: I0318 13:08:50.036614 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3891d3af7e234a225653322ace5f5d4313d80295e37a05c82ac85445c877fa76"} pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:08:50 crc kubenswrapper[4843]: I0318 13:08:50.036710 4843 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" containerID="cri-o://3891d3af7e234a225653322ace5f5d4313d80295e37a05c82ac85445c877fa76" gracePeriod=600 Mar 18 13:08:50 crc kubenswrapper[4843]: I0318 13:08:50.039102 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-87nnw"] Mar 18 13:08:50 crc kubenswrapper[4843]: I0318 13:08:50.912970 4843 generic.go:334] "Generic (PLEG): container finished" podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="3891d3af7e234a225653322ace5f5d4313d80295e37a05c82ac85445c877fa76" exitCode=0 Mar 18 13:08:50 crc kubenswrapper[4843]: I0318 13:08:50.913055 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"3891d3af7e234a225653322ace5f5d4313d80295e37a05c82ac85445c877fa76"} Mar 18 13:08:50 crc kubenswrapper[4843]: I0318 13:08:50.913355 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a"} Mar 18 13:08:50 crc kubenswrapper[4843]: I0318 13:08:50.913377 4843 scope.go:117] "RemoveContainer" containerID="dccc0e17d377b5a998985336a054483b9510a3ca41f57ea805f5f984f09d0ab9" Mar 18 13:08:51 crc kubenswrapper[4843]: I0318 13:08:51.934342 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-87nnw" podUID="ab4305ab-666a-4d8c-8d07-3f79fb6bce8c" containerName="registry-server" containerID="cri-o://43c0cdcd91c7236bc7af827b6338303b58b4ed8bf7e3b864f94c97f8b182c75f" gracePeriod=2 Mar 18 13:08:52 crc kubenswrapper[4843]: I0318 13:08:52.960071 4843 generic.go:334] "Generic 
(PLEG): container finished" podID="ab4305ab-666a-4d8c-8d07-3f79fb6bce8c" containerID="43c0cdcd91c7236bc7af827b6338303b58b4ed8bf7e3b864f94c97f8b182c75f" exitCode=0 Mar 18 13:08:52 crc kubenswrapper[4843]: I0318 13:08:52.960164 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-87nnw" event={"ID":"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c","Type":"ContainerDied","Data":"43c0cdcd91c7236bc7af827b6338303b58b4ed8bf7e3b864f94c97f8b182c75f"} Mar 18 13:08:53 crc kubenswrapper[4843]: I0318 13:08:53.069153 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-87nnw" Mar 18 13:08:53 crc kubenswrapper[4843]: I0318 13:08:53.153471 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab4305ab-666a-4d8c-8d07-3f79fb6bce8c-catalog-content\") pod \"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c\" (UID: \"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c\") " Mar 18 13:08:53 crc kubenswrapper[4843]: I0318 13:08:53.153596 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab4305ab-666a-4d8c-8d07-3f79fb6bce8c-utilities\") pod \"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c\" (UID: \"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c\") " Mar 18 13:08:53 crc kubenswrapper[4843]: I0318 13:08:53.153754 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj4kh\" (UniqueName: \"kubernetes.io/projected/ab4305ab-666a-4d8c-8d07-3f79fb6bce8c-kube-api-access-vj4kh\") pod \"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c\" (UID: \"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c\") " Mar 18 13:08:53 crc kubenswrapper[4843]: I0318 13:08:53.155466 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab4305ab-666a-4d8c-8d07-3f79fb6bce8c-utilities" (OuterVolumeSpecName: "utilities") 
pod "ab4305ab-666a-4d8c-8d07-3f79fb6bce8c" (UID: "ab4305ab-666a-4d8c-8d07-3f79fb6bce8c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:08:53 crc kubenswrapper[4843]: I0318 13:08:53.160407 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab4305ab-666a-4d8c-8d07-3f79fb6bce8c-kube-api-access-vj4kh" (OuterVolumeSpecName: "kube-api-access-vj4kh") pod "ab4305ab-666a-4d8c-8d07-3f79fb6bce8c" (UID: "ab4305ab-666a-4d8c-8d07-3f79fb6bce8c"). InnerVolumeSpecName "kube-api-access-vj4kh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:08:53 crc kubenswrapper[4843]: I0318 13:08:53.184299 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab4305ab-666a-4d8c-8d07-3f79fb6bce8c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab4305ab-666a-4d8c-8d07-3f79fb6bce8c" (UID: "ab4305ab-666a-4d8c-8d07-3f79fb6bce8c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:08:53 crc kubenswrapper[4843]: I0318 13:08:53.256325 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab4305ab-666a-4d8c-8d07-3f79fb6bce8c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:08:53 crc kubenswrapper[4843]: I0318 13:08:53.256374 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj4kh\" (UniqueName: \"kubernetes.io/projected/ab4305ab-666a-4d8c-8d07-3f79fb6bce8c-kube-api-access-vj4kh\") on node \"crc\" DevicePath \"\"" Mar 18 13:08:53 crc kubenswrapper[4843]: I0318 13:08:53.256384 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab4305ab-666a-4d8c-8d07-3f79fb6bce8c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:08:53 crc kubenswrapper[4843]: I0318 13:08:53.974014 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-87nnw" event={"ID":"ab4305ab-666a-4d8c-8d07-3f79fb6bce8c","Type":"ContainerDied","Data":"3c89a7a5831810aae1ee5ada0b182b0359f543d70e8f1132cab09e12ea62a057"} Mar 18 13:08:53 crc kubenswrapper[4843]: I0318 13:08:53.974376 4843 scope.go:117] "RemoveContainer" containerID="43c0cdcd91c7236bc7af827b6338303b58b4ed8bf7e3b864f94c97f8b182c75f" Mar 18 13:08:53 crc kubenswrapper[4843]: I0318 13:08:53.974509 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-87nnw" Mar 18 13:08:54 crc kubenswrapper[4843]: I0318 13:08:54.005061 4843 scope.go:117] "RemoveContainer" containerID="55438eba93e0b071123d5d10dba7cca69dfec3b1f6ad60056b3a81dd4f5b46a0" Mar 18 13:08:54 crc kubenswrapper[4843]: I0318 13:08:54.019922 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-87nnw"] Mar 18 13:08:54 crc kubenswrapper[4843]: I0318 13:08:54.030327 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-87nnw"] Mar 18 13:08:54 crc kubenswrapper[4843]: I0318 13:08:54.031453 4843 scope.go:117] "RemoveContainer" containerID="8f7b7bfb5c08528a392816f0eb26dabcc0c80be1153de0a388e32e38c8911c9a" Mar 18 13:08:54 crc kubenswrapper[4843]: I0318 13:08:54.997869 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab4305ab-666a-4d8c-8d07-3f79fb6bce8c" path="/var/lib/kubelet/pods/ab4305ab-666a-4d8c-8d07-3f79fb6bce8c/volumes" Mar 18 13:09:06 crc kubenswrapper[4843]: I0318 13:09:06.385241 4843 scope.go:117] "RemoveContainer" containerID="e2b559b6cd7dd1f3399c539fe9ba23ca7c5d3328ebaa68c19f5c557fec755510" Mar 18 13:10:00 crc kubenswrapper[4843]: I0318 13:10:00.160606 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563990-dscgm"] Mar 18 13:10:00 crc kubenswrapper[4843]: E0318 13:10:00.161452 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab4305ab-666a-4d8c-8d07-3f79fb6bce8c" containerName="extract-content" Mar 18 13:10:00 crc kubenswrapper[4843]: I0318 13:10:00.161466 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab4305ab-666a-4d8c-8d07-3f79fb6bce8c" containerName="extract-content" Mar 18 13:10:00 crc kubenswrapper[4843]: E0318 13:10:00.161490 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab4305ab-666a-4d8c-8d07-3f79fb6bce8c" containerName="extract-utilities" Mar 18 13:10:00 
crc kubenswrapper[4843]: I0318 13:10:00.161496 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab4305ab-666a-4d8c-8d07-3f79fb6bce8c" containerName="extract-utilities" Mar 18 13:10:00 crc kubenswrapper[4843]: E0318 13:10:00.161514 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab4305ab-666a-4d8c-8d07-3f79fb6bce8c" containerName="registry-server" Mar 18 13:10:00 crc kubenswrapper[4843]: I0318 13:10:00.161520 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab4305ab-666a-4d8c-8d07-3f79fb6bce8c" containerName="registry-server" Mar 18 13:10:00 crc kubenswrapper[4843]: I0318 13:10:00.161715 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab4305ab-666a-4d8c-8d07-3f79fb6bce8c" containerName="registry-server" Mar 18 13:10:00 crc kubenswrapper[4843]: I0318 13:10:00.162335 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563990-dscgm" Mar 18 13:10:00 crc kubenswrapper[4843]: I0318 13:10:00.167480 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:10:00 crc kubenswrapper[4843]: I0318 13:10:00.167724 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:10:00 crc kubenswrapper[4843]: I0318 13:10:00.167840 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 13:10:00 crc kubenswrapper[4843]: I0318 13:10:00.176055 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563990-dscgm"] Mar 18 13:10:00 crc kubenswrapper[4843]: I0318 13:10:00.233169 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dpcb\" (UniqueName: \"kubernetes.io/projected/0f1c4e85-39a4-44a3-9e45-09587f0e59cb-kube-api-access-5dpcb\") pod 
\"auto-csr-approver-29563990-dscgm\" (UID: \"0f1c4e85-39a4-44a3-9e45-09587f0e59cb\") " pod="openshift-infra/auto-csr-approver-29563990-dscgm" Mar 18 13:10:00 crc kubenswrapper[4843]: I0318 13:10:00.334593 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dpcb\" (UniqueName: \"kubernetes.io/projected/0f1c4e85-39a4-44a3-9e45-09587f0e59cb-kube-api-access-5dpcb\") pod \"auto-csr-approver-29563990-dscgm\" (UID: \"0f1c4e85-39a4-44a3-9e45-09587f0e59cb\") " pod="openshift-infra/auto-csr-approver-29563990-dscgm" Mar 18 13:10:00 crc kubenswrapper[4843]: I0318 13:10:00.383938 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dpcb\" (UniqueName: \"kubernetes.io/projected/0f1c4e85-39a4-44a3-9e45-09587f0e59cb-kube-api-access-5dpcb\") pod \"auto-csr-approver-29563990-dscgm\" (UID: \"0f1c4e85-39a4-44a3-9e45-09587f0e59cb\") " pod="openshift-infra/auto-csr-approver-29563990-dscgm" Mar 18 13:10:00 crc kubenswrapper[4843]: I0318 13:10:00.485229 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563990-dscgm" Mar 18 13:10:01 crc kubenswrapper[4843]: I0318 13:10:01.087519 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563990-dscgm"] Mar 18 13:10:01 crc kubenswrapper[4843]: I0318 13:10:01.344595 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563990-dscgm" event={"ID":"0f1c4e85-39a4-44a3-9e45-09587f0e59cb","Type":"ContainerStarted","Data":"25e640159b845eae14c207b449fc23220ab5a198539112d4efd41fe5bc56a6d2"} Mar 18 13:10:02 crc kubenswrapper[4843]: I0318 13:10:02.003331 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v7vl4"] Mar 18 13:10:02 crc kubenswrapper[4843]: I0318 13:10:02.005947 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v7vl4" Mar 18 13:10:02 crc kubenswrapper[4843]: I0318 13:10:02.015407 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v7vl4"] Mar 18 13:10:02 crc kubenswrapper[4843]: I0318 13:10:02.155332 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/886fd47f-840d-4f86-bf2c-c5fcea2b05cb-catalog-content\") pod \"community-operators-v7vl4\" (UID: \"886fd47f-840d-4f86-bf2c-c5fcea2b05cb\") " pod="openshift-marketplace/community-operators-v7vl4" Mar 18 13:10:02 crc kubenswrapper[4843]: I0318 13:10:02.155557 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmbct\" (UniqueName: \"kubernetes.io/projected/886fd47f-840d-4f86-bf2c-c5fcea2b05cb-kube-api-access-lmbct\") pod \"community-operators-v7vl4\" (UID: \"886fd47f-840d-4f86-bf2c-c5fcea2b05cb\") " pod="openshift-marketplace/community-operators-v7vl4" Mar 18 13:10:02 crc kubenswrapper[4843]: I0318 13:10:02.155684 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/886fd47f-840d-4f86-bf2c-c5fcea2b05cb-utilities\") pod \"community-operators-v7vl4\" (UID: \"886fd47f-840d-4f86-bf2c-c5fcea2b05cb\") " pod="openshift-marketplace/community-operators-v7vl4" Mar 18 13:10:02 crc kubenswrapper[4843]: I0318 13:10:02.257235 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmbct\" (UniqueName: \"kubernetes.io/projected/886fd47f-840d-4f86-bf2c-c5fcea2b05cb-kube-api-access-lmbct\") pod \"community-operators-v7vl4\" (UID: \"886fd47f-840d-4f86-bf2c-c5fcea2b05cb\") " pod="openshift-marketplace/community-operators-v7vl4" Mar 18 13:10:02 crc kubenswrapper[4843]: I0318 13:10:02.257358 4843 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/886fd47f-840d-4f86-bf2c-c5fcea2b05cb-utilities\") pod \"community-operators-v7vl4\" (UID: \"886fd47f-840d-4f86-bf2c-c5fcea2b05cb\") " pod="openshift-marketplace/community-operators-v7vl4" Mar 18 13:10:02 crc kubenswrapper[4843]: I0318 13:10:02.257475 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/886fd47f-840d-4f86-bf2c-c5fcea2b05cb-catalog-content\") pod \"community-operators-v7vl4\" (UID: \"886fd47f-840d-4f86-bf2c-c5fcea2b05cb\") " pod="openshift-marketplace/community-operators-v7vl4" Mar 18 13:10:02 crc kubenswrapper[4843]: I0318 13:10:02.257952 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/886fd47f-840d-4f86-bf2c-c5fcea2b05cb-catalog-content\") pod \"community-operators-v7vl4\" (UID: \"886fd47f-840d-4f86-bf2c-c5fcea2b05cb\") " pod="openshift-marketplace/community-operators-v7vl4" Mar 18 13:10:02 crc kubenswrapper[4843]: I0318 13:10:02.258331 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/886fd47f-840d-4f86-bf2c-c5fcea2b05cb-utilities\") pod \"community-operators-v7vl4\" (UID: \"886fd47f-840d-4f86-bf2c-c5fcea2b05cb\") " pod="openshift-marketplace/community-operators-v7vl4" Mar 18 13:10:02 crc kubenswrapper[4843]: I0318 13:10:02.282226 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmbct\" (UniqueName: \"kubernetes.io/projected/886fd47f-840d-4f86-bf2c-c5fcea2b05cb-kube-api-access-lmbct\") pod \"community-operators-v7vl4\" (UID: \"886fd47f-840d-4f86-bf2c-c5fcea2b05cb\") " pod="openshift-marketplace/community-operators-v7vl4" Mar 18 13:10:02 crc kubenswrapper[4843]: I0318 13:10:02.358599 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v7vl4" Mar 18 13:10:03 crc kubenswrapper[4843]: W0318 13:10:03.042890 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod886fd47f_840d_4f86_bf2c_c5fcea2b05cb.slice/crio-bda57916b1699eb31020f38b8a55faa9d14a8026ed2fe5039a3052068c25eb48 WatchSource:0}: Error finding container bda57916b1699eb31020f38b8a55faa9d14a8026ed2fe5039a3052068c25eb48: Status 404 returned error can't find the container with id bda57916b1699eb31020f38b8a55faa9d14a8026ed2fe5039a3052068c25eb48 Mar 18 13:10:03 crc kubenswrapper[4843]: I0318 13:10:03.044237 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v7vl4"] Mar 18 13:10:03 crc kubenswrapper[4843]: I0318 13:10:03.367163 4843 generic.go:334] "Generic (PLEG): container finished" podID="886fd47f-840d-4f86-bf2c-c5fcea2b05cb" containerID="2719127a771f3582a8cc9fabcc527bb325f546475e67b935946434da5a9855d1" exitCode=0 Mar 18 13:10:03 crc kubenswrapper[4843]: I0318 13:10:03.367228 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v7vl4" event={"ID":"886fd47f-840d-4f86-bf2c-c5fcea2b05cb","Type":"ContainerDied","Data":"2719127a771f3582a8cc9fabcc527bb325f546475e67b935946434da5a9855d1"} Mar 18 13:10:03 crc kubenswrapper[4843]: I0318 13:10:03.367258 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v7vl4" event={"ID":"886fd47f-840d-4f86-bf2c-c5fcea2b05cb","Type":"ContainerStarted","Data":"bda57916b1699eb31020f38b8a55faa9d14a8026ed2fe5039a3052068c25eb48"} Mar 18 13:10:04 crc kubenswrapper[4843]: I0318 13:10:04.384214 4843 generic.go:334] "Generic (PLEG): container finished" podID="0f1c4e85-39a4-44a3-9e45-09587f0e59cb" containerID="69a7f998a933c896afd8ce0c910ac40bbc98f2fb384b94e90dc7d63939c60962" exitCode=0 Mar 18 13:10:04 crc kubenswrapper[4843]: I0318 
13:10:04.384334 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563990-dscgm" event={"ID":"0f1c4e85-39a4-44a3-9e45-09587f0e59cb","Type":"ContainerDied","Data":"69a7f998a933c896afd8ce0c910ac40bbc98f2fb384b94e90dc7d63939c60962"} Mar 18 13:10:05 crc kubenswrapper[4843]: I0318 13:10:05.399845 4843 generic.go:334] "Generic (PLEG): container finished" podID="886fd47f-840d-4f86-bf2c-c5fcea2b05cb" containerID="264a745a45d09bff2ea6478d80f175ab44b06d1f1adec8983300b6868f6e0bd1" exitCode=0 Mar 18 13:10:05 crc kubenswrapper[4843]: I0318 13:10:05.399927 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v7vl4" event={"ID":"886fd47f-840d-4f86-bf2c-c5fcea2b05cb","Type":"ContainerDied","Data":"264a745a45d09bff2ea6478d80f175ab44b06d1f1adec8983300b6868f6e0bd1"} Mar 18 13:10:05 crc kubenswrapper[4843]: I0318 13:10:05.728307 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563990-dscgm" Mar 18 13:10:05 crc kubenswrapper[4843]: I0318 13:10:05.865078 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dpcb\" (UniqueName: \"kubernetes.io/projected/0f1c4e85-39a4-44a3-9e45-09587f0e59cb-kube-api-access-5dpcb\") pod \"0f1c4e85-39a4-44a3-9e45-09587f0e59cb\" (UID: \"0f1c4e85-39a4-44a3-9e45-09587f0e59cb\") " Mar 18 13:10:05 crc kubenswrapper[4843]: I0318 13:10:05.870635 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f1c4e85-39a4-44a3-9e45-09587f0e59cb-kube-api-access-5dpcb" (OuterVolumeSpecName: "kube-api-access-5dpcb") pod "0f1c4e85-39a4-44a3-9e45-09587f0e59cb" (UID: "0f1c4e85-39a4-44a3-9e45-09587f0e59cb"). InnerVolumeSpecName "kube-api-access-5dpcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:10:05 crc kubenswrapper[4843]: I0318 13:10:05.971096 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dpcb\" (UniqueName: \"kubernetes.io/projected/0f1c4e85-39a4-44a3-9e45-09587f0e59cb-kube-api-access-5dpcb\") on node \"crc\" DevicePath \"\"" Mar 18 13:10:06 crc kubenswrapper[4843]: I0318 13:10:06.409241 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v7vl4" event={"ID":"886fd47f-840d-4f86-bf2c-c5fcea2b05cb","Type":"ContainerStarted","Data":"4655a48fc6145f3e7a4d8f6f31ade4919a7ea1a6536349ee4c29c007b36fa569"} Mar 18 13:10:06 crc kubenswrapper[4843]: I0318 13:10:06.410639 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563990-dscgm" event={"ID":"0f1c4e85-39a4-44a3-9e45-09587f0e59cb","Type":"ContainerDied","Data":"25e640159b845eae14c207b449fc23220ab5a198539112d4efd41fe5bc56a6d2"} Mar 18 13:10:06 crc kubenswrapper[4843]: I0318 13:10:06.410989 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25e640159b845eae14c207b449fc23220ab5a198539112d4efd41fe5bc56a6d2" Mar 18 13:10:06 crc kubenswrapper[4843]: I0318 13:10:06.410746 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563990-dscgm" Mar 18 13:10:06 crc kubenswrapper[4843]: I0318 13:10:06.428322 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v7vl4" podStartSLOduration=2.917151581 podStartE2EDuration="5.428300926s" podCreationTimestamp="2026-03-18 13:10:01 +0000 UTC" firstStartedPulling="2026-03-18 13:10:03.372273803 +0000 UTC m=+3637.088099337" lastFinishedPulling="2026-03-18 13:10:05.883423158 +0000 UTC m=+3639.599248682" observedRunningTime="2026-03-18 13:10:06.426034871 +0000 UTC m=+3640.141860405" watchObservedRunningTime="2026-03-18 13:10:06.428300926 +0000 UTC m=+3640.144126450" Mar 18 13:10:06 crc kubenswrapper[4843]: I0318 13:10:06.802567 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563984-875jw"] Mar 18 13:10:06 crc kubenswrapper[4843]: I0318 13:10:06.810930 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563984-875jw"] Mar 18 13:10:06 crc kubenswrapper[4843]: I0318 13:10:06.996871 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b58fe38-600a-472e-ab5f-0f8b51b3a3a5" path="/var/lib/kubelet/pods/1b58fe38-600a-472e-ab5f-0f8b51b3a3a5/volumes" Mar 18 13:10:12 crc kubenswrapper[4843]: I0318 13:10:12.358913 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v7vl4" Mar 18 13:10:12 crc kubenswrapper[4843]: I0318 13:10:12.359521 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v7vl4" Mar 18 13:10:12 crc kubenswrapper[4843]: I0318 13:10:12.406477 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v7vl4" Mar 18 13:10:12 crc kubenswrapper[4843]: I0318 13:10:12.509766 4843 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v7vl4" Mar 18 13:10:12 crc kubenswrapper[4843]: I0318 13:10:12.657284 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v7vl4"] Mar 18 13:10:14 crc kubenswrapper[4843]: I0318 13:10:14.485467 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v7vl4" podUID="886fd47f-840d-4f86-bf2c-c5fcea2b05cb" containerName="registry-server" containerID="cri-o://4655a48fc6145f3e7a4d8f6f31ade4919a7ea1a6536349ee4c29c007b36fa569" gracePeriod=2 Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.014024 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v7vl4" Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.134758 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/886fd47f-840d-4f86-bf2c-c5fcea2b05cb-catalog-content\") pod \"886fd47f-840d-4f86-bf2c-c5fcea2b05cb\" (UID: \"886fd47f-840d-4f86-bf2c-c5fcea2b05cb\") " Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.134947 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/886fd47f-840d-4f86-bf2c-c5fcea2b05cb-utilities\") pod \"886fd47f-840d-4f86-bf2c-c5fcea2b05cb\" (UID: \"886fd47f-840d-4f86-bf2c-c5fcea2b05cb\") " Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.135001 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmbct\" (UniqueName: \"kubernetes.io/projected/886fd47f-840d-4f86-bf2c-c5fcea2b05cb-kube-api-access-lmbct\") pod \"886fd47f-840d-4f86-bf2c-c5fcea2b05cb\" (UID: \"886fd47f-840d-4f86-bf2c-c5fcea2b05cb\") " Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.135862 4843 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/886fd47f-840d-4f86-bf2c-c5fcea2b05cb-utilities" (OuterVolumeSpecName: "utilities") pod "886fd47f-840d-4f86-bf2c-c5fcea2b05cb" (UID: "886fd47f-840d-4f86-bf2c-c5fcea2b05cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.140971 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/886fd47f-840d-4f86-bf2c-c5fcea2b05cb-kube-api-access-lmbct" (OuterVolumeSpecName: "kube-api-access-lmbct") pod "886fd47f-840d-4f86-bf2c-c5fcea2b05cb" (UID: "886fd47f-840d-4f86-bf2c-c5fcea2b05cb"). InnerVolumeSpecName "kube-api-access-lmbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.191007 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/886fd47f-840d-4f86-bf2c-c5fcea2b05cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "886fd47f-840d-4f86-bf2c-c5fcea2b05cb" (UID: "886fd47f-840d-4f86-bf2c-c5fcea2b05cb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.237253 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/886fd47f-840d-4f86-bf2c-c5fcea2b05cb-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.237292 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmbct\" (UniqueName: \"kubernetes.io/projected/886fd47f-840d-4f86-bf2c-c5fcea2b05cb-kube-api-access-lmbct\") on node \"crc\" DevicePath \"\"" Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.237304 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/886fd47f-840d-4f86-bf2c-c5fcea2b05cb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.497189 4843 generic.go:334] "Generic (PLEG): container finished" podID="886fd47f-840d-4f86-bf2c-c5fcea2b05cb" containerID="4655a48fc6145f3e7a4d8f6f31ade4919a7ea1a6536349ee4c29c007b36fa569" exitCode=0 Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.497228 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v7vl4" event={"ID":"886fd47f-840d-4f86-bf2c-c5fcea2b05cb","Type":"ContainerDied","Data":"4655a48fc6145f3e7a4d8f6f31ade4919a7ea1a6536349ee4c29c007b36fa569"} Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.497254 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v7vl4" event={"ID":"886fd47f-840d-4f86-bf2c-c5fcea2b05cb","Type":"ContainerDied","Data":"bda57916b1699eb31020f38b8a55faa9d14a8026ed2fe5039a3052068c25eb48"} Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.497273 4843 scope.go:117] "RemoveContainer" containerID="4655a48fc6145f3e7a4d8f6f31ade4919a7ea1a6536349ee4c29c007b36fa569" Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 
13:10:15.497379 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v7vl4" Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.520853 4843 scope.go:117] "RemoveContainer" containerID="264a745a45d09bff2ea6478d80f175ab44b06d1f1adec8983300b6868f6e0bd1" Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.545757 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v7vl4"] Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.555630 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v7vl4"] Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.559129 4843 scope.go:117] "RemoveContainer" containerID="2719127a771f3582a8cc9fabcc527bb325f546475e67b935946434da5a9855d1" Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.602850 4843 scope.go:117] "RemoveContainer" containerID="4655a48fc6145f3e7a4d8f6f31ade4919a7ea1a6536349ee4c29c007b36fa569" Mar 18 13:10:15 crc kubenswrapper[4843]: E0318 13:10:15.603461 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4655a48fc6145f3e7a4d8f6f31ade4919a7ea1a6536349ee4c29c007b36fa569\": container with ID starting with 4655a48fc6145f3e7a4d8f6f31ade4919a7ea1a6536349ee4c29c007b36fa569 not found: ID does not exist" containerID="4655a48fc6145f3e7a4d8f6f31ade4919a7ea1a6536349ee4c29c007b36fa569" Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.603493 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4655a48fc6145f3e7a4d8f6f31ade4919a7ea1a6536349ee4c29c007b36fa569"} err="failed to get container status \"4655a48fc6145f3e7a4d8f6f31ade4919a7ea1a6536349ee4c29c007b36fa569\": rpc error: code = NotFound desc = could not find container \"4655a48fc6145f3e7a4d8f6f31ade4919a7ea1a6536349ee4c29c007b36fa569\": container with ID starting with 
4655a48fc6145f3e7a4d8f6f31ade4919a7ea1a6536349ee4c29c007b36fa569 not found: ID does not exist" Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.603516 4843 scope.go:117] "RemoveContainer" containerID="264a745a45d09bff2ea6478d80f175ab44b06d1f1adec8983300b6868f6e0bd1" Mar 18 13:10:15 crc kubenswrapper[4843]: E0318 13:10:15.603993 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"264a745a45d09bff2ea6478d80f175ab44b06d1f1adec8983300b6868f6e0bd1\": container with ID starting with 264a745a45d09bff2ea6478d80f175ab44b06d1f1adec8983300b6868f6e0bd1 not found: ID does not exist" containerID="264a745a45d09bff2ea6478d80f175ab44b06d1f1adec8983300b6868f6e0bd1" Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.604023 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"264a745a45d09bff2ea6478d80f175ab44b06d1f1adec8983300b6868f6e0bd1"} err="failed to get container status \"264a745a45d09bff2ea6478d80f175ab44b06d1f1adec8983300b6868f6e0bd1\": rpc error: code = NotFound desc = could not find container \"264a745a45d09bff2ea6478d80f175ab44b06d1f1adec8983300b6868f6e0bd1\": container with ID starting with 264a745a45d09bff2ea6478d80f175ab44b06d1f1adec8983300b6868f6e0bd1 not found: ID does not exist" Mar 18 13:10:15 crc kubenswrapper[4843]: I0318 13:10:15.604039 4843 scope.go:117] "RemoveContainer" containerID="2719127a771f3582a8cc9fabcc527bb325f546475e67b935946434da5a9855d1" Mar 18 13:10:15 crc kubenswrapper[4843]: E0318 13:10:15.604447 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2719127a771f3582a8cc9fabcc527bb325f546475e67b935946434da5a9855d1\": container with ID starting with 2719127a771f3582a8cc9fabcc527bb325f546475e67b935946434da5a9855d1 not found: ID does not exist" containerID="2719127a771f3582a8cc9fabcc527bb325f546475e67b935946434da5a9855d1" Mar 18 13:10:15 crc 
kubenswrapper[4843]: I0318 13:10:15.604508 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2719127a771f3582a8cc9fabcc527bb325f546475e67b935946434da5a9855d1"} err="failed to get container status \"2719127a771f3582a8cc9fabcc527bb325f546475e67b935946434da5a9855d1\": rpc error: code = NotFound desc = could not find container \"2719127a771f3582a8cc9fabcc527bb325f546475e67b935946434da5a9855d1\": container with ID starting with 2719127a771f3582a8cc9fabcc527bb325f546475e67b935946434da5a9855d1 not found: ID does not exist" Mar 18 13:10:16 crc kubenswrapper[4843]: I0318 13:10:16.996756 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="886fd47f-840d-4f86-bf2c-c5fcea2b05cb" path="/var/lib/kubelet/pods/886fd47f-840d-4f86-bf2c-c5fcea2b05cb/volumes" Mar 18 13:10:50 crc kubenswrapper[4843]: I0318 13:10:50.035368 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:10:50 crc kubenswrapper[4843]: I0318 13:10:50.035888 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:11:06 crc kubenswrapper[4843]: I0318 13:11:06.532317 4843 scope.go:117] "RemoveContainer" containerID="58d94e500f803a7e36acc706fc5333fe51ab5071fe26244cbd05974dbc96e965" Mar 18 13:11:20 crc kubenswrapper[4843]: I0318 13:11:20.034885 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:11:20 crc kubenswrapper[4843]: I0318 13:11:20.035459 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.032999 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx"] Mar 18 13:11:46 crc kubenswrapper[4843]: E0318 13:11:46.034564 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f1c4e85-39a4-44a3-9e45-09587f0e59cb" containerName="oc" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.034581 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f1c4e85-39a4-44a3-9e45-09587f0e59cb" containerName="oc" Mar 18 13:11:46 crc kubenswrapper[4843]: E0318 13:11:46.034627 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886fd47f-840d-4f86-bf2c-c5fcea2b05cb" containerName="extract-content" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.034634 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="886fd47f-840d-4f86-bf2c-c5fcea2b05cb" containerName="extract-content" Mar 18 13:11:46 crc kubenswrapper[4843]: E0318 13:11:46.034668 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886fd47f-840d-4f86-bf2c-c5fcea2b05cb" containerName="registry-server" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.034677 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="886fd47f-840d-4f86-bf2c-c5fcea2b05cb" containerName="registry-server" Mar 18 13:11:46 crc kubenswrapper[4843]: E0318 13:11:46.034695 4843 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="886fd47f-840d-4f86-bf2c-c5fcea2b05cb" containerName="extract-utilities" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.034704 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="886fd47f-840d-4f86-bf2c-c5fcea2b05cb" containerName="extract-utilities" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.035123 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="886fd47f-840d-4f86-bf2c-c5fcea2b05cb" containerName="registry-server" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.035151 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f1c4e85-39a4-44a3-9e45-09587f0e59cb" containerName="oc" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.035834 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.038231 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g7nn2" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.038565 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.038963 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.039145 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.039366 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.049785 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx"] Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.155761 
4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-csdpx\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.155837 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-csdpx\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.155902 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-csdpx\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.155976 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-csdpx\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.156552 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5hcs\" (UniqueName: 
\"kubernetes.io/projected/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-kube-api-access-d5hcs\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-csdpx\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.259028 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5hcs\" (UniqueName: \"kubernetes.io/projected/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-kube-api-access-d5hcs\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-csdpx\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.259137 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-csdpx\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.259169 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-csdpx\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.259945 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-csdpx\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.260018 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-csdpx\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.265274 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-csdpx\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.265346 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-csdpx\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.270591 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-csdpx\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.271212 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: 
\"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-csdpx\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.275814 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5hcs\" (UniqueName: \"kubernetes.io/projected/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-kube-api-access-d5hcs\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-csdpx\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" Mar 18 13:11:46 crc kubenswrapper[4843]: I0318 13:11:46.367108 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" Mar 18 13:11:47 crc kubenswrapper[4843]: I0318 13:11:47.082245 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx"] Mar 18 13:11:47 crc kubenswrapper[4843]: I0318 13:11:47.329150 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" event={"ID":"d8417d5f-c42c-4aa5-bd01-97c74ab650c0","Type":"ContainerStarted","Data":"70bd69fb774e344d680726151778ca758acd36acfddddafa99188f73d2c087e3"} Mar 18 13:11:48 crc kubenswrapper[4843]: I0318 13:11:48.338005 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" event={"ID":"d8417d5f-c42c-4aa5-bd01-97c74ab650c0","Type":"ContainerStarted","Data":"c3829c44638d5b5eeb3654da9816552f54bb1d115c50aa11d9a67a282e403c0a"} Mar 18 13:11:48 crc kubenswrapper[4843]: I0318 13:11:48.360248 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" podStartSLOduration=1.6560444749999998 
podStartE2EDuration="2.360230586s" podCreationTimestamp="2026-03-18 13:11:46 +0000 UTC" firstStartedPulling="2026-03-18 13:11:47.096509222 +0000 UTC m=+3740.812334746" lastFinishedPulling="2026-03-18 13:11:47.800695333 +0000 UTC m=+3741.516520857" observedRunningTime="2026-03-18 13:11:48.358149887 +0000 UTC m=+3742.073975401" watchObservedRunningTime="2026-03-18 13:11:48.360230586 +0000 UTC m=+3742.076056110" Mar 18 13:11:50 crc kubenswrapper[4843]: I0318 13:11:50.035096 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:11:50 crc kubenswrapper[4843]: I0318 13:11:50.035446 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:11:50 crc kubenswrapper[4843]: I0318 13:11:50.035501 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 13:11:50 crc kubenswrapper[4843]: I0318 13:11:50.036454 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a"} pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:11:50 crc kubenswrapper[4843]: I0318 13:11:50.036541 4843 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" containerID="cri-o://811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a" gracePeriod=600 Mar 18 13:11:50 crc kubenswrapper[4843]: E0318 13:11:50.163361 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:11:50 crc kubenswrapper[4843]: I0318 13:11:50.360247 4843 generic.go:334] "Generic (PLEG): container finished" podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a" exitCode=0 Mar 18 13:11:50 crc kubenswrapper[4843]: I0318 13:11:50.360286 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a"} Mar 18 13:11:50 crc kubenswrapper[4843]: I0318 13:11:50.360345 4843 scope.go:117] "RemoveContainer" containerID="3891d3af7e234a225653322ace5f5d4313d80295e37a05c82ac85445c877fa76" Mar 18 13:11:50 crc kubenswrapper[4843]: I0318 13:11:50.361008 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a" Mar 18 13:11:50 crc kubenswrapper[4843]: E0318 13:11:50.361293 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:12:00 crc kubenswrapper[4843]: I0318 13:12:00.153916 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563992-hngbv"] Mar 18 13:12:00 crc kubenswrapper[4843]: I0318 13:12:00.156735 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563992-hngbv" Mar 18 13:12:00 crc kubenswrapper[4843]: I0318 13:12:00.161939 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:12:00 crc kubenswrapper[4843]: I0318 13:12:00.162055 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 13:12:00 crc kubenswrapper[4843]: I0318 13:12:00.162210 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:12:00 crc kubenswrapper[4843]: I0318 13:12:00.163724 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563992-hngbv"] Mar 18 13:12:00 crc kubenswrapper[4843]: I0318 13:12:00.295107 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2snss\" (UniqueName: \"kubernetes.io/projected/fa83801f-d1ab-40f7-8ad2-4251ea55c894-kube-api-access-2snss\") pod \"auto-csr-approver-29563992-hngbv\" (UID: \"fa83801f-d1ab-40f7-8ad2-4251ea55c894\") " pod="openshift-infra/auto-csr-approver-29563992-hngbv" Mar 18 13:12:00 crc kubenswrapper[4843]: I0318 13:12:00.396985 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2snss\" (UniqueName: \"kubernetes.io/projected/fa83801f-d1ab-40f7-8ad2-4251ea55c894-kube-api-access-2snss\") 
pod \"auto-csr-approver-29563992-hngbv\" (UID: \"fa83801f-d1ab-40f7-8ad2-4251ea55c894\") " pod="openshift-infra/auto-csr-approver-29563992-hngbv" Mar 18 13:12:00 crc kubenswrapper[4843]: I0318 13:12:00.415912 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2snss\" (UniqueName: \"kubernetes.io/projected/fa83801f-d1ab-40f7-8ad2-4251ea55c894-kube-api-access-2snss\") pod \"auto-csr-approver-29563992-hngbv\" (UID: \"fa83801f-d1ab-40f7-8ad2-4251ea55c894\") " pod="openshift-infra/auto-csr-approver-29563992-hngbv" Mar 18 13:12:00 crc kubenswrapper[4843]: I0318 13:12:00.498176 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563992-hngbv" Mar 18 13:12:00 crc kubenswrapper[4843]: I0318 13:12:00.962756 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563992-hngbv"] Mar 18 13:12:01 crc kubenswrapper[4843]: I0318 13:12:01.464170 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563992-hngbv" event={"ID":"fa83801f-d1ab-40f7-8ad2-4251ea55c894","Type":"ContainerStarted","Data":"91efc3a8616ce46c231ceb5a43dac2b1476c008e86a304ed7e6be017da825b4b"} Mar 18 13:12:01 crc kubenswrapper[4843]: I0318 13:12:01.984315 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a" Mar 18 13:12:01 crc kubenswrapper[4843]: E0318 13:12:01.984555 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:12:03 crc kubenswrapper[4843]: I0318 13:12:03.482429 4843 
generic.go:334] "Generic (PLEG): container finished" podID="fa83801f-d1ab-40f7-8ad2-4251ea55c894" containerID="0475c827a42ca1b8d921d6fbff542530d1407459c7b57a19c839e5fd629e9362" exitCode=0 Mar 18 13:12:03 crc kubenswrapper[4843]: I0318 13:12:03.482511 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563992-hngbv" event={"ID":"fa83801f-d1ab-40f7-8ad2-4251ea55c894","Type":"ContainerDied","Data":"0475c827a42ca1b8d921d6fbff542530d1407459c7b57a19c839e5fd629e9362"} Mar 18 13:12:04 crc kubenswrapper[4843]: I0318 13:12:04.850953 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563992-hngbv" Mar 18 13:12:05 crc kubenswrapper[4843]: I0318 13:12:05.024463 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2snss\" (UniqueName: \"kubernetes.io/projected/fa83801f-d1ab-40f7-8ad2-4251ea55c894-kube-api-access-2snss\") pod \"fa83801f-d1ab-40f7-8ad2-4251ea55c894\" (UID: \"fa83801f-d1ab-40f7-8ad2-4251ea55c894\") " Mar 18 13:12:05 crc kubenswrapper[4843]: I0318 13:12:05.030430 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa83801f-d1ab-40f7-8ad2-4251ea55c894-kube-api-access-2snss" (OuterVolumeSpecName: "kube-api-access-2snss") pod "fa83801f-d1ab-40f7-8ad2-4251ea55c894" (UID: "fa83801f-d1ab-40f7-8ad2-4251ea55c894"). InnerVolumeSpecName "kube-api-access-2snss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:12:05 crc kubenswrapper[4843]: I0318 13:12:05.128257 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2snss\" (UniqueName: \"kubernetes.io/projected/fa83801f-d1ab-40f7-8ad2-4251ea55c894-kube-api-access-2snss\") on node \"crc\" DevicePath \"\"" Mar 18 13:12:05 crc kubenswrapper[4843]: I0318 13:12:05.501899 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563992-hngbv" event={"ID":"fa83801f-d1ab-40f7-8ad2-4251ea55c894","Type":"ContainerDied","Data":"91efc3a8616ce46c231ceb5a43dac2b1476c008e86a304ed7e6be017da825b4b"} Mar 18 13:12:05 crc kubenswrapper[4843]: I0318 13:12:05.501943 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91efc3a8616ce46c231ceb5a43dac2b1476c008e86a304ed7e6be017da825b4b" Mar 18 13:12:05 crc kubenswrapper[4843]: I0318 13:12:05.501993 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563992-hngbv" Mar 18 13:12:05 crc kubenswrapper[4843]: I0318 13:12:05.926378 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563986-kxvjt"] Mar 18 13:12:05 crc kubenswrapper[4843]: I0318 13:12:05.934705 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563986-kxvjt"] Mar 18 13:12:07 crc kubenswrapper[4843]: I0318 13:12:07.019611 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7dc4737-cbc6-4a45-b535-d0d903aac1dc" path="/var/lib/kubelet/pods/d7dc4737-cbc6-4a45-b535-d0d903aac1dc/volumes" Mar 18 13:12:15 crc kubenswrapper[4843]: I0318 13:12:15.984224 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a" Mar 18 13:12:15 crc kubenswrapper[4843]: E0318 13:12:15.984987 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:12:28 crc kubenswrapper[4843]: I0318 13:12:28.060003 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a" Mar 18 13:12:28 crc kubenswrapper[4843]: E0318 13:12:28.061012 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:12:39 crc kubenswrapper[4843]: I0318 13:12:39.984456 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a" Mar 18 13:12:39 crc kubenswrapper[4843]: E0318 13:12:39.985855 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:12:53 crc kubenswrapper[4843]: I0318 13:12:53.984275 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a" Mar 18 13:12:53 crc kubenswrapper[4843]: E0318 13:12:53.984990 4843 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:13:06 crc kubenswrapper[4843]: I0318 13:13:06.656230 4843 scope.go:117] "RemoveContainer" containerID="f31bf6e605e9ee5e3936ab9288281410525412a395e3a81622bf0f55de605bde" Mar 18 13:13:07 crc kubenswrapper[4843]: I0318 13:13:07.984240 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a" Mar 18 13:13:07 crc kubenswrapper[4843]: E0318 13:13:07.985029 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:13:19 crc kubenswrapper[4843]: I0318 13:13:19.983950 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a" Mar 18 13:13:19 crc kubenswrapper[4843]: E0318 13:13:19.984816 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:13:30 crc kubenswrapper[4843]: I0318 13:13:30.768554 4843 generic.go:334] "Generic 
(PLEG): container finished" podID="d8417d5f-c42c-4aa5-bd01-97c74ab650c0" containerID="c3829c44638d5b5eeb3654da9816552f54bb1d115c50aa11d9a67a282e403c0a" exitCode=2 Mar 18 13:13:30 crc kubenswrapper[4843]: I0318 13:13:30.768674 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" event={"ID":"d8417d5f-c42c-4aa5-bd01-97c74ab650c0","Type":"ContainerDied","Data":"c3829c44638d5b5eeb3654da9816552f54bb1d115c50aa11d9a67a282e403c0a"} Mar 18 13:13:31 crc kubenswrapper[4843]: I0318 13:13:31.983497 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a" Mar 18 13:13:31 crc kubenswrapper[4843]: E0318 13:13:31.983892 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:13:32 crc kubenswrapper[4843]: I0318 13:13:32.282568 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" Mar 18 13:13:32 crc kubenswrapper[4843]: I0318 13:13:32.424689 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5hcs\" (UniqueName: \"kubernetes.io/projected/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-kube-api-access-d5hcs\") pod \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " Mar 18 13:13:32 crc kubenswrapper[4843]: I0318 13:13:32.424760 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-inventory\") pod \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " Mar 18 13:13:32 crc kubenswrapper[4843]: I0318 13:13:32.424813 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-libvirt-combined-ca-bundle\") pod \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " Mar 18 13:13:32 crc kubenswrapper[4843]: I0318 13:13:32.424913 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-ssh-key-openstack-edpm-ipam\") pod \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " Mar 18 13:13:32 crc kubenswrapper[4843]: I0318 13:13:32.425014 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-libvirt-secret-0\") pod \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\" (UID: \"d8417d5f-c42c-4aa5-bd01-97c74ab650c0\") " Mar 18 13:13:32 crc kubenswrapper[4843]: I0318 13:13:32.430513 4843 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d8417d5f-c42c-4aa5-bd01-97c74ab650c0" (UID: "d8417d5f-c42c-4aa5-bd01-97c74ab650c0"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:13:32 crc kubenswrapper[4843]: I0318 13:13:32.435928 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-kube-api-access-d5hcs" (OuterVolumeSpecName: "kube-api-access-d5hcs") pod "d8417d5f-c42c-4aa5-bd01-97c74ab650c0" (UID: "d8417d5f-c42c-4aa5-bd01-97c74ab650c0"). InnerVolumeSpecName "kube-api-access-d5hcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:13:32 crc kubenswrapper[4843]: I0318 13:13:32.458617 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-inventory" (OuterVolumeSpecName: "inventory") pod "d8417d5f-c42c-4aa5-bd01-97c74ab650c0" (UID: "d8417d5f-c42c-4aa5-bd01-97c74ab650c0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:13:32 crc kubenswrapper[4843]: I0318 13:13:32.459116 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d8417d5f-c42c-4aa5-bd01-97c74ab650c0" (UID: "d8417d5f-c42c-4aa5-bd01-97c74ab650c0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:13:32 crc kubenswrapper[4843]: I0318 13:13:32.464804 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "d8417d5f-c42c-4aa5-bd01-97c74ab650c0" (UID: "d8417d5f-c42c-4aa5-bd01-97c74ab650c0"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:13:32 crc kubenswrapper[4843]: I0318 13:13:32.528145 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5hcs\" (UniqueName: \"kubernetes.io/projected/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-kube-api-access-d5hcs\") on node \"crc\" DevicePath \"\"" Mar 18 13:13:32 crc kubenswrapper[4843]: I0318 13:13:32.528189 4843 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:13:32 crc kubenswrapper[4843]: I0318 13:13:32.528200 4843 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:13:32 crc kubenswrapper[4843]: I0318 13:13:32.528210 4843 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:13:32 crc kubenswrapper[4843]: I0318 13:13:32.528220 4843 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d8417d5f-c42c-4aa5-bd01-97c74ab650c0-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:13:32 crc kubenswrapper[4843]: I0318 13:13:32.793370 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" event={"ID":"d8417d5f-c42c-4aa5-bd01-97c74ab650c0","Type":"ContainerDied","Data":"70bd69fb774e344d680726151778ca758acd36acfddddafa99188f73d2c087e3"} Mar 18 13:13:32 crc kubenswrapper[4843]: I0318 13:13:32.793426 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70bd69fb774e344d680726151778ca758acd36acfddddafa99188f73d2c087e3" Mar 18 13:13:32 crc kubenswrapper[4843]: I0318 13:13:32.793440 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-csdpx" Mar 18 13:13:45 crc kubenswrapper[4843]: I0318 13:13:45.984033 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a" Mar 18 13:13:45 crc kubenswrapper[4843]: E0318 13:13:45.984800 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:13:57 crc kubenswrapper[4843]: I0318 13:13:57.984129 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a" Mar 18 13:13:57 crc kubenswrapper[4843]: E0318 13:13:57.985151 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 
13:14:00 crc kubenswrapper[4843]: I0318 13:14:00.150802 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563994-rzz2k"] Mar 18 13:14:00 crc kubenswrapper[4843]: E0318 13:14:00.151639 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa83801f-d1ab-40f7-8ad2-4251ea55c894" containerName="oc" Mar 18 13:14:00 crc kubenswrapper[4843]: I0318 13:14:00.151678 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa83801f-d1ab-40f7-8ad2-4251ea55c894" containerName="oc" Mar 18 13:14:00 crc kubenswrapper[4843]: E0318 13:14:00.151731 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8417d5f-c42c-4aa5-bd01-97c74ab650c0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 13:14:00 crc kubenswrapper[4843]: I0318 13:14:00.151741 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8417d5f-c42c-4aa5-bd01-97c74ab650c0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 13:14:00 crc kubenswrapper[4843]: I0318 13:14:00.152009 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa83801f-d1ab-40f7-8ad2-4251ea55c894" containerName="oc" Mar 18 13:14:00 crc kubenswrapper[4843]: I0318 13:14:00.152041 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8417d5f-c42c-4aa5-bd01-97c74ab650c0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 13:14:00 crc kubenswrapper[4843]: I0318 13:14:00.153249 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563994-rzz2k" Mar 18 13:14:00 crc kubenswrapper[4843]: I0318 13:14:00.155383 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 13:14:00 crc kubenswrapper[4843]: I0318 13:14:00.155645 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:14:00 crc kubenswrapper[4843]: I0318 13:14:00.156002 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:14:00 crc kubenswrapper[4843]: I0318 13:14:00.160976 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563994-rzz2k"] Mar 18 13:14:00 crc kubenswrapper[4843]: I0318 13:14:00.185217 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5st66\" (UniqueName: \"kubernetes.io/projected/1c47fc0c-e590-4f47-aaa2-bff437f222a6-kube-api-access-5st66\") pod \"auto-csr-approver-29563994-rzz2k\" (UID: \"1c47fc0c-e590-4f47-aaa2-bff437f222a6\") " pod="openshift-infra/auto-csr-approver-29563994-rzz2k" Mar 18 13:14:00 crc kubenswrapper[4843]: I0318 13:14:00.288039 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5st66\" (UniqueName: \"kubernetes.io/projected/1c47fc0c-e590-4f47-aaa2-bff437f222a6-kube-api-access-5st66\") pod \"auto-csr-approver-29563994-rzz2k\" (UID: \"1c47fc0c-e590-4f47-aaa2-bff437f222a6\") " pod="openshift-infra/auto-csr-approver-29563994-rzz2k" Mar 18 13:14:00 crc kubenswrapper[4843]: I0318 13:14:00.308711 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5st66\" (UniqueName: \"kubernetes.io/projected/1c47fc0c-e590-4f47-aaa2-bff437f222a6-kube-api-access-5st66\") pod \"auto-csr-approver-29563994-rzz2k\" (UID: \"1c47fc0c-e590-4f47-aaa2-bff437f222a6\") " 
pod="openshift-infra/auto-csr-approver-29563994-rzz2k" Mar 18 13:14:00 crc kubenswrapper[4843]: I0318 13:14:00.484789 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563994-rzz2k" Mar 18 13:14:00 crc kubenswrapper[4843]: I0318 13:14:00.962784 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563994-rzz2k"] Mar 18 13:14:00 crc kubenswrapper[4843]: I0318 13:14:00.971575 4843 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:14:01 crc kubenswrapper[4843]: I0318 13:14:01.073357 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563994-rzz2k" event={"ID":"1c47fc0c-e590-4f47-aaa2-bff437f222a6","Type":"ContainerStarted","Data":"9aaf9b47e909f5c0cee30018650d57ad4c860bc4a1b9e5abbcac90c08c54d109"} Mar 18 13:14:03 crc kubenswrapper[4843]: I0318 13:14:03.096507 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563994-rzz2k" event={"ID":"1c47fc0c-e590-4f47-aaa2-bff437f222a6","Type":"ContainerStarted","Data":"837a8b3b533caa7b0aafbcb35b2e7b0248f451e8d7f0949e71e4b4dfc6442d35"} Mar 18 13:14:04 crc kubenswrapper[4843]: I0318 13:14:04.110368 4843 generic.go:334] "Generic (PLEG): container finished" podID="1c47fc0c-e590-4f47-aaa2-bff437f222a6" containerID="837a8b3b533caa7b0aafbcb35b2e7b0248f451e8d7f0949e71e4b4dfc6442d35" exitCode=0 Mar 18 13:14:04 crc kubenswrapper[4843]: I0318 13:14:04.110732 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563994-rzz2k" event={"ID":"1c47fc0c-e590-4f47-aaa2-bff437f222a6","Type":"ContainerDied","Data":"837a8b3b533caa7b0aafbcb35b2e7b0248f451e8d7f0949e71e4b4dfc6442d35"} Mar 18 13:14:05 crc kubenswrapper[4843]: I0318 13:14:05.447642 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563994-rzz2k" Mar 18 13:14:05 crc kubenswrapper[4843]: I0318 13:14:05.615918 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5st66\" (UniqueName: \"kubernetes.io/projected/1c47fc0c-e590-4f47-aaa2-bff437f222a6-kube-api-access-5st66\") pod \"1c47fc0c-e590-4f47-aaa2-bff437f222a6\" (UID: \"1c47fc0c-e590-4f47-aaa2-bff437f222a6\") " Mar 18 13:14:05 crc kubenswrapper[4843]: I0318 13:14:05.626026 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c47fc0c-e590-4f47-aaa2-bff437f222a6-kube-api-access-5st66" (OuterVolumeSpecName: "kube-api-access-5st66") pod "1c47fc0c-e590-4f47-aaa2-bff437f222a6" (UID: "1c47fc0c-e590-4f47-aaa2-bff437f222a6"). InnerVolumeSpecName "kube-api-access-5st66". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:14:05 crc kubenswrapper[4843]: I0318 13:14:05.718147 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5st66\" (UniqueName: \"kubernetes.io/projected/1c47fc0c-e590-4f47-aaa2-bff437f222a6-kube-api-access-5st66\") on node \"crc\" DevicePath \"\"" Mar 18 13:14:06 crc kubenswrapper[4843]: I0318 13:14:06.128645 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563994-rzz2k" event={"ID":"1c47fc0c-e590-4f47-aaa2-bff437f222a6","Type":"ContainerDied","Data":"9aaf9b47e909f5c0cee30018650d57ad4c860bc4a1b9e5abbcac90c08c54d109"} Mar 18 13:14:06 crc kubenswrapper[4843]: I0318 13:14:06.128718 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9aaf9b47e909f5c0cee30018650d57ad4c860bc4a1b9e5abbcac90c08c54d109" Mar 18 13:14:06 crc kubenswrapper[4843]: I0318 13:14:06.128733 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563994-rzz2k" Mar 18 13:14:06 crc kubenswrapper[4843]: I0318 13:14:06.234207 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563988-rkhbk"] Mar 18 13:14:06 crc kubenswrapper[4843]: I0318 13:14:06.282262 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563988-rkhbk"] Mar 18 13:14:07 crc kubenswrapper[4843]: I0318 13:14:07.003541 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="833ea7f3-18ec-4f3e-9e88-e475af555255" path="/var/lib/kubelet/pods/833ea7f3-18ec-4f3e-9e88-e475af555255/volumes" Mar 18 13:14:09 crc kubenswrapper[4843]: I0318 13:14:09.984077 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a" Mar 18 13:14:09 crc kubenswrapper[4843]: E0318 13:14:09.985459 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:14:20 crc kubenswrapper[4843]: I0318 13:14:20.984229 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a" Mar 18 13:14:20 crc kubenswrapper[4843]: E0318 13:14:20.984946 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" 
podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:14:32 crc kubenswrapper[4843]: I0318 13:14:32.984259 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a" Mar 18 13:14:32 crc kubenswrapper[4843]: E0318 13:14:32.985025 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:14:43 crc kubenswrapper[4843]: I0318 13:14:43.984463 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a" Mar 18 13:14:43 crc kubenswrapper[4843]: E0318 13:14:43.985191 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:14:55 crc kubenswrapper[4843]: I0318 13:14:55.984601 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a" Mar 18 13:14:55 crc kubenswrapper[4843]: E0318 13:14:55.985295 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:15:00 crc kubenswrapper[4843]: I0318 13:15:00.318477 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563995-dkhzx"] Mar 18 13:15:00 crc kubenswrapper[4843]: E0318 13:15:00.319712 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c47fc0c-e590-4f47-aaa2-bff437f222a6" containerName="oc" Mar 18 13:15:00 crc kubenswrapper[4843]: I0318 13:15:00.319746 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c47fc0c-e590-4f47-aaa2-bff437f222a6" containerName="oc" Mar 18 13:15:00 crc kubenswrapper[4843]: I0318 13:15:00.320113 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c47fc0c-e590-4f47-aaa2-bff437f222a6" containerName="oc" Mar 18 13:15:00 crc kubenswrapper[4843]: I0318 13:15:00.321112 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-dkhzx" Mar 18 13:15:00 crc kubenswrapper[4843]: I0318 13:15:00.323573 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 13:15:00 crc kubenswrapper[4843]: I0318 13:15:00.323636 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 13:15:00 crc kubenswrapper[4843]: I0318 13:15:00.329073 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563995-dkhzx"] Mar 18 13:15:00 crc kubenswrapper[4843]: I0318 13:15:00.442134 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q47l8\" (UniqueName: \"kubernetes.io/projected/7d77f534-e56c-42d0-b77c-af703d9447b5-kube-api-access-q47l8\") pod 
\"collect-profiles-29563995-dkhzx\" (UID: \"7d77f534-e56c-42d0-b77c-af703d9447b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-dkhzx" Mar 18 13:15:00 crc kubenswrapper[4843]: I0318 13:15:00.442245 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d77f534-e56c-42d0-b77c-af703d9447b5-config-volume\") pod \"collect-profiles-29563995-dkhzx\" (UID: \"7d77f534-e56c-42d0-b77c-af703d9447b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-dkhzx" Mar 18 13:15:00 crc kubenswrapper[4843]: I0318 13:15:00.442266 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d77f534-e56c-42d0-b77c-af703d9447b5-secret-volume\") pod \"collect-profiles-29563995-dkhzx\" (UID: \"7d77f534-e56c-42d0-b77c-af703d9447b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-dkhzx" Mar 18 13:15:00 crc kubenswrapper[4843]: I0318 13:15:00.543985 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q47l8\" (UniqueName: \"kubernetes.io/projected/7d77f534-e56c-42d0-b77c-af703d9447b5-kube-api-access-q47l8\") pod \"collect-profiles-29563995-dkhzx\" (UID: \"7d77f534-e56c-42d0-b77c-af703d9447b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-dkhzx" Mar 18 13:15:00 crc kubenswrapper[4843]: I0318 13:15:00.544409 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d77f534-e56c-42d0-b77c-af703d9447b5-config-volume\") pod \"collect-profiles-29563995-dkhzx\" (UID: \"7d77f534-e56c-42d0-b77c-af703d9447b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-dkhzx" Mar 18 13:15:00 crc kubenswrapper[4843]: I0318 13:15:00.544559 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d77f534-e56c-42d0-b77c-af703d9447b5-secret-volume\") pod \"collect-profiles-29563995-dkhzx\" (UID: \"7d77f534-e56c-42d0-b77c-af703d9447b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-dkhzx" Mar 18 13:15:00 crc kubenswrapper[4843]: I0318 13:15:00.546358 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d77f534-e56c-42d0-b77c-af703d9447b5-config-volume\") pod \"collect-profiles-29563995-dkhzx\" (UID: \"7d77f534-e56c-42d0-b77c-af703d9447b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-dkhzx" Mar 18 13:15:00 crc kubenswrapper[4843]: I0318 13:15:00.555018 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d77f534-e56c-42d0-b77c-af703d9447b5-secret-volume\") pod \"collect-profiles-29563995-dkhzx\" (UID: \"7d77f534-e56c-42d0-b77c-af703d9447b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-dkhzx" Mar 18 13:15:00 crc kubenswrapper[4843]: I0318 13:15:00.565922 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q47l8\" (UniqueName: \"kubernetes.io/projected/7d77f534-e56c-42d0-b77c-af703d9447b5-kube-api-access-q47l8\") pod \"collect-profiles-29563995-dkhzx\" (UID: \"7d77f534-e56c-42d0-b77c-af703d9447b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-dkhzx" Mar 18 13:15:00 crc kubenswrapper[4843]: I0318 13:15:00.649575 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-dkhzx" Mar 18 13:15:01 crc kubenswrapper[4843]: I0318 13:15:01.112834 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563995-dkhzx"] Mar 18 13:15:01 crc kubenswrapper[4843]: I0318 13:15:01.913378 4843 generic.go:334] "Generic (PLEG): container finished" podID="7d77f534-e56c-42d0-b77c-af703d9447b5" containerID="212e54750da046ab953ae0d74bc00314243b0cc3ab4bf117e8b01fccb2507d6a" exitCode=0 Mar 18 13:15:01 crc kubenswrapper[4843]: I0318 13:15:01.913687 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-dkhzx" event={"ID":"7d77f534-e56c-42d0-b77c-af703d9447b5","Type":"ContainerDied","Data":"212e54750da046ab953ae0d74bc00314243b0cc3ab4bf117e8b01fccb2507d6a"} Mar 18 13:15:01 crc kubenswrapper[4843]: I0318 13:15:01.913724 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-dkhzx" event={"ID":"7d77f534-e56c-42d0-b77c-af703d9447b5","Type":"ContainerStarted","Data":"4438294c45228b1f6c20db3942d4b5c31ab4b7adcd543b483fc7d3c1eaa345b7"} Mar 18 13:15:03 crc kubenswrapper[4843]: I0318 13:15:03.260785 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-dkhzx" Mar 18 13:15:03 crc kubenswrapper[4843]: I0318 13:15:03.442980 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d77f534-e56c-42d0-b77c-af703d9447b5-secret-volume\") pod \"7d77f534-e56c-42d0-b77c-af703d9447b5\" (UID: \"7d77f534-e56c-42d0-b77c-af703d9447b5\") " Mar 18 13:15:03 crc kubenswrapper[4843]: I0318 13:15:03.443126 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d77f534-e56c-42d0-b77c-af703d9447b5-config-volume\") pod \"7d77f534-e56c-42d0-b77c-af703d9447b5\" (UID: \"7d77f534-e56c-42d0-b77c-af703d9447b5\") " Mar 18 13:15:03 crc kubenswrapper[4843]: I0318 13:15:03.443238 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q47l8\" (UniqueName: \"kubernetes.io/projected/7d77f534-e56c-42d0-b77c-af703d9447b5-kube-api-access-q47l8\") pod \"7d77f534-e56c-42d0-b77c-af703d9447b5\" (UID: \"7d77f534-e56c-42d0-b77c-af703d9447b5\") " Mar 18 13:15:03 crc kubenswrapper[4843]: I0318 13:15:03.445002 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d77f534-e56c-42d0-b77c-af703d9447b5-config-volume" (OuterVolumeSpecName: "config-volume") pod "7d77f534-e56c-42d0-b77c-af703d9447b5" (UID: "7d77f534-e56c-42d0-b77c-af703d9447b5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:15:03 crc kubenswrapper[4843]: I0318 13:15:03.448992 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d77f534-e56c-42d0-b77c-af703d9447b5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7d77f534-e56c-42d0-b77c-af703d9447b5" (UID: "7d77f534-e56c-42d0-b77c-af703d9447b5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:15:03 crc kubenswrapper[4843]: I0318 13:15:03.449380 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d77f534-e56c-42d0-b77c-af703d9447b5-kube-api-access-q47l8" (OuterVolumeSpecName: "kube-api-access-q47l8") pod "7d77f534-e56c-42d0-b77c-af703d9447b5" (UID: "7d77f534-e56c-42d0-b77c-af703d9447b5"). InnerVolumeSpecName "kube-api-access-q47l8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:15:03 crc kubenswrapper[4843]: I0318 13:15:03.545322 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q47l8\" (UniqueName: \"kubernetes.io/projected/7d77f534-e56c-42d0-b77c-af703d9447b5-kube-api-access-q47l8\") on node \"crc\" DevicePath \"\""
Mar 18 13:15:03 crc kubenswrapper[4843]: I0318 13:15:03.545405 4843 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d77f534-e56c-42d0-b77c-af703d9447b5-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 13:15:03 crc kubenswrapper[4843]: I0318 13:15:03.545420 4843 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d77f534-e56c-42d0-b77c-af703d9447b5-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 13:15:03 crc kubenswrapper[4843]: I0318 13:15:03.934025 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-dkhzx" event={"ID":"7d77f534-e56c-42d0-b77c-af703d9447b5","Type":"ContainerDied","Data":"4438294c45228b1f6c20db3942d4b5c31ab4b7adcd543b483fc7d3c1eaa345b7"}
Mar 18 13:15:03 crc kubenswrapper[4843]: I0318 13:15:03.934093 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4438294c45228b1f6c20db3942d4b5c31ab4b7adcd543b483fc7d3c1eaa345b7"
Mar 18 13:15:03 crc kubenswrapper[4843]: I0318 13:15:03.934147 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-dkhzx"
Mar 18 13:15:04 crc kubenswrapper[4843]: I0318 13:15:04.329980 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl"]
Mar 18 13:15:04 crc kubenswrapper[4843]: I0318 13:15:04.342113 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563950-nqssl"]
Mar 18 13:15:05 crc kubenswrapper[4843]: I0318 13:15:05.069310 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e18e83a1-efe3-4695-9897-f0ca13d4bedf" path="/var/lib/kubelet/pods/e18e83a1-efe3-4695-9897-f0ca13d4bedf/volumes"
Mar 18 13:15:06 crc kubenswrapper[4843]: I0318 13:15:06.965115 4843 scope.go:117] "RemoveContainer" containerID="29e56a3dc8eada61b02d762e9ae7cda6f7f5ecd86146748f0c2dd10f62883a64"
Mar 18 13:15:06 crc kubenswrapper[4843]: I0318 13:15:06.995897 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a"
Mar 18 13:15:06 crc kubenswrapper[4843]: E0318 13:15:06.996229 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:15:07 crc kubenswrapper[4843]: I0318 13:15:07.000914 4843 scope.go:117] "RemoveContainer" containerID="3137cac00d9e53a4189e83a0ab9d1875153bd124063390792391eb790f5b78ad"
Mar 18 13:15:20 crc kubenswrapper[4843]: I0318 13:15:20.984484 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a"
Mar 18 13:15:20 crc kubenswrapper[4843]: E0318 13:15:20.985287 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:15:35 crc kubenswrapper[4843]: I0318 13:15:35.983831 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a"
Mar 18 13:15:35 crc kubenswrapper[4843]: E0318 13:15:35.984562 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:15:49 crc kubenswrapper[4843]: I0318 13:15:49.984331 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a"
Mar 18 13:15:49 crc kubenswrapper[4843]: E0318 13:15:49.985132 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:16:00 crc kubenswrapper[4843]: I0318 13:16:00.169646 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563996-vjkpt"]
Mar 18 13:16:00 crc kubenswrapper[4843]: E0318 13:16:00.170885 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d77f534-e56c-42d0-b77c-af703d9447b5" containerName="collect-profiles"
Mar 18 13:16:00 crc kubenswrapper[4843]: I0318 13:16:00.170909 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d77f534-e56c-42d0-b77c-af703d9447b5" containerName="collect-profiles"
Mar 18 13:16:00 crc kubenswrapper[4843]: I0318 13:16:00.171164 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d77f534-e56c-42d0-b77c-af703d9447b5" containerName="collect-profiles"
Mar 18 13:16:00 crc kubenswrapper[4843]: I0318 13:16:00.172124 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563996-vjkpt"
Mar 18 13:16:00 crc kubenswrapper[4843]: I0318 13:16:00.175526 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:16:00 crc kubenswrapper[4843]: I0318 13:16:00.175526 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk"
Mar 18 13:16:00 crc kubenswrapper[4843]: I0318 13:16:00.175687 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:16:00 crc kubenswrapper[4843]: I0318 13:16:00.180925 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563996-vjkpt"]
Mar 18 13:16:00 crc kubenswrapper[4843]: I0318 13:16:00.339997 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbzxs\" (UniqueName: \"kubernetes.io/projected/8b04fc45-e7e8-4977-bdda-dfaf6e74fafa-kube-api-access-bbzxs\") pod \"auto-csr-approver-29563996-vjkpt\" (UID: \"8b04fc45-e7e8-4977-bdda-dfaf6e74fafa\") " pod="openshift-infra/auto-csr-approver-29563996-vjkpt"
Mar 18 13:16:00 crc kubenswrapper[4843]: I0318 13:16:00.442667 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbzxs\" (UniqueName: \"kubernetes.io/projected/8b04fc45-e7e8-4977-bdda-dfaf6e74fafa-kube-api-access-bbzxs\") pod \"auto-csr-approver-29563996-vjkpt\" (UID: \"8b04fc45-e7e8-4977-bdda-dfaf6e74fafa\") " pod="openshift-infra/auto-csr-approver-29563996-vjkpt"
Mar 18 13:16:00 crc kubenswrapper[4843]: I0318 13:16:00.461442 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbzxs\" (UniqueName: \"kubernetes.io/projected/8b04fc45-e7e8-4977-bdda-dfaf6e74fafa-kube-api-access-bbzxs\") pod \"auto-csr-approver-29563996-vjkpt\" (UID: \"8b04fc45-e7e8-4977-bdda-dfaf6e74fafa\") " pod="openshift-infra/auto-csr-approver-29563996-vjkpt"
Mar 18 13:16:00 crc kubenswrapper[4843]: I0318 13:16:00.498163 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563996-vjkpt"
Mar 18 13:16:00 crc kubenswrapper[4843]: I0318 13:16:00.955759 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563996-vjkpt"]
Mar 18 13:16:01 crc kubenswrapper[4843]: I0318 13:16:01.105157 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563996-vjkpt" event={"ID":"8b04fc45-e7e8-4977-bdda-dfaf6e74fafa","Type":"ContainerStarted","Data":"cb46e3556ec9acd72146ae9985ae211fd740ad70521534a6dea09943157b363f"}
Mar 18 13:16:03 crc kubenswrapper[4843]: I0318 13:16:03.986604 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a"
Mar 18 13:16:03 crc kubenswrapper[4843]: E0318 13:16:03.987731 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:16:09 crc kubenswrapper[4843]: I0318 13:16:09.191993 4843 generic.go:334] "Generic (PLEG): container finished" podID="8b04fc45-e7e8-4977-bdda-dfaf6e74fafa" containerID="2c02efb9eed4c2d6a9d31c9d0387859da266fec7c30f4a78795831a7225104dc" exitCode=0
Mar 18 13:16:09 crc kubenswrapper[4843]: I0318 13:16:09.192041 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563996-vjkpt" event={"ID":"8b04fc45-e7e8-4977-bdda-dfaf6e74fafa","Type":"ContainerDied","Data":"2c02efb9eed4c2d6a9d31c9d0387859da266fec7c30f4a78795831a7225104dc"}
Mar 18 13:16:10 crc kubenswrapper[4843]: I0318 13:16:10.575607 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563996-vjkpt"
Mar 18 13:16:10 crc kubenswrapper[4843]: I0318 13:16:10.693174 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbzxs\" (UniqueName: \"kubernetes.io/projected/8b04fc45-e7e8-4977-bdda-dfaf6e74fafa-kube-api-access-bbzxs\") pod \"8b04fc45-e7e8-4977-bdda-dfaf6e74fafa\" (UID: \"8b04fc45-e7e8-4977-bdda-dfaf6e74fafa\") "
Mar 18 13:16:10 crc kubenswrapper[4843]: I0318 13:16:10.699361 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b04fc45-e7e8-4977-bdda-dfaf6e74fafa-kube-api-access-bbzxs" (OuterVolumeSpecName: "kube-api-access-bbzxs") pod "8b04fc45-e7e8-4977-bdda-dfaf6e74fafa" (UID: "8b04fc45-e7e8-4977-bdda-dfaf6e74fafa"). InnerVolumeSpecName "kube-api-access-bbzxs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:16:10 crc kubenswrapper[4843]: I0318 13:16:10.795123 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbzxs\" (UniqueName: \"kubernetes.io/projected/8b04fc45-e7e8-4977-bdda-dfaf6e74fafa-kube-api-access-bbzxs\") on node \"crc\" DevicePath \"\""
Mar 18 13:16:11 crc kubenswrapper[4843]: I0318 13:16:11.209111 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563996-vjkpt" event={"ID":"8b04fc45-e7e8-4977-bdda-dfaf6e74fafa","Type":"ContainerDied","Data":"cb46e3556ec9acd72146ae9985ae211fd740ad70521534a6dea09943157b363f"}
Mar 18 13:16:11 crc kubenswrapper[4843]: I0318 13:16:11.209149 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb46e3556ec9acd72146ae9985ae211fd740ad70521534a6dea09943157b363f"
Mar 18 13:16:11 crc kubenswrapper[4843]: I0318 13:16:11.209242 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563996-vjkpt"
Mar 18 13:16:11 crc kubenswrapper[4843]: I0318 13:16:11.653917 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563990-dscgm"]
Mar 18 13:16:11 crc kubenswrapper[4843]: I0318 13:16:11.664686 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563990-dscgm"]
Mar 18 13:16:12 crc kubenswrapper[4843]: I0318 13:16:12.994934 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f1c4e85-39a4-44a3-9e45-09587f0e59cb" path="/var/lib/kubelet/pods/0f1c4e85-39a4-44a3-9e45-09587f0e59cb/volumes"
Mar 18 13:16:17 crc kubenswrapper[4843]: I0318 13:16:17.983888 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a"
Mar 18 13:16:17 crc kubenswrapper[4843]: E0318 13:16:17.984677 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:16:28 crc kubenswrapper[4843]: I0318 13:16:28.984026 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a"
Mar 18 13:16:28 crc kubenswrapper[4843]: E0318 13:16:28.984908 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:16:40 crc kubenswrapper[4843]: I0318 13:16:40.983847 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a"
Mar 18 13:16:40 crc kubenswrapper[4843]: E0318 13:16:40.984607 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:16:51 crc kubenswrapper[4843]: I0318 13:16:51.984056 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a"
Mar 18 13:16:53 crc kubenswrapper[4843]: I0318 13:16:53.172277 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"7974653c49645239160ac43ca7d8c3d2bf1a27727cb40316647ec5569b7697ac"}
Mar 18 13:17:07 crc kubenswrapper[4843]: I0318 13:17:07.131790 4843 scope.go:117] "RemoveContainer" containerID="69a7f998a933c896afd8ce0c910ac40bbc98f2fb384b94e90dc7d63939c60962"
Mar 18 13:18:00 crc kubenswrapper[4843]: I0318 13:18:00.162266 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563998-pgwpx"]
Mar 18 13:18:00 crc kubenswrapper[4843]: E0318 13:18:00.163377 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b04fc45-e7e8-4977-bdda-dfaf6e74fafa" containerName="oc"
Mar 18 13:18:00 crc kubenswrapper[4843]: I0318 13:18:00.163397 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b04fc45-e7e8-4977-bdda-dfaf6e74fafa" containerName="oc"
Mar 18 13:18:00 crc kubenswrapper[4843]: I0318 13:18:00.163686 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b04fc45-e7e8-4977-bdda-dfaf6e74fafa" containerName="oc"
Mar 18 13:18:00 crc kubenswrapper[4843]: I0318 13:18:00.164563 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563998-pgwpx"
Mar 18 13:18:00 crc kubenswrapper[4843]: I0318 13:18:00.167386 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:18:00 crc kubenswrapper[4843]: I0318 13:18:00.167595 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:18:00 crc kubenswrapper[4843]: I0318 13:18:00.168191 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk"
Mar 18 13:18:00 crc kubenswrapper[4843]: I0318 13:18:00.173318 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563998-pgwpx"]
Mar 18 13:18:00 crc kubenswrapper[4843]: I0318 13:18:00.261598 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2gj8\" (UniqueName: \"kubernetes.io/projected/72422f42-741a-450d-bf90-f9d8b262add3-kube-api-access-b2gj8\") pod \"auto-csr-approver-29563998-pgwpx\" (UID: \"72422f42-741a-450d-bf90-f9d8b262add3\") " pod="openshift-infra/auto-csr-approver-29563998-pgwpx"
Mar 18 13:18:00 crc kubenswrapper[4843]: I0318 13:18:00.363850 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2gj8\" (UniqueName: \"kubernetes.io/projected/72422f42-741a-450d-bf90-f9d8b262add3-kube-api-access-b2gj8\") pod \"auto-csr-approver-29563998-pgwpx\" (UID: \"72422f42-741a-450d-bf90-f9d8b262add3\") " pod="openshift-infra/auto-csr-approver-29563998-pgwpx"
Mar 18 13:18:00 crc kubenswrapper[4843]: I0318 13:18:00.390620 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2gj8\" (UniqueName: \"kubernetes.io/projected/72422f42-741a-450d-bf90-f9d8b262add3-kube-api-access-b2gj8\") pod \"auto-csr-approver-29563998-pgwpx\" (UID: \"72422f42-741a-450d-bf90-f9d8b262add3\") " pod="openshift-infra/auto-csr-approver-29563998-pgwpx"
Mar 18 13:18:00 crc kubenswrapper[4843]: I0318 13:18:00.489851 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563998-pgwpx"
Mar 18 13:18:00 crc kubenswrapper[4843]: I0318 13:18:00.942562 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563998-pgwpx"]
Mar 18 13:18:00 crc kubenswrapper[4843]: I0318 13:18:00.995528 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563998-pgwpx" event={"ID":"72422f42-741a-450d-bf90-f9d8b262add3","Type":"ContainerStarted","Data":"bc07a1e5cc4a2433d3ee3458e5dc86535f6d50edea22b7aaeb48e0ecd10eea58"}
Mar 18 13:18:03 crc kubenswrapper[4843]: I0318 13:18:03.007845 4843 generic.go:334] "Generic (PLEG): container finished" podID="72422f42-741a-450d-bf90-f9d8b262add3" containerID="9062b74d897a53ed5c312b58f6619effe9768b4a756ac99948097595613ae473" exitCode=0
Mar 18 13:18:03 crc kubenswrapper[4843]: I0318 13:18:03.008387 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563998-pgwpx" event={"ID":"72422f42-741a-450d-bf90-f9d8b262add3","Type":"ContainerDied","Data":"9062b74d897a53ed5c312b58f6619effe9768b4a756ac99948097595613ae473"}
Mar 18 13:18:04 crc kubenswrapper[4843]: I0318 13:18:04.369823 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563998-pgwpx"
Mar 18 13:18:04 crc kubenswrapper[4843]: I0318 13:18:04.491685 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2gj8\" (UniqueName: \"kubernetes.io/projected/72422f42-741a-450d-bf90-f9d8b262add3-kube-api-access-b2gj8\") pod \"72422f42-741a-450d-bf90-f9d8b262add3\" (UID: \"72422f42-741a-450d-bf90-f9d8b262add3\") "
Mar 18 13:18:04 crc kubenswrapper[4843]: I0318 13:18:04.498360 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72422f42-741a-450d-bf90-f9d8b262add3-kube-api-access-b2gj8" (OuterVolumeSpecName: "kube-api-access-b2gj8") pod "72422f42-741a-450d-bf90-f9d8b262add3" (UID: "72422f42-741a-450d-bf90-f9d8b262add3"). InnerVolumeSpecName "kube-api-access-b2gj8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:18:04 crc kubenswrapper[4843]: I0318 13:18:04.594531 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2gj8\" (UniqueName: \"kubernetes.io/projected/72422f42-741a-450d-bf90-f9d8b262add3-kube-api-access-b2gj8\") on node \"crc\" DevicePath \"\""
Mar 18 13:18:05 crc kubenswrapper[4843]: I0318 13:18:05.034915 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563998-pgwpx" event={"ID":"72422f42-741a-450d-bf90-f9d8b262add3","Type":"ContainerDied","Data":"bc07a1e5cc4a2433d3ee3458e5dc86535f6d50edea22b7aaeb48e0ecd10eea58"}
Mar 18 13:18:05 crc kubenswrapper[4843]: I0318 13:18:05.035599 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc07a1e5cc4a2433d3ee3458e5dc86535f6d50edea22b7aaeb48e0ecd10eea58"
Mar 18 13:18:05 crc kubenswrapper[4843]: I0318 13:18:05.034986 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563998-pgwpx"
Mar 18 13:18:05 crc kubenswrapper[4843]: I0318 13:18:05.444103 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563992-hngbv"]
Mar 18 13:18:05 crc kubenswrapper[4843]: I0318 13:18:05.454002 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563992-hngbv"]
Mar 18 13:18:07 crc kubenswrapper[4843]: I0318 13:18:07.003046 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa83801f-d1ab-40f7-8ad2-4251ea55c894" path="/var/lib/kubelet/pods/fa83801f-d1ab-40f7-8ad2-4251ea55c894/volumes"
Mar 18 13:18:07 crc kubenswrapper[4843]: I0318 13:18:07.235711 4843 scope.go:117] "RemoveContainer" containerID="0475c827a42ca1b8d921d6fbff542530d1407459c7b57a19c839e5fd629e9362"
Mar 18 13:18:16 crc kubenswrapper[4843]: I0318 13:18:16.173194 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vf6dl"]
Mar 18 13:18:16 crc kubenswrapper[4843]: E0318 13:18:16.175032 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72422f42-741a-450d-bf90-f9d8b262add3" containerName="oc"
Mar 18 13:18:16 crc kubenswrapper[4843]: I0318 13:18:16.175068 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="72422f42-741a-450d-bf90-f9d8b262add3" containerName="oc"
Mar 18 13:18:16 crc kubenswrapper[4843]: I0318 13:18:16.182166 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="72422f42-741a-450d-bf90-f9d8b262add3" containerName="oc"
Mar 18 13:18:16 crc kubenswrapper[4843]: I0318 13:18:16.188313 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vf6dl"
Mar 18 13:18:16 crc kubenswrapper[4843]: I0318 13:18:16.193231 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vf6dl"]
Mar 18 13:18:16 crc kubenswrapper[4843]: I0318 13:18:16.220797 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjn82\" (UniqueName: \"kubernetes.io/projected/04e05e8d-4c50-4b1f-9a57-f9fd124a85cc-kube-api-access-bjn82\") pod \"redhat-operators-vf6dl\" (UID: \"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc\") " pod="openshift-marketplace/redhat-operators-vf6dl"
Mar 18 13:18:16 crc kubenswrapper[4843]: I0318 13:18:16.220965 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04e05e8d-4c50-4b1f-9a57-f9fd124a85cc-catalog-content\") pod \"redhat-operators-vf6dl\" (UID: \"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc\") " pod="openshift-marketplace/redhat-operators-vf6dl"
Mar 18 13:18:16 crc kubenswrapper[4843]: I0318 13:18:16.221007 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04e05e8d-4c50-4b1f-9a57-f9fd124a85cc-utilities\") pod \"redhat-operators-vf6dl\" (UID: \"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc\") " pod="openshift-marketplace/redhat-operators-vf6dl"
Mar 18 13:18:16 crc kubenswrapper[4843]: I0318 13:18:16.322448 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04e05e8d-4c50-4b1f-9a57-f9fd124a85cc-catalog-content\") pod \"redhat-operators-vf6dl\" (UID: \"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc\") " pod="openshift-marketplace/redhat-operators-vf6dl"
Mar 18 13:18:16 crc kubenswrapper[4843]: I0318 13:18:16.322769 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04e05e8d-4c50-4b1f-9a57-f9fd124a85cc-utilities\") pod \"redhat-operators-vf6dl\" (UID: \"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc\") " pod="openshift-marketplace/redhat-operators-vf6dl"
Mar 18 13:18:16 crc kubenswrapper[4843]: I0318 13:18:16.323037 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjn82\" (UniqueName: \"kubernetes.io/projected/04e05e8d-4c50-4b1f-9a57-f9fd124a85cc-kube-api-access-bjn82\") pod \"redhat-operators-vf6dl\" (UID: \"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc\") " pod="openshift-marketplace/redhat-operators-vf6dl"
Mar 18 13:18:16 crc kubenswrapper[4843]: I0318 13:18:16.323044 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04e05e8d-4c50-4b1f-9a57-f9fd124a85cc-catalog-content\") pod \"redhat-operators-vf6dl\" (UID: \"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc\") " pod="openshift-marketplace/redhat-operators-vf6dl"
Mar 18 13:18:16 crc kubenswrapper[4843]: I0318 13:18:16.323311 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04e05e8d-4c50-4b1f-9a57-f9fd124a85cc-utilities\") pod \"redhat-operators-vf6dl\" (UID: \"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc\") " pod="openshift-marketplace/redhat-operators-vf6dl"
Mar 18 13:18:16 crc kubenswrapper[4843]: I0318 13:18:16.348669 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjn82\" (UniqueName: \"kubernetes.io/projected/04e05e8d-4c50-4b1f-9a57-f9fd124a85cc-kube-api-access-bjn82\") pod \"redhat-operators-vf6dl\" (UID: \"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc\") " pod="openshift-marketplace/redhat-operators-vf6dl"
Mar 18 13:18:16 crc kubenswrapper[4843]: I0318 13:18:16.542516 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vf6dl"
Mar 18 13:18:17 crc kubenswrapper[4843]: I0318 13:18:17.046436 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vf6dl"]
Mar 18 13:18:17 crc kubenswrapper[4843]: I0318 13:18:17.136581 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vf6dl" event={"ID":"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc","Type":"ContainerStarted","Data":"ad9dbecdab80cd40455da4cd25e8a320bd979b6ed59ae97f6c40c8747be41d9b"}
Mar 18 13:18:18 crc kubenswrapper[4843]: I0318 13:18:18.280806 4843 generic.go:334] "Generic (PLEG): container finished" podID="04e05e8d-4c50-4b1f-9a57-f9fd124a85cc" containerID="412afe0e99bc8992a50974d48003409c6a59629ca99b6d095402cc6e0f342dfd" exitCode=0
Mar 18 13:18:18 crc kubenswrapper[4843]: I0318 13:18:18.280916 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vf6dl" event={"ID":"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc","Type":"ContainerDied","Data":"412afe0e99bc8992a50974d48003409c6a59629ca99b6d095402cc6e0f342dfd"}
Mar 18 13:18:20 crc kubenswrapper[4843]: I0318 13:18:20.303121 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vf6dl" event={"ID":"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc","Type":"ContainerStarted","Data":"cfa492b4f1fbe7b852e546660d424d2dd5a3cde80d16468dfe6bbb276191464d"}
Mar 18 13:18:21 crc kubenswrapper[4843]: I0318 13:18:21.313307 4843 generic.go:334] "Generic (PLEG): container finished" podID="04e05e8d-4c50-4b1f-9a57-f9fd124a85cc" containerID="cfa492b4f1fbe7b852e546660d424d2dd5a3cde80d16468dfe6bbb276191464d" exitCode=0
Mar 18 13:18:21 crc kubenswrapper[4843]: I0318 13:18:21.313823 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vf6dl" event={"ID":"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc","Type":"ContainerDied","Data":"cfa492b4f1fbe7b852e546660d424d2dd5a3cde80d16468dfe6bbb276191464d"}
Mar 18 13:18:22 crc kubenswrapper[4843]: I0318 13:18:22.325889 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vf6dl" event={"ID":"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc","Type":"ContainerStarted","Data":"5a82aaa9727999d73722a28845c863441d4b420834d8a49b7ec8f2f0a2c8aeb6"}
Mar 18 13:18:22 crc kubenswrapper[4843]: I0318 13:18:22.351576 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vf6dl" podStartSLOduration=2.768785293 podStartE2EDuration="6.351539788s" podCreationTimestamp="2026-03-18 13:18:16 +0000 UTC" firstStartedPulling="2026-03-18 13:18:18.283181376 +0000 UTC m=+4131.999006890" lastFinishedPulling="2026-03-18 13:18:21.865935861 +0000 UTC m=+4135.581761385" observedRunningTime="2026-03-18 13:18:22.350433026 +0000 UTC m=+4136.066258570" watchObservedRunningTime="2026-03-18 13:18:22.351539788 +0000 UTC m=+4136.067365312"
Mar 18 13:18:23 crc kubenswrapper[4843]: I0318 13:18:23.921070 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gs5jw"]
Mar 18 13:18:23 crc kubenswrapper[4843]: I0318 13:18:23.923865 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gs5jw"
Mar 18 13:18:23 crc kubenswrapper[4843]: I0318 13:18:23.943599 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gs5jw"]
Mar 18 13:18:24 crc kubenswrapper[4843]: I0318 13:18:24.075568 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lfqf\" (UniqueName: \"kubernetes.io/projected/b7ab9c9f-0dae-4891-b32a-a49d11901d59-kube-api-access-9lfqf\") pod \"certified-operators-gs5jw\" (UID: \"b7ab9c9f-0dae-4891-b32a-a49d11901d59\") " pod="openshift-marketplace/certified-operators-gs5jw"
Mar 18 13:18:24 crc kubenswrapper[4843]: I0318 13:18:24.076329 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7ab9c9f-0dae-4891-b32a-a49d11901d59-utilities\") pod \"certified-operators-gs5jw\" (UID: \"b7ab9c9f-0dae-4891-b32a-a49d11901d59\") " pod="openshift-marketplace/certified-operators-gs5jw"
Mar 18 13:18:24 crc kubenswrapper[4843]: I0318 13:18:24.076471 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7ab9c9f-0dae-4891-b32a-a49d11901d59-catalog-content\") pod \"certified-operators-gs5jw\" (UID: \"b7ab9c9f-0dae-4891-b32a-a49d11901d59\") " pod="openshift-marketplace/certified-operators-gs5jw"
Mar 18 13:18:24 crc kubenswrapper[4843]: I0318 13:18:24.182167 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7ab9c9f-0dae-4891-b32a-a49d11901d59-utilities\") pod \"certified-operators-gs5jw\" (UID: \"b7ab9c9f-0dae-4891-b32a-a49d11901d59\") " pod="openshift-marketplace/certified-operators-gs5jw"
Mar 18 13:18:24 crc kubenswrapper[4843]: I0318 13:18:24.182281 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7ab9c9f-0dae-4891-b32a-a49d11901d59-catalog-content\") pod \"certified-operators-gs5jw\" (UID: \"b7ab9c9f-0dae-4891-b32a-a49d11901d59\") " pod="openshift-marketplace/certified-operators-gs5jw"
Mar 18 13:18:24 crc kubenswrapper[4843]: I0318 13:18:24.182408 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lfqf\" (UniqueName: \"kubernetes.io/projected/b7ab9c9f-0dae-4891-b32a-a49d11901d59-kube-api-access-9lfqf\") pod \"certified-operators-gs5jw\" (UID: \"b7ab9c9f-0dae-4891-b32a-a49d11901d59\") " pod="openshift-marketplace/certified-operators-gs5jw"
Mar 18 13:18:24 crc kubenswrapper[4843]: I0318 13:18:24.183432 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7ab9c9f-0dae-4891-b32a-a49d11901d59-catalog-content\") pod \"certified-operators-gs5jw\" (UID: \"b7ab9c9f-0dae-4891-b32a-a49d11901d59\") " pod="openshift-marketplace/certified-operators-gs5jw"
Mar 18 13:18:24 crc kubenswrapper[4843]: I0318 13:18:24.183443 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7ab9c9f-0dae-4891-b32a-a49d11901d59-utilities\") pod \"certified-operators-gs5jw\" (UID: \"b7ab9c9f-0dae-4891-b32a-a49d11901d59\") " pod="openshift-marketplace/certified-operators-gs5jw"
Mar 18 13:18:24 crc kubenswrapper[4843]: I0318 13:18:24.208521 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lfqf\" (UniqueName: \"kubernetes.io/projected/b7ab9c9f-0dae-4891-b32a-a49d11901d59-kube-api-access-9lfqf\") pod \"certified-operators-gs5jw\" (UID: \"b7ab9c9f-0dae-4891-b32a-a49d11901d59\") " pod="openshift-marketplace/certified-operators-gs5jw"
Mar 18 13:18:24 crc kubenswrapper[4843]: I0318 13:18:24.249490 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gs5jw"
Mar 18 13:18:24 crc kubenswrapper[4843]: W0318 13:18:24.806832 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7ab9c9f_0dae_4891_b32a_a49d11901d59.slice/crio-d6aaf19742b0f7513fdbbb0e6a0b47f14a4682422d94239f9581cf5bbfb0d82b WatchSource:0}: Error finding container d6aaf19742b0f7513fdbbb0e6a0b47f14a4682422d94239f9581cf5bbfb0d82b: Status 404 returned error can't find the container with id d6aaf19742b0f7513fdbbb0e6a0b47f14a4682422d94239f9581cf5bbfb0d82b
Mar 18 13:18:24 crc kubenswrapper[4843]: I0318 13:18:24.807543 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gs5jw"]
Mar 18 13:18:25 crc kubenswrapper[4843]: I0318 13:18:25.355023 4843 generic.go:334] "Generic (PLEG): container finished" podID="b7ab9c9f-0dae-4891-b32a-a49d11901d59" containerID="768af08a5d1467e664225bcf0746196d4323d4bb25a97d5b3e08a965c249b2b4" exitCode=0
Mar 18 13:18:25 crc kubenswrapper[4843]: I0318 13:18:25.355092 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gs5jw" event={"ID":"b7ab9c9f-0dae-4891-b32a-a49d11901d59","Type":"ContainerDied","Data":"768af08a5d1467e664225bcf0746196d4323d4bb25a97d5b3e08a965c249b2b4"}
Mar 18 13:18:25 crc kubenswrapper[4843]: I0318 13:18:25.355137 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gs5jw" event={"ID":"b7ab9c9f-0dae-4891-b32a-a49d11901d59","Type":"ContainerStarted","Data":"d6aaf19742b0f7513fdbbb0e6a0b47f14a4682422d94239f9581cf5bbfb0d82b"}
Mar 18 13:18:26 crc kubenswrapper[4843]: I0318 13:18:26.402841 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gs5jw" event={"ID":"b7ab9c9f-0dae-4891-b32a-a49d11901d59","Type":"ContainerStarted","Data":"83cc3fd095f28ef426e870b8155092628eb084ef4ac2a0baf37a8dd057ecb9ac"}
Mar 18 13:18:26 crc kubenswrapper[4843]: I0318 13:18:26.543790 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vf6dl"
Mar 18 13:18:26 crc kubenswrapper[4843]: I0318 13:18:26.544835 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vf6dl"
Mar 18 13:18:27 crc kubenswrapper[4843]: I0318 13:18:27.414491 4843 generic.go:334] "Generic (PLEG): container finished" podID="b7ab9c9f-0dae-4891-b32a-a49d11901d59" containerID="83cc3fd095f28ef426e870b8155092628eb084ef4ac2a0baf37a8dd057ecb9ac" exitCode=0
Mar 18 13:18:27 crc kubenswrapper[4843]: I0318 13:18:27.414553 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gs5jw" event={"ID":"b7ab9c9f-0dae-4891-b32a-a49d11901d59","Type":"ContainerDied","Data":"83cc3fd095f28ef426e870b8155092628eb084ef4ac2a0baf37a8dd057ecb9ac"}
Mar 18 13:18:27 crc kubenswrapper[4843]: I0318 13:18:27.593842 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vf6dl" podUID="04e05e8d-4c50-4b1f-9a57-f9fd124a85cc" containerName="registry-server" probeResult="failure" output=<
Mar 18 13:18:27 crc kubenswrapper[4843]: timeout: failed to connect service ":50051" within 1s
Mar 18 13:18:27 crc kubenswrapper[4843]: >
Mar 18 13:18:28 crc kubenswrapper[4843]: I0318 13:18:28.433554 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gs5jw" event={"ID":"b7ab9c9f-0dae-4891-b32a-a49d11901d59","Type":"ContainerStarted","Data":"1d720e1302e764ebf663f005bed10cb883dafbade3dda76b5abca92872632e34"}
Mar 18 13:18:28 crc kubenswrapper[4843]: I0318 13:18:28.456332 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-marketplace/certified-operators-gs5jw" podStartSLOduration=3.027460744 podStartE2EDuration="5.456308228s" podCreationTimestamp="2026-03-18 13:18:23 +0000 UTC" firstStartedPulling="2026-03-18 13:18:25.357468132 +0000 UTC m=+4139.073293666" lastFinishedPulling="2026-03-18 13:18:27.786315626 +0000 UTC m=+4141.502141150" observedRunningTime="2026-03-18 13:18:28.452645274 +0000 UTC m=+4142.168470798" watchObservedRunningTime="2026-03-18 13:18:28.456308228 +0000 UTC m=+4142.172133752" Mar 18 13:18:34 crc kubenswrapper[4843]: I0318 13:18:34.250144 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gs5jw" Mar 18 13:18:34 crc kubenswrapper[4843]: I0318 13:18:34.250513 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gs5jw" Mar 18 13:18:34 crc kubenswrapper[4843]: I0318 13:18:34.301317 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gs5jw" Mar 18 13:18:34 crc kubenswrapper[4843]: I0318 13:18:34.590960 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gs5jw" Mar 18 13:18:35 crc kubenswrapper[4843]: I0318 13:18:35.905860 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gs5jw"] Mar 18 13:18:36 crc kubenswrapper[4843]: I0318 13:18:36.563858 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gs5jw" podUID="b7ab9c9f-0dae-4891-b32a-a49d11901d59" containerName="registry-server" containerID="cri-o://1d720e1302e764ebf663f005bed10cb883dafbade3dda76b5abca92872632e34" gracePeriod=2 Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.119936 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gs5jw" Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.225850 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lfqf\" (UniqueName: \"kubernetes.io/projected/b7ab9c9f-0dae-4891-b32a-a49d11901d59-kube-api-access-9lfqf\") pod \"b7ab9c9f-0dae-4891-b32a-a49d11901d59\" (UID: \"b7ab9c9f-0dae-4891-b32a-a49d11901d59\") " Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.225935 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7ab9c9f-0dae-4891-b32a-a49d11901d59-utilities\") pod \"b7ab9c9f-0dae-4891-b32a-a49d11901d59\" (UID: \"b7ab9c9f-0dae-4891-b32a-a49d11901d59\") " Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.226165 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7ab9c9f-0dae-4891-b32a-a49d11901d59-catalog-content\") pod \"b7ab9c9f-0dae-4891-b32a-a49d11901d59\" (UID: \"b7ab9c9f-0dae-4891-b32a-a49d11901d59\") " Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.226792 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7ab9c9f-0dae-4891-b32a-a49d11901d59-utilities" (OuterVolumeSpecName: "utilities") pod "b7ab9c9f-0dae-4891-b32a-a49d11901d59" (UID: "b7ab9c9f-0dae-4891-b32a-a49d11901d59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.232604 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7ab9c9f-0dae-4891-b32a-a49d11901d59-kube-api-access-9lfqf" (OuterVolumeSpecName: "kube-api-access-9lfqf") pod "b7ab9c9f-0dae-4891-b32a-a49d11901d59" (UID: "b7ab9c9f-0dae-4891-b32a-a49d11901d59"). InnerVolumeSpecName "kube-api-access-9lfqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.274191 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7ab9c9f-0dae-4891-b32a-a49d11901d59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7ab9c9f-0dae-4891-b32a-a49d11901d59" (UID: "b7ab9c9f-0dae-4891-b32a-a49d11901d59"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.328747 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7ab9c9f-0dae-4891-b32a-a49d11901d59-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.328791 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lfqf\" (UniqueName: \"kubernetes.io/projected/b7ab9c9f-0dae-4891-b32a-a49d11901d59-kube-api-access-9lfqf\") on node \"crc\" DevicePath \"\"" Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.328805 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7ab9c9f-0dae-4891-b32a-a49d11901d59-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.577512 4843 generic.go:334] "Generic (PLEG): container finished" podID="b7ab9c9f-0dae-4891-b32a-a49d11901d59" containerID="1d720e1302e764ebf663f005bed10cb883dafbade3dda76b5abca92872632e34" exitCode=0 Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.577577 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gs5jw" Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.577594 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gs5jw" event={"ID":"b7ab9c9f-0dae-4891-b32a-a49d11901d59","Type":"ContainerDied","Data":"1d720e1302e764ebf663f005bed10cb883dafbade3dda76b5abca92872632e34"} Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.577924 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gs5jw" event={"ID":"b7ab9c9f-0dae-4891-b32a-a49d11901d59","Type":"ContainerDied","Data":"d6aaf19742b0f7513fdbbb0e6a0b47f14a4682422d94239f9581cf5bbfb0d82b"} Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.577947 4843 scope.go:117] "RemoveContainer" containerID="1d720e1302e764ebf663f005bed10cb883dafbade3dda76b5abca92872632e34" Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.610773 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vf6dl" podUID="04e05e8d-4c50-4b1f-9a57-f9fd124a85cc" containerName="registry-server" probeResult="failure" output=< Mar 18 13:18:37 crc kubenswrapper[4843]: timeout: failed to connect service ":50051" within 1s Mar 18 13:18:37 crc kubenswrapper[4843]: > Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.613608 4843 scope.go:117] "RemoveContainer" containerID="83cc3fd095f28ef426e870b8155092628eb084ef4ac2a0baf37a8dd057ecb9ac" Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.620414 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gs5jw"] Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.629168 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gs5jw"] Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.637069 4843 scope.go:117] "RemoveContainer" 
containerID="768af08a5d1467e664225bcf0746196d4323d4bb25a97d5b3e08a965c249b2b4" Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.972950 4843 scope.go:117] "RemoveContainer" containerID="1d720e1302e764ebf663f005bed10cb883dafbade3dda76b5abca92872632e34" Mar 18 13:18:37 crc kubenswrapper[4843]: E0318 13:18:37.974141 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d720e1302e764ebf663f005bed10cb883dafbade3dda76b5abca92872632e34\": container with ID starting with 1d720e1302e764ebf663f005bed10cb883dafbade3dda76b5abca92872632e34 not found: ID does not exist" containerID="1d720e1302e764ebf663f005bed10cb883dafbade3dda76b5abca92872632e34" Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.974186 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d720e1302e764ebf663f005bed10cb883dafbade3dda76b5abca92872632e34"} err="failed to get container status \"1d720e1302e764ebf663f005bed10cb883dafbade3dda76b5abca92872632e34\": rpc error: code = NotFound desc = could not find container \"1d720e1302e764ebf663f005bed10cb883dafbade3dda76b5abca92872632e34\": container with ID starting with 1d720e1302e764ebf663f005bed10cb883dafbade3dda76b5abca92872632e34 not found: ID does not exist" Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.974215 4843 scope.go:117] "RemoveContainer" containerID="83cc3fd095f28ef426e870b8155092628eb084ef4ac2a0baf37a8dd057ecb9ac" Mar 18 13:18:37 crc kubenswrapper[4843]: E0318 13:18:37.975025 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83cc3fd095f28ef426e870b8155092628eb084ef4ac2a0baf37a8dd057ecb9ac\": container with ID starting with 83cc3fd095f28ef426e870b8155092628eb084ef4ac2a0baf37a8dd057ecb9ac not found: ID does not exist" containerID="83cc3fd095f28ef426e870b8155092628eb084ef4ac2a0baf37a8dd057ecb9ac" Mar 18 13:18:37 crc 
kubenswrapper[4843]: I0318 13:18:37.975066 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83cc3fd095f28ef426e870b8155092628eb084ef4ac2a0baf37a8dd057ecb9ac"} err="failed to get container status \"83cc3fd095f28ef426e870b8155092628eb084ef4ac2a0baf37a8dd057ecb9ac\": rpc error: code = NotFound desc = could not find container \"83cc3fd095f28ef426e870b8155092628eb084ef4ac2a0baf37a8dd057ecb9ac\": container with ID starting with 83cc3fd095f28ef426e870b8155092628eb084ef4ac2a0baf37a8dd057ecb9ac not found: ID does not exist" Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.975092 4843 scope.go:117] "RemoveContainer" containerID="768af08a5d1467e664225bcf0746196d4323d4bb25a97d5b3e08a965c249b2b4" Mar 18 13:18:37 crc kubenswrapper[4843]: E0318 13:18:37.975936 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"768af08a5d1467e664225bcf0746196d4323d4bb25a97d5b3e08a965c249b2b4\": container with ID starting with 768af08a5d1467e664225bcf0746196d4323d4bb25a97d5b3e08a965c249b2b4 not found: ID does not exist" containerID="768af08a5d1467e664225bcf0746196d4323d4bb25a97d5b3e08a965c249b2b4" Mar 18 13:18:37 crc kubenswrapper[4843]: I0318 13:18:37.976009 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"768af08a5d1467e664225bcf0746196d4323d4bb25a97d5b3e08a965c249b2b4"} err="failed to get container status \"768af08a5d1467e664225bcf0746196d4323d4bb25a97d5b3e08a965c249b2b4\": rpc error: code = NotFound desc = could not find container \"768af08a5d1467e664225bcf0746196d4323d4bb25a97d5b3e08a965c249b2b4\": container with ID starting with 768af08a5d1467e664225bcf0746196d4323d4bb25a97d5b3e08a965c249b2b4 not found: ID does not exist" Mar 18 13:18:39 crc kubenswrapper[4843]: I0318 13:18:39.025801 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7ab9c9f-0dae-4891-b32a-a49d11901d59" 
path="/var/lib/kubelet/pods/b7ab9c9f-0dae-4891-b32a-a49d11901d59/volumes" Mar 18 13:18:47 crc kubenswrapper[4843]: I0318 13:18:47.588558 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vf6dl" podUID="04e05e8d-4c50-4b1f-9a57-f9fd124a85cc" containerName="registry-server" probeResult="failure" output=< Mar 18 13:18:47 crc kubenswrapper[4843]: timeout: failed to connect service ":50051" within 1s Mar 18 13:18:47 crc kubenswrapper[4843]: > Mar 18 13:18:56 crc kubenswrapper[4843]: I0318 13:18:56.676837 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vf6dl" Mar 18 13:18:56 crc kubenswrapper[4843]: I0318 13:18:56.734023 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vf6dl" Mar 18 13:18:56 crc kubenswrapper[4843]: I0318 13:18:56.919614 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vf6dl"] Mar 18 13:18:57 crc kubenswrapper[4843]: I0318 13:18:57.900734 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vf6dl" podUID="04e05e8d-4c50-4b1f-9a57-f9fd124a85cc" containerName="registry-server" containerID="cri-o://5a82aaa9727999d73722a28845c863441d4b420834d8a49b7ec8f2f0a2c8aeb6" gracePeriod=2 Mar 18 13:18:58 crc kubenswrapper[4843]: I0318 13:18:58.530133 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vf6dl" Mar 18 13:18:58 crc kubenswrapper[4843]: I0318 13:18:58.661033 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjn82\" (UniqueName: \"kubernetes.io/projected/04e05e8d-4c50-4b1f-9a57-f9fd124a85cc-kube-api-access-bjn82\") pod \"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc\" (UID: \"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc\") " Mar 18 13:18:58 crc kubenswrapper[4843]: I0318 13:18:58.661164 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04e05e8d-4c50-4b1f-9a57-f9fd124a85cc-utilities\") pod \"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc\" (UID: \"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc\") " Mar 18 13:18:58 crc kubenswrapper[4843]: I0318 13:18:58.661395 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04e05e8d-4c50-4b1f-9a57-f9fd124a85cc-catalog-content\") pod \"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc\" (UID: \"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc\") " Mar 18 13:18:58 crc kubenswrapper[4843]: I0318 13:18:58.662206 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04e05e8d-4c50-4b1f-9a57-f9fd124a85cc-utilities" (OuterVolumeSpecName: "utilities") pod "04e05e8d-4c50-4b1f-9a57-f9fd124a85cc" (UID: "04e05e8d-4c50-4b1f-9a57-f9fd124a85cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:18:58 crc kubenswrapper[4843]: I0318 13:18:58.670542 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04e05e8d-4c50-4b1f-9a57-f9fd124a85cc-kube-api-access-bjn82" (OuterVolumeSpecName: "kube-api-access-bjn82") pod "04e05e8d-4c50-4b1f-9a57-f9fd124a85cc" (UID: "04e05e8d-4c50-4b1f-9a57-f9fd124a85cc"). InnerVolumeSpecName "kube-api-access-bjn82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:18:58 crc kubenswrapper[4843]: I0318 13:18:58.764220 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjn82\" (UniqueName: \"kubernetes.io/projected/04e05e8d-4c50-4b1f-9a57-f9fd124a85cc-kube-api-access-bjn82\") on node \"crc\" DevicePath \"\"" Mar 18 13:18:58 crc kubenswrapper[4843]: I0318 13:18:58.764481 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04e05e8d-4c50-4b1f-9a57-f9fd124a85cc-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:18:58 crc kubenswrapper[4843]: I0318 13:18:58.784192 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04e05e8d-4c50-4b1f-9a57-f9fd124a85cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04e05e8d-4c50-4b1f-9a57-f9fd124a85cc" (UID: "04e05e8d-4c50-4b1f-9a57-f9fd124a85cc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:18:58 crc kubenswrapper[4843]: I0318 13:18:58.866815 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04e05e8d-4c50-4b1f-9a57-f9fd124a85cc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:18:58 crc kubenswrapper[4843]: I0318 13:18:58.915749 4843 generic.go:334] "Generic (PLEG): container finished" podID="04e05e8d-4c50-4b1f-9a57-f9fd124a85cc" containerID="5a82aaa9727999d73722a28845c863441d4b420834d8a49b7ec8f2f0a2c8aeb6" exitCode=0 Mar 18 13:18:58 crc kubenswrapper[4843]: I0318 13:18:58.915808 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vf6dl" event={"ID":"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc","Type":"ContainerDied","Data":"5a82aaa9727999d73722a28845c863441d4b420834d8a49b7ec8f2f0a2c8aeb6"} Mar 18 13:18:58 crc kubenswrapper[4843]: I0318 13:18:58.915855 4843 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-vf6dl" event={"ID":"04e05e8d-4c50-4b1f-9a57-f9fd124a85cc","Type":"ContainerDied","Data":"ad9dbecdab80cd40455da4cd25e8a320bd979b6ed59ae97f6c40c8747be41d9b"} Mar 18 13:18:58 crc kubenswrapper[4843]: I0318 13:18:58.915933 4843 scope.go:117] "RemoveContainer" containerID="5a82aaa9727999d73722a28845c863441d4b420834d8a49b7ec8f2f0a2c8aeb6" Mar 18 13:18:58 crc kubenswrapper[4843]: I0318 13:18:58.916004 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vf6dl" Mar 18 13:18:58 crc kubenswrapper[4843]: I0318 13:18:58.948003 4843 scope.go:117] "RemoveContainer" containerID="cfa492b4f1fbe7b852e546660d424d2dd5a3cde80d16468dfe6bbb276191464d" Mar 18 13:18:58 crc kubenswrapper[4843]: I0318 13:18:58.966418 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vf6dl"] Mar 18 13:18:58 crc kubenswrapper[4843]: I0318 13:18:58.977813 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vf6dl"] Mar 18 13:18:58 crc kubenswrapper[4843]: I0318 13:18:58.980317 4843 scope.go:117] "RemoveContainer" containerID="412afe0e99bc8992a50974d48003409c6a59629ca99b6d095402cc6e0f342dfd" Mar 18 13:18:59 crc kubenswrapper[4843]: I0318 13:18:59.002415 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04e05e8d-4c50-4b1f-9a57-f9fd124a85cc" path="/var/lib/kubelet/pods/04e05e8d-4c50-4b1f-9a57-f9fd124a85cc/volumes" Mar 18 13:18:59 crc kubenswrapper[4843]: I0318 13:18:59.035398 4843 scope.go:117] "RemoveContainer" containerID="5a82aaa9727999d73722a28845c863441d4b420834d8a49b7ec8f2f0a2c8aeb6" Mar 18 13:18:59 crc kubenswrapper[4843]: E0318 13:18:59.037042 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a82aaa9727999d73722a28845c863441d4b420834d8a49b7ec8f2f0a2c8aeb6\": container with ID starting with 
5a82aaa9727999d73722a28845c863441d4b420834d8a49b7ec8f2f0a2c8aeb6 not found: ID does not exist" containerID="5a82aaa9727999d73722a28845c863441d4b420834d8a49b7ec8f2f0a2c8aeb6" Mar 18 13:18:59 crc kubenswrapper[4843]: I0318 13:18:59.037100 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a82aaa9727999d73722a28845c863441d4b420834d8a49b7ec8f2f0a2c8aeb6"} err="failed to get container status \"5a82aaa9727999d73722a28845c863441d4b420834d8a49b7ec8f2f0a2c8aeb6\": rpc error: code = NotFound desc = could not find container \"5a82aaa9727999d73722a28845c863441d4b420834d8a49b7ec8f2f0a2c8aeb6\": container with ID starting with 5a82aaa9727999d73722a28845c863441d4b420834d8a49b7ec8f2f0a2c8aeb6 not found: ID does not exist" Mar 18 13:18:59 crc kubenswrapper[4843]: I0318 13:18:59.037134 4843 scope.go:117] "RemoveContainer" containerID="cfa492b4f1fbe7b852e546660d424d2dd5a3cde80d16468dfe6bbb276191464d" Mar 18 13:18:59 crc kubenswrapper[4843]: E0318 13:18:59.037734 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfa492b4f1fbe7b852e546660d424d2dd5a3cde80d16468dfe6bbb276191464d\": container with ID starting with cfa492b4f1fbe7b852e546660d424d2dd5a3cde80d16468dfe6bbb276191464d not found: ID does not exist" containerID="cfa492b4f1fbe7b852e546660d424d2dd5a3cde80d16468dfe6bbb276191464d" Mar 18 13:18:59 crc kubenswrapper[4843]: I0318 13:18:59.037789 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa492b4f1fbe7b852e546660d424d2dd5a3cde80d16468dfe6bbb276191464d"} err="failed to get container status \"cfa492b4f1fbe7b852e546660d424d2dd5a3cde80d16468dfe6bbb276191464d\": rpc error: code = NotFound desc = could not find container \"cfa492b4f1fbe7b852e546660d424d2dd5a3cde80d16468dfe6bbb276191464d\": container with ID starting with cfa492b4f1fbe7b852e546660d424d2dd5a3cde80d16468dfe6bbb276191464d not found: ID does not 
exist" Mar 18 13:18:59 crc kubenswrapper[4843]: I0318 13:18:59.037818 4843 scope.go:117] "RemoveContainer" containerID="412afe0e99bc8992a50974d48003409c6a59629ca99b6d095402cc6e0f342dfd" Mar 18 13:18:59 crc kubenswrapper[4843]: E0318 13:18:59.038215 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"412afe0e99bc8992a50974d48003409c6a59629ca99b6d095402cc6e0f342dfd\": container with ID starting with 412afe0e99bc8992a50974d48003409c6a59629ca99b6d095402cc6e0f342dfd not found: ID does not exist" containerID="412afe0e99bc8992a50974d48003409c6a59629ca99b6d095402cc6e0f342dfd" Mar 18 13:18:59 crc kubenswrapper[4843]: I0318 13:18:59.038255 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"412afe0e99bc8992a50974d48003409c6a59629ca99b6d095402cc6e0f342dfd"} err="failed to get container status \"412afe0e99bc8992a50974d48003409c6a59629ca99b6d095402cc6e0f342dfd\": rpc error: code = NotFound desc = could not find container \"412afe0e99bc8992a50974d48003409c6a59629ca99b6d095402cc6e0f342dfd\": container with ID starting with 412afe0e99bc8992a50974d48003409c6a59629ca99b6d095402cc6e0f342dfd not found: ID does not exist" Mar 18 13:19:20 crc kubenswrapper[4843]: I0318 13:19:20.034600 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:19:20 crc kubenswrapper[4843]: I0318 13:19:20.035235 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 
13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.275082 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jg5qg"] Mar 18 13:19:30 crc kubenswrapper[4843]: E0318 13:19:30.276729 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7ab9c9f-0dae-4891-b32a-a49d11901d59" containerName="extract-content" Mar 18 13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.276751 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7ab9c9f-0dae-4891-b32a-a49d11901d59" containerName="extract-content" Mar 18 13:19:30 crc kubenswrapper[4843]: E0318 13:19:30.276779 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e05e8d-4c50-4b1f-9a57-f9fd124a85cc" containerName="extract-utilities" Mar 18 13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.276790 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e05e8d-4c50-4b1f-9a57-f9fd124a85cc" containerName="extract-utilities" Mar 18 13:19:30 crc kubenswrapper[4843]: E0318 13:19:30.276804 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7ab9c9f-0dae-4891-b32a-a49d11901d59" containerName="registry-server" Mar 18 13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.276814 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7ab9c9f-0dae-4891-b32a-a49d11901d59" containerName="registry-server" Mar 18 13:19:30 crc kubenswrapper[4843]: E0318 13:19:30.276836 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e05e8d-4c50-4b1f-9a57-f9fd124a85cc" containerName="extract-content" Mar 18 13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.276843 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e05e8d-4c50-4b1f-9a57-f9fd124a85cc" containerName="extract-content" Mar 18 13:19:30 crc kubenswrapper[4843]: E0318 13:19:30.276872 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e05e8d-4c50-4b1f-9a57-f9fd124a85cc" containerName="registry-server" Mar 18 13:19:30 crc kubenswrapper[4843]: 
I0318 13:19:30.276878 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e05e8d-4c50-4b1f-9a57-f9fd124a85cc" containerName="registry-server" Mar 18 13:19:30 crc kubenswrapper[4843]: E0318 13:19:30.276888 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7ab9c9f-0dae-4891-b32a-a49d11901d59" containerName="extract-utilities" Mar 18 13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.276895 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7ab9c9f-0dae-4891-b32a-a49d11901d59" containerName="extract-utilities" Mar 18 13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.277136 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7ab9c9f-0dae-4891-b32a-a49d11901d59" containerName="registry-server" Mar 18 13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.277156 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="04e05e8d-4c50-4b1f-9a57-f9fd124a85cc" containerName="registry-server" Mar 18 13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.279321 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jg5qg" Mar 18 13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.297753 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jg5qg"] Mar 18 13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.390636 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dckg9\" (UniqueName: \"kubernetes.io/projected/129f853c-8e9a-4cfa-8081-3212b1d0c9d5-kube-api-access-dckg9\") pod \"redhat-marketplace-jg5qg\" (UID: \"129f853c-8e9a-4cfa-8081-3212b1d0c9d5\") " pod="openshift-marketplace/redhat-marketplace-jg5qg" Mar 18 13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.390756 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/129f853c-8e9a-4cfa-8081-3212b1d0c9d5-catalog-content\") pod \"redhat-marketplace-jg5qg\" (UID: \"129f853c-8e9a-4cfa-8081-3212b1d0c9d5\") " pod="openshift-marketplace/redhat-marketplace-jg5qg" Mar 18 13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.390808 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/129f853c-8e9a-4cfa-8081-3212b1d0c9d5-utilities\") pod \"redhat-marketplace-jg5qg\" (UID: \"129f853c-8e9a-4cfa-8081-3212b1d0c9d5\") " pod="openshift-marketplace/redhat-marketplace-jg5qg" Mar 18 13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.498762 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/129f853c-8e9a-4cfa-8081-3212b1d0c9d5-catalog-content\") pod \"redhat-marketplace-jg5qg\" (UID: \"129f853c-8e9a-4cfa-8081-3212b1d0c9d5\") " pod="openshift-marketplace/redhat-marketplace-jg5qg" Mar 18 13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.498822 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/129f853c-8e9a-4cfa-8081-3212b1d0c9d5-utilities\") pod \"redhat-marketplace-jg5qg\" (UID: \"129f853c-8e9a-4cfa-8081-3212b1d0c9d5\") " pod="openshift-marketplace/redhat-marketplace-jg5qg" Mar 18 13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.498961 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dckg9\" (UniqueName: \"kubernetes.io/projected/129f853c-8e9a-4cfa-8081-3212b1d0c9d5-kube-api-access-dckg9\") pod \"redhat-marketplace-jg5qg\" (UID: \"129f853c-8e9a-4cfa-8081-3212b1d0c9d5\") " pod="openshift-marketplace/redhat-marketplace-jg5qg" Mar 18 13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.499353 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/129f853c-8e9a-4cfa-8081-3212b1d0c9d5-catalog-content\") pod \"redhat-marketplace-jg5qg\" (UID: \"129f853c-8e9a-4cfa-8081-3212b1d0c9d5\") " pod="openshift-marketplace/redhat-marketplace-jg5qg" Mar 18 13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.499464 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/129f853c-8e9a-4cfa-8081-3212b1d0c9d5-utilities\") pod \"redhat-marketplace-jg5qg\" (UID: \"129f853c-8e9a-4cfa-8081-3212b1d0c9d5\") " pod="openshift-marketplace/redhat-marketplace-jg5qg" Mar 18 13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.703746 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dckg9\" (UniqueName: \"kubernetes.io/projected/129f853c-8e9a-4cfa-8081-3212b1d0c9d5-kube-api-access-dckg9\") pod \"redhat-marketplace-jg5qg\" (UID: \"129f853c-8e9a-4cfa-8081-3212b1d0c9d5\") " pod="openshift-marketplace/redhat-marketplace-jg5qg" Mar 18 13:19:30 crc kubenswrapper[4843]: I0318 13:19:30.904108 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jg5qg" Mar 18 13:19:31 crc kubenswrapper[4843]: I0318 13:19:31.582060 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jg5qg"] Mar 18 13:19:32 crc kubenswrapper[4843]: I0318 13:19:32.267339 4843 generic.go:334] "Generic (PLEG): container finished" podID="129f853c-8e9a-4cfa-8081-3212b1d0c9d5" containerID="9eebc6c25f85a0bc37c4bf2bc4bd7d2c1bde29f288f103f31f39d92fe630c6e1" exitCode=0 Mar 18 13:19:32 crc kubenswrapper[4843]: I0318 13:19:32.267446 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jg5qg" event={"ID":"129f853c-8e9a-4cfa-8081-3212b1d0c9d5","Type":"ContainerDied","Data":"9eebc6c25f85a0bc37c4bf2bc4bd7d2c1bde29f288f103f31f39d92fe630c6e1"} Mar 18 13:19:32 crc kubenswrapper[4843]: I0318 13:19:32.267753 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jg5qg" event={"ID":"129f853c-8e9a-4cfa-8081-3212b1d0c9d5","Type":"ContainerStarted","Data":"bff08f306b59c95de40f303643391d119d4fbae761a705822907b47a588ef319"} Mar 18 13:19:32 crc kubenswrapper[4843]: I0318 13:19:32.270408 4843 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:19:34 crc kubenswrapper[4843]: I0318 13:19:34.287558 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jg5qg" event={"ID":"129f853c-8e9a-4cfa-8081-3212b1d0c9d5","Type":"ContainerStarted","Data":"cb0ab28e4a017ad75cbf2fb395724b0cefe2fd9aaaac3bc91a90b8cc6924da63"} Mar 18 13:19:35 crc kubenswrapper[4843]: I0318 13:19:35.299575 4843 generic.go:334] "Generic (PLEG): container finished" podID="129f853c-8e9a-4cfa-8081-3212b1d0c9d5" containerID="cb0ab28e4a017ad75cbf2fb395724b0cefe2fd9aaaac3bc91a90b8cc6924da63" exitCode=0 Mar 18 13:19:35 crc kubenswrapper[4843]: I0318 13:19:35.300305 4843 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-jg5qg" event={"ID":"129f853c-8e9a-4cfa-8081-3212b1d0c9d5","Type":"ContainerDied","Data":"cb0ab28e4a017ad75cbf2fb395724b0cefe2fd9aaaac3bc91a90b8cc6924da63"} Mar 18 13:19:36 crc kubenswrapper[4843]: I0318 13:19:36.310729 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jg5qg" event={"ID":"129f853c-8e9a-4cfa-8081-3212b1d0c9d5","Type":"ContainerStarted","Data":"eb5659028519b66bf6103d856e196d675e5457638d55e751ba0ae69fde9a9d6a"} Mar 18 13:19:36 crc kubenswrapper[4843]: I0318 13:19:36.337804 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jg5qg" podStartSLOduration=2.868337542 podStartE2EDuration="6.337746457s" podCreationTimestamp="2026-03-18 13:19:30 +0000 UTC" firstStartedPulling="2026-03-18 13:19:32.269926958 +0000 UTC m=+4205.985752482" lastFinishedPulling="2026-03-18 13:19:35.739335873 +0000 UTC m=+4209.455161397" observedRunningTime="2026-03-18 13:19:36.326846129 +0000 UTC m=+4210.042671663" watchObservedRunningTime="2026-03-18 13:19:36.337746457 +0000 UTC m=+4210.053571981" Mar 18 13:19:40 crc kubenswrapper[4843]: I0318 13:19:40.905198 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jg5qg" Mar 18 13:19:40 crc kubenswrapper[4843]: I0318 13:19:40.905556 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jg5qg" Mar 18 13:19:40 crc kubenswrapper[4843]: I0318 13:19:40.958324 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jg5qg" Mar 18 13:19:41 crc kubenswrapper[4843]: I0318 13:19:41.458645 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jg5qg" Mar 18 13:19:41 crc kubenswrapper[4843]: I0318 13:19:41.528927 4843 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jg5qg"] Mar 18 13:19:43 crc kubenswrapper[4843]: I0318 13:19:43.423109 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jg5qg" podUID="129f853c-8e9a-4cfa-8081-3212b1d0c9d5" containerName="registry-server" containerID="cri-o://eb5659028519b66bf6103d856e196d675e5457638d55e751ba0ae69fde9a9d6a" gracePeriod=2 Mar 18 13:19:43 crc kubenswrapper[4843]: I0318 13:19:43.922936 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jg5qg" Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.082480 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dckg9\" (UniqueName: \"kubernetes.io/projected/129f853c-8e9a-4cfa-8081-3212b1d0c9d5-kube-api-access-dckg9\") pod \"129f853c-8e9a-4cfa-8081-3212b1d0c9d5\" (UID: \"129f853c-8e9a-4cfa-8081-3212b1d0c9d5\") " Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.082635 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/129f853c-8e9a-4cfa-8081-3212b1d0c9d5-catalog-content\") pod \"129f853c-8e9a-4cfa-8081-3212b1d0c9d5\" (UID: \"129f853c-8e9a-4cfa-8081-3212b1d0c9d5\") " Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.082790 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/129f853c-8e9a-4cfa-8081-3212b1d0c9d5-utilities\") pod \"129f853c-8e9a-4cfa-8081-3212b1d0c9d5\" (UID: \"129f853c-8e9a-4cfa-8081-3212b1d0c9d5\") " Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.083771 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/129f853c-8e9a-4cfa-8081-3212b1d0c9d5-utilities" (OuterVolumeSpecName: "utilities") pod 
"129f853c-8e9a-4cfa-8081-3212b1d0c9d5" (UID: "129f853c-8e9a-4cfa-8081-3212b1d0c9d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.088969 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/129f853c-8e9a-4cfa-8081-3212b1d0c9d5-kube-api-access-dckg9" (OuterVolumeSpecName: "kube-api-access-dckg9") pod "129f853c-8e9a-4cfa-8081-3212b1d0c9d5" (UID: "129f853c-8e9a-4cfa-8081-3212b1d0c9d5"). InnerVolumeSpecName "kube-api-access-dckg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.110963 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/129f853c-8e9a-4cfa-8081-3212b1d0c9d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "129f853c-8e9a-4cfa-8081-3212b1d0c9d5" (UID: "129f853c-8e9a-4cfa-8081-3212b1d0c9d5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.184842 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dckg9\" (UniqueName: \"kubernetes.io/projected/129f853c-8e9a-4cfa-8081-3212b1d0c9d5-kube-api-access-dckg9\") on node \"crc\" DevicePath \"\"" Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.184890 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/129f853c-8e9a-4cfa-8081-3212b1d0c9d5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.184904 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/129f853c-8e9a-4cfa-8081-3212b1d0c9d5-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.434626 4843 generic.go:334] "Generic (PLEG): container finished" podID="129f853c-8e9a-4cfa-8081-3212b1d0c9d5" containerID="eb5659028519b66bf6103d856e196d675e5457638d55e751ba0ae69fde9a9d6a" exitCode=0 Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.434696 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jg5qg" event={"ID":"129f853c-8e9a-4cfa-8081-3212b1d0c9d5","Type":"ContainerDied","Data":"eb5659028519b66bf6103d856e196d675e5457638d55e751ba0ae69fde9a9d6a"} Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.434738 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jg5qg" Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.434765 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jg5qg" event={"ID":"129f853c-8e9a-4cfa-8081-3212b1d0c9d5","Type":"ContainerDied","Data":"bff08f306b59c95de40f303643391d119d4fbae761a705822907b47a588ef319"} Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.434797 4843 scope.go:117] "RemoveContainer" containerID="eb5659028519b66bf6103d856e196d675e5457638d55e751ba0ae69fde9a9d6a" Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.469315 4843 scope.go:117] "RemoveContainer" containerID="cb0ab28e4a017ad75cbf2fb395724b0cefe2fd9aaaac3bc91a90b8cc6924da63" Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.500886 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jg5qg"] Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.501884 4843 scope.go:117] "RemoveContainer" containerID="9eebc6c25f85a0bc37c4bf2bc4bd7d2c1bde29f288f103f31f39d92fe630c6e1" Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.509398 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jg5qg"] Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.557041 4843 scope.go:117] "RemoveContainer" containerID="eb5659028519b66bf6103d856e196d675e5457638d55e751ba0ae69fde9a9d6a" Mar 18 13:19:44 crc kubenswrapper[4843]: E0318 13:19:44.559003 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb5659028519b66bf6103d856e196d675e5457638d55e751ba0ae69fde9a9d6a\": container with ID starting with eb5659028519b66bf6103d856e196d675e5457638d55e751ba0ae69fde9a9d6a not found: ID does not exist" containerID="eb5659028519b66bf6103d856e196d675e5457638d55e751ba0ae69fde9a9d6a" Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.559063 4843 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb5659028519b66bf6103d856e196d675e5457638d55e751ba0ae69fde9a9d6a"} err="failed to get container status \"eb5659028519b66bf6103d856e196d675e5457638d55e751ba0ae69fde9a9d6a\": rpc error: code = NotFound desc = could not find container \"eb5659028519b66bf6103d856e196d675e5457638d55e751ba0ae69fde9a9d6a\": container with ID starting with eb5659028519b66bf6103d856e196d675e5457638d55e751ba0ae69fde9a9d6a not found: ID does not exist" Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.559093 4843 scope.go:117] "RemoveContainer" containerID="cb0ab28e4a017ad75cbf2fb395724b0cefe2fd9aaaac3bc91a90b8cc6924da63" Mar 18 13:19:44 crc kubenswrapper[4843]: E0318 13:19:44.559592 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb0ab28e4a017ad75cbf2fb395724b0cefe2fd9aaaac3bc91a90b8cc6924da63\": container with ID starting with cb0ab28e4a017ad75cbf2fb395724b0cefe2fd9aaaac3bc91a90b8cc6924da63 not found: ID does not exist" containerID="cb0ab28e4a017ad75cbf2fb395724b0cefe2fd9aaaac3bc91a90b8cc6924da63" Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.559817 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0ab28e4a017ad75cbf2fb395724b0cefe2fd9aaaac3bc91a90b8cc6924da63"} err="failed to get container status \"cb0ab28e4a017ad75cbf2fb395724b0cefe2fd9aaaac3bc91a90b8cc6924da63\": rpc error: code = NotFound desc = could not find container \"cb0ab28e4a017ad75cbf2fb395724b0cefe2fd9aaaac3bc91a90b8cc6924da63\": container with ID starting with cb0ab28e4a017ad75cbf2fb395724b0cefe2fd9aaaac3bc91a90b8cc6924da63 not found: ID does not exist" Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.559873 4843 scope.go:117] "RemoveContainer" containerID="9eebc6c25f85a0bc37c4bf2bc4bd7d2c1bde29f288f103f31f39d92fe630c6e1" Mar 18 13:19:44 crc kubenswrapper[4843]: E0318 
13:19:44.560614 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eebc6c25f85a0bc37c4bf2bc4bd7d2c1bde29f288f103f31f39d92fe630c6e1\": container with ID starting with 9eebc6c25f85a0bc37c4bf2bc4bd7d2c1bde29f288f103f31f39d92fe630c6e1 not found: ID does not exist" containerID="9eebc6c25f85a0bc37c4bf2bc4bd7d2c1bde29f288f103f31f39d92fe630c6e1" Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.560718 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eebc6c25f85a0bc37c4bf2bc4bd7d2c1bde29f288f103f31f39d92fe630c6e1"} err="failed to get container status \"9eebc6c25f85a0bc37c4bf2bc4bd7d2c1bde29f288f103f31f39d92fe630c6e1\": rpc error: code = NotFound desc = could not find container \"9eebc6c25f85a0bc37c4bf2bc4bd7d2c1bde29f288f103f31f39d92fe630c6e1\": container with ID starting with 9eebc6c25f85a0bc37c4bf2bc4bd7d2c1bde29f288f103f31f39d92fe630c6e1 not found: ID does not exist" Mar 18 13:19:44 crc kubenswrapper[4843]: I0318 13:19:44.995756 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="129f853c-8e9a-4cfa-8081-3212b1d0c9d5" path="/var/lib/kubelet/pods/129f853c-8e9a-4cfa-8081-3212b1d0c9d5/volumes" Mar 18 13:19:50 crc kubenswrapper[4843]: I0318 13:19:50.034820 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:19:50 crc kubenswrapper[4843]: I0318 13:19:50.035443 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 18 13:20:00 crc kubenswrapper[4843]: I0318 13:20:00.150437 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564000-dv7ht"] Mar 18 13:20:00 crc kubenswrapper[4843]: E0318 13:20:00.151523 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129f853c-8e9a-4cfa-8081-3212b1d0c9d5" containerName="extract-content" Mar 18 13:20:00 crc kubenswrapper[4843]: I0318 13:20:00.151539 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="129f853c-8e9a-4cfa-8081-3212b1d0c9d5" containerName="extract-content" Mar 18 13:20:00 crc kubenswrapper[4843]: E0318 13:20:00.151551 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129f853c-8e9a-4cfa-8081-3212b1d0c9d5" containerName="extract-utilities" Mar 18 13:20:00 crc kubenswrapper[4843]: I0318 13:20:00.151558 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="129f853c-8e9a-4cfa-8081-3212b1d0c9d5" containerName="extract-utilities" Mar 18 13:20:00 crc kubenswrapper[4843]: E0318 13:20:00.151576 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129f853c-8e9a-4cfa-8081-3212b1d0c9d5" containerName="registry-server" Mar 18 13:20:00 crc kubenswrapper[4843]: I0318 13:20:00.151582 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="129f853c-8e9a-4cfa-8081-3212b1d0c9d5" containerName="registry-server" Mar 18 13:20:00 crc kubenswrapper[4843]: I0318 13:20:00.151807 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="129f853c-8e9a-4cfa-8081-3212b1d0c9d5" containerName="registry-server" Mar 18 13:20:00 crc kubenswrapper[4843]: I0318 13:20:00.152539 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564000-dv7ht" Mar 18 13:20:00 crc kubenswrapper[4843]: I0318 13:20:00.155393 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:20:00 crc kubenswrapper[4843]: I0318 13:20:00.155566 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 13:20:00 crc kubenswrapper[4843]: I0318 13:20:00.155885 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:20:00 crc kubenswrapper[4843]: I0318 13:20:00.168266 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564000-dv7ht"] Mar 18 13:20:00 crc kubenswrapper[4843]: I0318 13:20:00.259907 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97g7j\" (UniqueName: \"kubernetes.io/projected/f3b556a4-8479-4cc6-bcaa-45e09a55684a-kube-api-access-97g7j\") pod \"auto-csr-approver-29564000-dv7ht\" (UID: \"f3b556a4-8479-4cc6-bcaa-45e09a55684a\") " pod="openshift-infra/auto-csr-approver-29564000-dv7ht" Mar 18 13:20:00 crc kubenswrapper[4843]: I0318 13:20:00.362428 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97g7j\" (UniqueName: \"kubernetes.io/projected/f3b556a4-8479-4cc6-bcaa-45e09a55684a-kube-api-access-97g7j\") pod \"auto-csr-approver-29564000-dv7ht\" (UID: \"f3b556a4-8479-4cc6-bcaa-45e09a55684a\") " pod="openshift-infra/auto-csr-approver-29564000-dv7ht" Mar 18 13:20:00 crc kubenswrapper[4843]: I0318 13:20:00.382863 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97g7j\" (UniqueName: \"kubernetes.io/projected/f3b556a4-8479-4cc6-bcaa-45e09a55684a-kube-api-access-97g7j\") pod \"auto-csr-approver-29564000-dv7ht\" (UID: \"f3b556a4-8479-4cc6-bcaa-45e09a55684a\") " 
pod="openshift-infra/auto-csr-approver-29564000-dv7ht" Mar 18 13:20:00 crc kubenswrapper[4843]: I0318 13:20:00.477435 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564000-dv7ht" Mar 18 13:20:00 crc kubenswrapper[4843]: I0318 13:20:00.939049 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564000-dv7ht"] Mar 18 13:20:01 crc kubenswrapper[4843]: I0318 13:20:01.607465 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564000-dv7ht" event={"ID":"f3b556a4-8479-4cc6-bcaa-45e09a55684a","Type":"ContainerStarted","Data":"51cfcc582ba0ae3e19cb1b5f0e27575f55a880c355bb35dbee4e0dd92d446561"} Mar 18 13:20:03 crc kubenswrapper[4843]: I0318 13:20:03.647217 4843 generic.go:334] "Generic (PLEG): container finished" podID="f3b556a4-8479-4cc6-bcaa-45e09a55684a" containerID="09e88ad04b5eeefabf359c3900ee5949c8e89644cce1e24a178d32a911d541cb" exitCode=0 Mar 18 13:20:03 crc kubenswrapper[4843]: I0318 13:20:03.647321 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564000-dv7ht" event={"ID":"f3b556a4-8479-4cc6-bcaa-45e09a55684a","Type":"ContainerDied","Data":"09e88ad04b5eeefabf359c3900ee5949c8e89644cce1e24a178d32a911d541cb"} Mar 18 13:20:05 crc kubenswrapper[4843]: I0318 13:20:05.029495 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564000-dv7ht" Mar 18 13:20:05 crc kubenswrapper[4843]: I0318 13:20:05.150892 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97g7j\" (UniqueName: \"kubernetes.io/projected/f3b556a4-8479-4cc6-bcaa-45e09a55684a-kube-api-access-97g7j\") pod \"f3b556a4-8479-4cc6-bcaa-45e09a55684a\" (UID: \"f3b556a4-8479-4cc6-bcaa-45e09a55684a\") " Mar 18 13:20:05 crc kubenswrapper[4843]: I0318 13:20:05.157544 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3b556a4-8479-4cc6-bcaa-45e09a55684a-kube-api-access-97g7j" (OuterVolumeSpecName: "kube-api-access-97g7j") pod "f3b556a4-8479-4cc6-bcaa-45e09a55684a" (UID: "f3b556a4-8479-4cc6-bcaa-45e09a55684a"). InnerVolumeSpecName "kube-api-access-97g7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:20:05 crc kubenswrapper[4843]: I0318 13:20:05.254015 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97g7j\" (UniqueName: \"kubernetes.io/projected/f3b556a4-8479-4cc6-bcaa-45e09a55684a-kube-api-access-97g7j\") on node \"crc\" DevicePath \"\"" Mar 18 13:20:05 crc kubenswrapper[4843]: I0318 13:20:05.666514 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564000-dv7ht" event={"ID":"f3b556a4-8479-4cc6-bcaa-45e09a55684a","Type":"ContainerDied","Data":"51cfcc582ba0ae3e19cb1b5f0e27575f55a880c355bb35dbee4e0dd92d446561"} Mar 18 13:20:05 crc kubenswrapper[4843]: I0318 13:20:05.666774 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51cfcc582ba0ae3e19cb1b5f0e27575f55a880c355bb35dbee4e0dd92d446561" Mar 18 13:20:05 crc kubenswrapper[4843]: I0318 13:20:05.666579 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564000-dv7ht" Mar 18 13:20:06 crc kubenswrapper[4843]: I0318 13:20:06.114924 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563994-rzz2k"] Mar 18 13:20:06 crc kubenswrapper[4843]: I0318 13:20:06.124267 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563994-rzz2k"] Mar 18 13:20:06 crc kubenswrapper[4843]: I0318 13:20:06.996346 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c47fc0c-e590-4f47-aaa2-bff437f222a6" path="/var/lib/kubelet/pods/1c47fc0c-e590-4f47-aaa2-bff437f222a6/volumes" Mar 18 13:20:07 crc kubenswrapper[4843]: I0318 13:20:07.363498 4843 scope.go:117] "RemoveContainer" containerID="837a8b3b533caa7b0aafbcb35b2e7b0248f451e8d7f0949e71e4b4dfc6442d35" Mar 18 13:20:20 crc kubenswrapper[4843]: I0318 13:20:20.034778 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:20:20 crc kubenswrapper[4843]: I0318 13:20:20.035337 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:20:20 crc kubenswrapper[4843]: I0318 13:20:20.035404 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 13:20:20 crc kubenswrapper[4843]: I0318 13:20:20.036398 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"7974653c49645239160ac43ca7d8c3d2bf1a27727cb40316647ec5569b7697ac"} pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:20:20 crc kubenswrapper[4843]: I0318 13:20:20.036521 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" containerID="cri-o://7974653c49645239160ac43ca7d8c3d2bf1a27727cb40316647ec5569b7697ac" gracePeriod=600 Mar 18 13:20:20 crc kubenswrapper[4843]: I0318 13:20:20.831044 4843 generic.go:334] "Generic (PLEG): container finished" podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="7974653c49645239160ac43ca7d8c3d2bf1a27727cb40316647ec5569b7697ac" exitCode=0 Mar 18 13:20:20 crc kubenswrapper[4843]: I0318 13:20:20.831117 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"7974653c49645239160ac43ca7d8c3d2bf1a27727cb40316647ec5569b7697ac"} Mar 18 13:20:20 crc kubenswrapper[4843]: I0318 13:20:20.832050 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0"} Mar 18 13:20:20 crc kubenswrapper[4843]: I0318 13:20:20.832183 4843 scope.go:117] "RemoveContainer" containerID="811c3406e9e48231233b37679e7e18ce956c658cc02d332f142f433878b9341a" Mar 18 13:21:35 crc kubenswrapper[4843]: I0318 13:21:35.455076 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l8p6k"] Mar 18 13:21:35 crc kubenswrapper[4843]: E0318 
13:21:35.457058 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3b556a4-8479-4cc6-bcaa-45e09a55684a" containerName="oc" Mar 18 13:21:35 crc kubenswrapper[4843]: I0318 13:21:35.457084 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3b556a4-8479-4cc6-bcaa-45e09a55684a" containerName="oc" Mar 18 13:21:35 crc kubenswrapper[4843]: I0318 13:21:35.457520 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3b556a4-8479-4cc6-bcaa-45e09a55684a" containerName="oc" Mar 18 13:21:35 crc kubenswrapper[4843]: I0318 13:21:35.460392 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l8p6k" Mar 18 13:21:35 crc kubenswrapper[4843]: I0318 13:21:35.471286 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l8p6k"] Mar 18 13:21:35 crc kubenswrapper[4843]: I0318 13:21:35.601971 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x99dx\" (UniqueName: \"kubernetes.io/projected/18454655-344b-443b-9ca2-3e2b402f2a00-kube-api-access-x99dx\") pod \"community-operators-l8p6k\" (UID: \"18454655-344b-443b-9ca2-3e2b402f2a00\") " pod="openshift-marketplace/community-operators-l8p6k" Mar 18 13:21:35 crc kubenswrapper[4843]: I0318 13:21:35.602386 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18454655-344b-443b-9ca2-3e2b402f2a00-catalog-content\") pod \"community-operators-l8p6k\" (UID: \"18454655-344b-443b-9ca2-3e2b402f2a00\") " pod="openshift-marketplace/community-operators-l8p6k" Mar 18 13:21:35 crc kubenswrapper[4843]: I0318 13:21:35.602468 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18454655-344b-443b-9ca2-3e2b402f2a00-utilities\") pod 
\"community-operators-l8p6k\" (UID: \"18454655-344b-443b-9ca2-3e2b402f2a00\") " pod="openshift-marketplace/community-operators-l8p6k" Mar 18 13:21:35 crc kubenswrapper[4843]: I0318 13:21:35.704089 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x99dx\" (UniqueName: \"kubernetes.io/projected/18454655-344b-443b-9ca2-3e2b402f2a00-kube-api-access-x99dx\") pod \"community-operators-l8p6k\" (UID: \"18454655-344b-443b-9ca2-3e2b402f2a00\") " pod="openshift-marketplace/community-operators-l8p6k" Mar 18 13:21:35 crc kubenswrapper[4843]: I0318 13:21:35.704193 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18454655-344b-443b-9ca2-3e2b402f2a00-catalog-content\") pod \"community-operators-l8p6k\" (UID: \"18454655-344b-443b-9ca2-3e2b402f2a00\") " pod="openshift-marketplace/community-operators-l8p6k" Mar 18 13:21:35 crc kubenswrapper[4843]: I0318 13:21:35.704223 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18454655-344b-443b-9ca2-3e2b402f2a00-utilities\") pod \"community-operators-l8p6k\" (UID: \"18454655-344b-443b-9ca2-3e2b402f2a00\") " pod="openshift-marketplace/community-operators-l8p6k" Mar 18 13:21:35 crc kubenswrapper[4843]: I0318 13:21:35.704818 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18454655-344b-443b-9ca2-3e2b402f2a00-utilities\") pod \"community-operators-l8p6k\" (UID: \"18454655-344b-443b-9ca2-3e2b402f2a00\") " pod="openshift-marketplace/community-operators-l8p6k" Mar 18 13:21:35 crc kubenswrapper[4843]: I0318 13:21:35.704817 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18454655-344b-443b-9ca2-3e2b402f2a00-catalog-content\") pod \"community-operators-l8p6k\" (UID: 
\"18454655-344b-443b-9ca2-3e2b402f2a00\") " pod="openshift-marketplace/community-operators-l8p6k" Mar 18 13:21:35 crc kubenswrapper[4843]: I0318 13:21:35.731263 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x99dx\" (UniqueName: \"kubernetes.io/projected/18454655-344b-443b-9ca2-3e2b402f2a00-kube-api-access-x99dx\") pod \"community-operators-l8p6k\" (UID: \"18454655-344b-443b-9ca2-3e2b402f2a00\") " pod="openshift-marketplace/community-operators-l8p6k" Mar 18 13:21:35 crc kubenswrapper[4843]: I0318 13:21:35.784036 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l8p6k" Mar 18 13:21:36 crc kubenswrapper[4843]: I0318 13:21:36.348942 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l8p6k"] Mar 18 13:21:36 crc kubenswrapper[4843]: I0318 13:21:36.759116 4843 generic.go:334] "Generic (PLEG): container finished" podID="18454655-344b-443b-9ca2-3e2b402f2a00" containerID="8cc9f08e05e5e04e4da90d9213362bf9ab5f31712ef93908ff820c6edd0b9e9b" exitCode=0 Mar 18 13:21:36 crc kubenswrapper[4843]: I0318 13:21:36.759171 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8p6k" event={"ID":"18454655-344b-443b-9ca2-3e2b402f2a00","Type":"ContainerDied","Data":"8cc9f08e05e5e04e4da90d9213362bf9ab5f31712ef93908ff820c6edd0b9e9b"} Mar 18 13:21:36 crc kubenswrapper[4843]: I0318 13:21:36.759208 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8p6k" event={"ID":"18454655-344b-443b-9ca2-3e2b402f2a00","Type":"ContainerStarted","Data":"c60ac87b52bbaaa461b8a55f73f607edc898112968411dc118acec5f98d9b95f"} Mar 18 13:21:38 crc kubenswrapper[4843]: I0318 13:21:38.781501 4843 generic.go:334] "Generic (PLEG): container finished" podID="18454655-344b-443b-9ca2-3e2b402f2a00" 
containerID="1d6d112157bd64c1a48dfefe9f9c973ed7bba5a77088078e793da62021c688fa" exitCode=0 Mar 18 13:21:38 crc kubenswrapper[4843]: I0318 13:21:38.781603 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8p6k" event={"ID":"18454655-344b-443b-9ca2-3e2b402f2a00","Type":"ContainerDied","Data":"1d6d112157bd64c1a48dfefe9f9c973ed7bba5a77088078e793da62021c688fa"} Mar 18 13:21:39 crc kubenswrapper[4843]: I0318 13:21:39.794180 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8p6k" event={"ID":"18454655-344b-443b-9ca2-3e2b402f2a00","Type":"ContainerStarted","Data":"05da4c72bd942345908ef88bb7ff52945f7187ae3205522938d1800521fa7293"} Mar 18 13:21:39 crc kubenswrapper[4843]: I0318 13:21:39.833789 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l8p6k" podStartSLOduration=2.326850418 podStartE2EDuration="4.833742313s" podCreationTimestamp="2026-03-18 13:21:35 +0000 UTC" firstStartedPulling="2026-03-18 13:21:36.761019097 +0000 UTC m=+4330.476844621" lastFinishedPulling="2026-03-18 13:21:39.267910992 +0000 UTC m=+4332.983736516" observedRunningTime="2026-03-18 13:21:39.821127646 +0000 UTC m=+4333.536953170" watchObservedRunningTime="2026-03-18 13:21:39.833742313 +0000 UTC m=+4333.549567837" Mar 18 13:21:45 crc kubenswrapper[4843]: I0318 13:21:45.784913 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l8p6k" Mar 18 13:21:45 crc kubenswrapper[4843]: I0318 13:21:45.785494 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l8p6k" Mar 18 13:21:45 crc kubenswrapper[4843]: I0318 13:21:45.834712 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l8p6k" Mar 18 13:21:45 crc kubenswrapper[4843]: I0318 
13:21:45.905787 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l8p6k" Mar 18 13:21:46 crc kubenswrapper[4843]: I0318 13:21:46.080118 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l8p6k"] Mar 18 13:21:47 crc kubenswrapper[4843]: I0318 13:21:47.869588 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l8p6k" podUID="18454655-344b-443b-9ca2-3e2b402f2a00" containerName="registry-server" containerID="cri-o://05da4c72bd942345908ef88bb7ff52945f7187ae3205522938d1800521fa7293" gracePeriod=2 Mar 18 13:21:48 crc kubenswrapper[4843]: I0318 13:21:48.566786 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l8p6k" Mar 18 13:21:48 crc kubenswrapper[4843]: I0318 13:21:48.625888 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18454655-344b-443b-9ca2-3e2b402f2a00-catalog-content\") pod \"18454655-344b-443b-9ca2-3e2b402f2a00\" (UID: \"18454655-344b-443b-9ca2-3e2b402f2a00\") " Mar 18 13:21:48 crc kubenswrapper[4843]: I0318 13:21:48.626024 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18454655-344b-443b-9ca2-3e2b402f2a00-utilities\") pod \"18454655-344b-443b-9ca2-3e2b402f2a00\" (UID: \"18454655-344b-443b-9ca2-3e2b402f2a00\") " Mar 18 13:21:48 crc kubenswrapper[4843]: I0318 13:21:48.626080 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x99dx\" (UniqueName: \"kubernetes.io/projected/18454655-344b-443b-9ca2-3e2b402f2a00-kube-api-access-x99dx\") pod \"18454655-344b-443b-9ca2-3e2b402f2a00\" (UID: \"18454655-344b-443b-9ca2-3e2b402f2a00\") " Mar 18 13:21:48 crc kubenswrapper[4843]: 
I0318 13:21:48.634063 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18454655-344b-443b-9ca2-3e2b402f2a00-utilities" (OuterVolumeSpecName: "utilities") pod "18454655-344b-443b-9ca2-3e2b402f2a00" (UID: "18454655-344b-443b-9ca2-3e2b402f2a00"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:21:48 crc kubenswrapper[4843]: I0318 13:21:48.639129 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18454655-344b-443b-9ca2-3e2b402f2a00-kube-api-access-x99dx" (OuterVolumeSpecName: "kube-api-access-x99dx") pod "18454655-344b-443b-9ca2-3e2b402f2a00" (UID: "18454655-344b-443b-9ca2-3e2b402f2a00"). InnerVolumeSpecName "kube-api-access-x99dx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:21:48 crc kubenswrapper[4843]: I0318 13:21:48.697615 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18454655-344b-443b-9ca2-3e2b402f2a00-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18454655-344b-443b-9ca2-3e2b402f2a00" (UID: "18454655-344b-443b-9ca2-3e2b402f2a00"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:21:48 crc kubenswrapper[4843]: I0318 13:21:48.727953 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18454655-344b-443b-9ca2-3e2b402f2a00-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:21:48 crc kubenswrapper[4843]: I0318 13:21:48.727992 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18454655-344b-443b-9ca2-3e2b402f2a00-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:21:48 crc kubenswrapper[4843]: I0318 13:21:48.728011 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x99dx\" (UniqueName: \"kubernetes.io/projected/18454655-344b-443b-9ca2-3e2b402f2a00-kube-api-access-x99dx\") on node \"crc\" DevicePath \"\"" Mar 18 13:21:48 crc kubenswrapper[4843]: I0318 13:21:48.882716 4843 generic.go:334] "Generic (PLEG): container finished" podID="18454655-344b-443b-9ca2-3e2b402f2a00" containerID="05da4c72bd942345908ef88bb7ff52945f7187ae3205522938d1800521fa7293" exitCode=0 Mar 18 13:21:48 crc kubenswrapper[4843]: I0318 13:21:48.882774 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8p6k" event={"ID":"18454655-344b-443b-9ca2-3e2b402f2a00","Type":"ContainerDied","Data":"05da4c72bd942345908ef88bb7ff52945f7187ae3205522938d1800521fa7293"} Mar 18 13:21:48 crc kubenswrapper[4843]: I0318 13:21:48.882800 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l8p6k" Mar 18 13:21:48 crc kubenswrapper[4843]: I0318 13:21:48.882806 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8p6k" event={"ID":"18454655-344b-443b-9ca2-3e2b402f2a00","Type":"ContainerDied","Data":"c60ac87b52bbaaa461b8a55f73f607edc898112968411dc118acec5f98d9b95f"} Mar 18 13:21:48 crc kubenswrapper[4843]: I0318 13:21:48.882858 4843 scope.go:117] "RemoveContainer" containerID="05da4c72bd942345908ef88bb7ff52945f7187ae3205522938d1800521fa7293" Mar 18 13:21:48 crc kubenswrapper[4843]: I0318 13:21:48.924781 4843 scope.go:117] "RemoveContainer" containerID="1d6d112157bd64c1a48dfefe9f9c973ed7bba5a77088078e793da62021c688fa" Mar 18 13:21:48 crc kubenswrapper[4843]: I0318 13:21:48.950787 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l8p6k"] Mar 18 13:21:48 crc kubenswrapper[4843]: I0318 13:21:48.958034 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l8p6k"] Mar 18 13:21:48 crc kubenswrapper[4843]: I0318 13:21:48.977341 4843 scope.go:117] "RemoveContainer" containerID="8cc9f08e05e5e04e4da90d9213362bf9ab5f31712ef93908ff820c6edd0b9e9b" Mar 18 13:21:49 crc kubenswrapper[4843]: I0318 13:21:49.009534 4843 scope.go:117] "RemoveContainer" containerID="05da4c72bd942345908ef88bb7ff52945f7187ae3205522938d1800521fa7293" Mar 18 13:21:49 crc kubenswrapper[4843]: E0318 13:21:49.010941 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05da4c72bd942345908ef88bb7ff52945f7187ae3205522938d1800521fa7293\": container with ID starting with 05da4c72bd942345908ef88bb7ff52945f7187ae3205522938d1800521fa7293 not found: ID does not exist" containerID="05da4c72bd942345908ef88bb7ff52945f7187ae3205522938d1800521fa7293" Mar 18 13:21:49 crc kubenswrapper[4843]: I0318 13:21:49.011017 4843 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05da4c72bd942345908ef88bb7ff52945f7187ae3205522938d1800521fa7293"} err="failed to get container status \"05da4c72bd942345908ef88bb7ff52945f7187ae3205522938d1800521fa7293\": rpc error: code = NotFound desc = could not find container \"05da4c72bd942345908ef88bb7ff52945f7187ae3205522938d1800521fa7293\": container with ID starting with 05da4c72bd942345908ef88bb7ff52945f7187ae3205522938d1800521fa7293 not found: ID does not exist" Mar 18 13:21:49 crc kubenswrapper[4843]: I0318 13:21:49.011054 4843 scope.go:117] "RemoveContainer" containerID="1d6d112157bd64c1a48dfefe9f9c973ed7bba5a77088078e793da62021c688fa" Mar 18 13:21:49 crc kubenswrapper[4843]: E0318 13:21:49.011830 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d6d112157bd64c1a48dfefe9f9c973ed7bba5a77088078e793da62021c688fa\": container with ID starting with 1d6d112157bd64c1a48dfefe9f9c973ed7bba5a77088078e793da62021c688fa not found: ID does not exist" containerID="1d6d112157bd64c1a48dfefe9f9c973ed7bba5a77088078e793da62021c688fa" Mar 18 13:21:49 crc kubenswrapper[4843]: I0318 13:21:49.011882 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d6d112157bd64c1a48dfefe9f9c973ed7bba5a77088078e793da62021c688fa"} err="failed to get container status \"1d6d112157bd64c1a48dfefe9f9c973ed7bba5a77088078e793da62021c688fa\": rpc error: code = NotFound desc = could not find container \"1d6d112157bd64c1a48dfefe9f9c973ed7bba5a77088078e793da62021c688fa\": container with ID starting with 1d6d112157bd64c1a48dfefe9f9c973ed7bba5a77088078e793da62021c688fa not found: ID does not exist" Mar 18 13:21:49 crc kubenswrapper[4843]: I0318 13:21:49.011929 4843 scope.go:117] "RemoveContainer" containerID="8cc9f08e05e5e04e4da90d9213362bf9ab5f31712ef93908ff820c6edd0b9e9b" Mar 18 13:21:49 crc kubenswrapper[4843]: E0318 
13:21:49.012482 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cc9f08e05e5e04e4da90d9213362bf9ab5f31712ef93908ff820c6edd0b9e9b\": container with ID starting with 8cc9f08e05e5e04e4da90d9213362bf9ab5f31712ef93908ff820c6edd0b9e9b not found: ID does not exist" containerID="8cc9f08e05e5e04e4da90d9213362bf9ab5f31712ef93908ff820c6edd0b9e9b" Mar 18 13:21:49 crc kubenswrapper[4843]: I0318 13:21:49.012530 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cc9f08e05e5e04e4da90d9213362bf9ab5f31712ef93908ff820c6edd0b9e9b"} err="failed to get container status \"8cc9f08e05e5e04e4da90d9213362bf9ab5f31712ef93908ff820c6edd0b9e9b\": rpc error: code = NotFound desc = could not find container \"8cc9f08e05e5e04e4da90d9213362bf9ab5f31712ef93908ff820c6edd0b9e9b\": container with ID starting with 8cc9f08e05e5e04e4da90d9213362bf9ab5f31712ef93908ff820c6edd0b9e9b not found: ID does not exist" Mar 18 13:21:49 crc kubenswrapper[4843]: I0318 13:21:49.017938 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18454655-344b-443b-9ca2-3e2b402f2a00" path="/var/lib/kubelet/pods/18454655-344b-443b-9ca2-3e2b402f2a00/volumes" Mar 18 13:22:00 crc kubenswrapper[4843]: I0318 13:22:00.169847 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564002-7z2s5"] Mar 18 13:22:00 crc kubenswrapper[4843]: E0318 13:22:00.170906 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18454655-344b-443b-9ca2-3e2b402f2a00" containerName="extract-content" Mar 18 13:22:00 crc kubenswrapper[4843]: I0318 13:22:00.170926 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="18454655-344b-443b-9ca2-3e2b402f2a00" containerName="extract-content" Mar 18 13:22:00 crc kubenswrapper[4843]: E0318 13:22:00.170958 4843 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="18454655-344b-443b-9ca2-3e2b402f2a00" containerName="extract-utilities" Mar 18 13:22:00 crc kubenswrapper[4843]: I0318 13:22:00.170964 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="18454655-344b-443b-9ca2-3e2b402f2a00" containerName="extract-utilities" Mar 18 13:22:00 crc kubenswrapper[4843]: E0318 13:22:00.170973 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18454655-344b-443b-9ca2-3e2b402f2a00" containerName="registry-server" Mar 18 13:22:00 crc kubenswrapper[4843]: I0318 13:22:00.170979 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="18454655-344b-443b-9ca2-3e2b402f2a00" containerName="registry-server" Mar 18 13:22:00 crc kubenswrapper[4843]: I0318 13:22:00.171164 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="18454655-344b-443b-9ca2-3e2b402f2a00" containerName="registry-server" Mar 18 13:22:00 crc kubenswrapper[4843]: I0318 13:22:00.171864 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564002-7z2s5" Mar 18 13:22:00 crc kubenswrapper[4843]: I0318 13:22:00.174772 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:22:00 crc kubenswrapper[4843]: I0318 13:22:00.175078 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:22:00 crc kubenswrapper[4843]: I0318 13:22:00.175268 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 13:22:00 crc kubenswrapper[4843]: I0318 13:22:00.180118 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564002-7z2s5"] Mar 18 13:22:00 crc kubenswrapper[4843]: I0318 13:22:00.265355 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6b9v\" (UniqueName: 
\"kubernetes.io/projected/e428b728-cfb8-48af-8b76-c731cd0e7489-kube-api-access-f6b9v\") pod \"auto-csr-approver-29564002-7z2s5\" (UID: \"e428b728-cfb8-48af-8b76-c731cd0e7489\") " pod="openshift-infra/auto-csr-approver-29564002-7z2s5" Mar 18 13:22:00 crc kubenswrapper[4843]: I0318 13:22:00.368094 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6b9v\" (UniqueName: \"kubernetes.io/projected/e428b728-cfb8-48af-8b76-c731cd0e7489-kube-api-access-f6b9v\") pod \"auto-csr-approver-29564002-7z2s5\" (UID: \"e428b728-cfb8-48af-8b76-c731cd0e7489\") " pod="openshift-infra/auto-csr-approver-29564002-7z2s5" Mar 18 13:22:00 crc kubenswrapper[4843]: I0318 13:22:00.394277 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6b9v\" (UniqueName: \"kubernetes.io/projected/e428b728-cfb8-48af-8b76-c731cd0e7489-kube-api-access-f6b9v\") pod \"auto-csr-approver-29564002-7z2s5\" (UID: \"e428b728-cfb8-48af-8b76-c731cd0e7489\") " pod="openshift-infra/auto-csr-approver-29564002-7z2s5" Mar 18 13:22:00 crc kubenswrapper[4843]: I0318 13:22:00.494147 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564002-7z2s5" Mar 18 13:22:00 crc kubenswrapper[4843]: I0318 13:22:00.941591 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564002-7z2s5"] Mar 18 13:22:01 crc kubenswrapper[4843]: I0318 13:22:01.001729 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564002-7z2s5" event={"ID":"e428b728-cfb8-48af-8b76-c731cd0e7489","Type":"ContainerStarted","Data":"2443ce0c73e679488b0ada7ac9401f82e61de8eb7d35e4d35c894b7f3dffc47a"} Mar 18 13:22:03 crc kubenswrapper[4843]: I0318 13:22:03.025224 4843 generic.go:334] "Generic (PLEG): container finished" podID="e428b728-cfb8-48af-8b76-c731cd0e7489" containerID="e666d9075a21ecf67b17d876ead993e8bdb03625c410a39d491687b83e4ca5ba" exitCode=0 Mar 18 13:22:03 crc kubenswrapper[4843]: I0318 13:22:03.025301 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564002-7z2s5" event={"ID":"e428b728-cfb8-48af-8b76-c731cd0e7489","Type":"ContainerDied","Data":"e666d9075a21ecf67b17d876ead993e8bdb03625c410a39d491687b83e4ca5ba"} Mar 18 13:22:04 crc kubenswrapper[4843]: I0318 13:22:04.398080 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564002-7z2s5" Mar 18 13:22:04 crc kubenswrapper[4843]: I0318 13:22:04.508346 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6b9v\" (UniqueName: \"kubernetes.io/projected/e428b728-cfb8-48af-8b76-c731cd0e7489-kube-api-access-f6b9v\") pod \"e428b728-cfb8-48af-8b76-c731cd0e7489\" (UID: \"e428b728-cfb8-48af-8b76-c731cd0e7489\") " Mar 18 13:22:04 crc kubenswrapper[4843]: I0318 13:22:04.514631 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e428b728-cfb8-48af-8b76-c731cd0e7489-kube-api-access-f6b9v" (OuterVolumeSpecName: "kube-api-access-f6b9v") pod "e428b728-cfb8-48af-8b76-c731cd0e7489" (UID: "e428b728-cfb8-48af-8b76-c731cd0e7489"). InnerVolumeSpecName "kube-api-access-f6b9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:22:04 crc kubenswrapper[4843]: I0318 13:22:04.611264 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6b9v\" (UniqueName: \"kubernetes.io/projected/e428b728-cfb8-48af-8b76-c731cd0e7489-kube-api-access-f6b9v\") on node \"crc\" DevicePath \"\"" Mar 18 13:22:05 crc kubenswrapper[4843]: I0318 13:22:05.044188 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564002-7z2s5" event={"ID":"e428b728-cfb8-48af-8b76-c731cd0e7489","Type":"ContainerDied","Data":"2443ce0c73e679488b0ada7ac9401f82e61de8eb7d35e4d35c894b7f3dffc47a"} Mar 18 13:22:05 crc kubenswrapper[4843]: I0318 13:22:05.044232 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2443ce0c73e679488b0ada7ac9401f82e61de8eb7d35e4d35c894b7f3dffc47a" Mar 18 13:22:05 crc kubenswrapper[4843]: I0318 13:22:05.044262 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564002-7z2s5" Mar 18 13:22:05 crc kubenswrapper[4843]: I0318 13:22:05.488386 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563996-vjkpt"] Mar 18 13:22:05 crc kubenswrapper[4843]: I0318 13:22:05.501704 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563996-vjkpt"] Mar 18 13:22:06 crc kubenswrapper[4843]: I0318 13:22:06.994453 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b04fc45-e7e8-4977-bdda-dfaf6e74fafa" path="/var/lib/kubelet/pods/8b04fc45-e7e8-4977-bdda-dfaf6e74fafa/volumes" Mar 18 13:22:20 crc kubenswrapper[4843]: I0318 13:22:20.035686 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:22:20 crc kubenswrapper[4843]: I0318 13:22:20.036390 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:22:50 crc kubenswrapper[4843]: I0318 13:22:50.036082 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:22:50 crc kubenswrapper[4843]: I0318 13:22:50.036624 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" 
podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:23:07 crc kubenswrapper[4843]: I0318 13:23:07.526778 4843 scope.go:117] "RemoveContainer" containerID="2c02efb9eed4c2d6a9d31c9d0387859da266fec7c30f4a78795831a7225104dc" Mar 18 13:23:20 crc kubenswrapper[4843]: I0318 13:23:20.034680 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:23:20 crc kubenswrapper[4843]: I0318 13:23:20.035203 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:23:20 crc kubenswrapper[4843]: I0318 13:23:20.035258 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 13:23:20 crc kubenswrapper[4843]: I0318 13:23:20.036263 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0"} pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:23:20 crc kubenswrapper[4843]: I0318 13:23:20.036330 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" 
podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" containerID="cri-o://36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" gracePeriod=600 Mar 18 13:23:20 crc kubenswrapper[4843]: E0318 13:23:20.167227 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:23:20 crc kubenswrapper[4843]: I0318 13:23:20.988567 4843 generic.go:334] "Generic (PLEG): container finished" podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" exitCode=0 Mar 18 13:23:20 crc kubenswrapper[4843]: I0318 13:23:20.995365 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0"} Mar 18 13:23:20 crc kubenswrapper[4843]: I0318 13:23:20.995435 4843 scope.go:117] "RemoveContainer" containerID="7974653c49645239160ac43ca7d8c3d2bf1a27727cb40316647ec5569b7697ac" Mar 18 13:23:20 crc kubenswrapper[4843]: I0318 13:23:20.996791 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:23:21 crc kubenswrapper[4843]: E0318 13:23:21.001512 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:23:32 crc kubenswrapper[4843]: I0318 13:23:32.984186 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:23:32 crc kubenswrapper[4843]: E0318 13:23:32.985031 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:23:44 crc kubenswrapper[4843]: I0318 13:23:44.984556 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:23:44 crc kubenswrapper[4843]: E0318 13:23:44.985316 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:23:59 crc kubenswrapper[4843]: I0318 13:23:59.984613 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:23:59 crc kubenswrapper[4843]: E0318 13:23:59.985476 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:24:00 crc kubenswrapper[4843]: I0318 13:24:00.158056 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564004-xtbtf"] Mar 18 13:24:00 crc kubenswrapper[4843]: E0318 13:24:00.158527 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e428b728-cfb8-48af-8b76-c731cd0e7489" containerName="oc" Mar 18 13:24:00 crc kubenswrapper[4843]: I0318 13:24:00.158551 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="e428b728-cfb8-48af-8b76-c731cd0e7489" containerName="oc" Mar 18 13:24:00 crc kubenswrapper[4843]: I0318 13:24:00.158810 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="e428b728-cfb8-48af-8b76-c731cd0e7489" containerName="oc" Mar 18 13:24:00 crc kubenswrapper[4843]: I0318 13:24:00.159616 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564004-xtbtf" Mar 18 13:24:00 crc kubenswrapper[4843]: I0318 13:24:00.162823 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 13:24:00 crc kubenswrapper[4843]: I0318 13:24:00.163315 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:24:00 crc kubenswrapper[4843]: I0318 13:24:00.163500 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:24:00 crc kubenswrapper[4843]: I0318 13:24:00.168707 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564004-xtbtf"] Mar 18 13:24:00 crc kubenswrapper[4843]: I0318 13:24:00.354414 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mf58\" (UniqueName: \"kubernetes.io/projected/6c4dcad9-fe20-4189-8b62-c6fd7d613838-kube-api-access-7mf58\") pod \"auto-csr-approver-29564004-xtbtf\" (UID: \"6c4dcad9-fe20-4189-8b62-c6fd7d613838\") " pod="openshift-infra/auto-csr-approver-29564004-xtbtf" Mar 18 13:24:00 crc kubenswrapper[4843]: I0318 13:24:00.457378 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mf58\" (UniqueName: \"kubernetes.io/projected/6c4dcad9-fe20-4189-8b62-c6fd7d613838-kube-api-access-7mf58\") pod \"auto-csr-approver-29564004-xtbtf\" (UID: \"6c4dcad9-fe20-4189-8b62-c6fd7d613838\") " pod="openshift-infra/auto-csr-approver-29564004-xtbtf" Mar 18 13:24:00 crc kubenswrapper[4843]: I0318 13:24:00.480963 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mf58\" (UniqueName: \"kubernetes.io/projected/6c4dcad9-fe20-4189-8b62-c6fd7d613838-kube-api-access-7mf58\") pod \"auto-csr-approver-29564004-xtbtf\" (UID: \"6c4dcad9-fe20-4189-8b62-c6fd7d613838\") " 
pod="openshift-infra/auto-csr-approver-29564004-xtbtf" Mar 18 13:24:00 crc kubenswrapper[4843]: I0318 13:24:00.779069 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564004-xtbtf" Mar 18 13:24:01 crc kubenswrapper[4843]: I0318 13:24:01.275413 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564004-xtbtf"] Mar 18 13:24:01 crc kubenswrapper[4843]: I0318 13:24:01.387286 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564004-xtbtf" event={"ID":"6c4dcad9-fe20-4189-8b62-c6fd7d613838","Type":"ContainerStarted","Data":"8c30fcce27f71b299ce266675b74b9b7b3f409ead8e9b8c9922040754e2685ee"} Mar 18 13:24:03 crc kubenswrapper[4843]: I0318 13:24:03.505952 4843 generic.go:334] "Generic (PLEG): container finished" podID="6c4dcad9-fe20-4189-8b62-c6fd7d613838" containerID="949b327e89c7408f0c9e4a417ba9404dbd118fd9861d4b282846186f34d1e5e4" exitCode=0 Mar 18 13:24:03 crc kubenswrapper[4843]: I0318 13:24:03.506116 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564004-xtbtf" event={"ID":"6c4dcad9-fe20-4189-8b62-c6fd7d613838","Type":"ContainerDied","Data":"949b327e89c7408f0c9e4a417ba9404dbd118fd9861d4b282846186f34d1e5e4"} Mar 18 13:24:04 crc kubenswrapper[4843]: I0318 13:24:04.848494 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564004-xtbtf" Mar 18 13:24:04 crc kubenswrapper[4843]: I0318 13:24:04.959803 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mf58\" (UniqueName: \"kubernetes.io/projected/6c4dcad9-fe20-4189-8b62-c6fd7d613838-kube-api-access-7mf58\") pod \"6c4dcad9-fe20-4189-8b62-c6fd7d613838\" (UID: \"6c4dcad9-fe20-4189-8b62-c6fd7d613838\") " Mar 18 13:24:05 crc kubenswrapper[4843]: I0318 13:24:05.403318 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c4dcad9-fe20-4189-8b62-c6fd7d613838-kube-api-access-7mf58" (OuterVolumeSpecName: "kube-api-access-7mf58") pod "6c4dcad9-fe20-4189-8b62-c6fd7d613838" (UID: "6c4dcad9-fe20-4189-8b62-c6fd7d613838"). InnerVolumeSpecName "kube-api-access-7mf58". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:05 crc kubenswrapper[4843]: I0318 13:24:05.470702 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mf58\" (UniqueName: \"kubernetes.io/projected/6c4dcad9-fe20-4189-8b62-c6fd7d613838-kube-api-access-7mf58\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:05 crc kubenswrapper[4843]: I0318 13:24:05.527690 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564004-xtbtf" event={"ID":"6c4dcad9-fe20-4189-8b62-c6fd7d613838","Type":"ContainerDied","Data":"8c30fcce27f71b299ce266675b74b9b7b3f409ead8e9b8c9922040754e2685ee"} Mar 18 13:24:05 crc kubenswrapper[4843]: I0318 13:24:05.527752 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c30fcce27f71b299ce266675b74b9b7b3f409ead8e9b8c9922040754e2685ee" Mar 18 13:24:05 crc kubenswrapper[4843]: I0318 13:24:05.527756 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564004-xtbtf" Mar 18 13:24:05 crc kubenswrapper[4843]: I0318 13:24:05.938411 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563998-pgwpx"] Mar 18 13:24:05 crc kubenswrapper[4843]: I0318 13:24:05.947369 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563998-pgwpx"] Mar 18 13:24:06 crc kubenswrapper[4843]: I0318 13:24:06.999222 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72422f42-741a-450d-bf90-f9d8b262add3" path="/var/lib/kubelet/pods/72422f42-741a-450d-bf90-f9d8b262add3/volumes" Mar 18 13:24:07 crc kubenswrapper[4843]: I0318 13:24:07.608940 4843 scope.go:117] "RemoveContainer" containerID="9062b74d897a53ed5c312b58f6619effe9768b4a756ac99948097595613ae473" Mar 18 13:24:10 crc kubenswrapper[4843]: I0318 13:24:10.984692 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:24:10 crc kubenswrapper[4843]: E0318 13:24:10.985351 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:24:22 crc kubenswrapper[4843]: I0318 13:24:22.985114 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:24:22 crc kubenswrapper[4843]: E0318 13:24:22.986827 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:24:33 crc kubenswrapper[4843]: I0318 13:24:33.984770 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:24:33 crc kubenswrapper[4843]: E0318 13:24:33.985547 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:24:47 crc kubenswrapper[4843]: I0318 13:24:47.083009 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:24:47 crc kubenswrapper[4843]: E0318 13:24:47.083817 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:25:00 crc kubenswrapper[4843]: I0318 13:25:00.984080 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:25:00 crc kubenswrapper[4843]: E0318 13:25:00.985149 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:25:12 crc kubenswrapper[4843]: I0318 13:25:12.984643 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:25:12 crc kubenswrapper[4843]: E0318 13:25:12.985536 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:25:24 crc kubenswrapper[4843]: I0318 13:25:24.986918 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:25:24 crc kubenswrapper[4843]: E0318 13:25:24.990969 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:25:39 crc kubenswrapper[4843]: I0318 13:25:39.983778 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:25:39 crc kubenswrapper[4843]: E0318 13:25:39.984488 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:25:52 crc kubenswrapper[4843]: I0318 13:25:52.984894 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:25:52 crc kubenswrapper[4843]: E0318 13:25:52.985770 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:26:00 crc kubenswrapper[4843]: I0318 13:26:00.151388 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564006-phwm8"] Mar 18 13:26:00 crc kubenswrapper[4843]: E0318 13:26:00.152559 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4dcad9-fe20-4189-8b62-c6fd7d613838" containerName="oc" Mar 18 13:26:00 crc kubenswrapper[4843]: I0318 13:26:00.152582 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4dcad9-fe20-4189-8b62-c6fd7d613838" containerName="oc" Mar 18 13:26:00 crc kubenswrapper[4843]: I0318 13:26:00.152943 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c4dcad9-fe20-4189-8b62-c6fd7d613838" containerName="oc" Mar 18 13:26:00 crc kubenswrapper[4843]: I0318 13:26:00.154117 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564006-phwm8" Mar 18 13:26:00 crc kubenswrapper[4843]: I0318 13:26:00.156823 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 13:26:00 crc kubenswrapper[4843]: I0318 13:26:00.156952 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:26:00 crc kubenswrapper[4843]: I0318 13:26:00.156825 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:26:00 crc kubenswrapper[4843]: I0318 13:26:00.166195 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564006-phwm8"] Mar 18 13:26:00 crc kubenswrapper[4843]: I0318 13:26:00.279312 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9klxf\" (UniqueName: \"kubernetes.io/projected/9b6a6ecc-776a-4f9f-b77f-06df5c0da886-kube-api-access-9klxf\") pod \"auto-csr-approver-29564006-phwm8\" (UID: \"9b6a6ecc-776a-4f9f-b77f-06df5c0da886\") " pod="openshift-infra/auto-csr-approver-29564006-phwm8" Mar 18 13:26:00 crc kubenswrapper[4843]: I0318 13:26:00.382166 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9klxf\" (UniqueName: \"kubernetes.io/projected/9b6a6ecc-776a-4f9f-b77f-06df5c0da886-kube-api-access-9klxf\") pod \"auto-csr-approver-29564006-phwm8\" (UID: \"9b6a6ecc-776a-4f9f-b77f-06df5c0da886\") " pod="openshift-infra/auto-csr-approver-29564006-phwm8" Mar 18 13:26:00 crc kubenswrapper[4843]: I0318 13:26:00.408100 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9klxf\" (UniqueName: \"kubernetes.io/projected/9b6a6ecc-776a-4f9f-b77f-06df5c0da886-kube-api-access-9klxf\") pod \"auto-csr-approver-29564006-phwm8\" (UID: \"9b6a6ecc-776a-4f9f-b77f-06df5c0da886\") " 
pod="openshift-infra/auto-csr-approver-29564006-phwm8" Mar 18 13:26:00 crc kubenswrapper[4843]: I0318 13:26:00.487808 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564006-phwm8" Mar 18 13:26:01 crc kubenswrapper[4843]: I0318 13:26:01.349680 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564006-phwm8"] Mar 18 13:26:01 crc kubenswrapper[4843]: I0318 13:26:01.356426 4843 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:26:01 crc kubenswrapper[4843]: I0318 13:26:01.427036 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564006-phwm8" event={"ID":"9b6a6ecc-776a-4f9f-b77f-06df5c0da886","Type":"ContainerStarted","Data":"12620ab711ce54359a032841095e3c167998193083d2e125a3614712b14d6717"} Mar 18 13:26:04 crc kubenswrapper[4843]: I0318 13:26:04.523223 4843 generic.go:334] "Generic (PLEG): container finished" podID="9b6a6ecc-776a-4f9f-b77f-06df5c0da886" containerID="cecdb8b0cafadf56aba4cba269db269d53d81b1147eed3f4fc26c576c3852e4a" exitCode=0 Mar 18 13:26:04 crc kubenswrapper[4843]: I0318 13:26:04.523338 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564006-phwm8" event={"ID":"9b6a6ecc-776a-4f9f-b77f-06df5c0da886","Type":"ContainerDied","Data":"cecdb8b0cafadf56aba4cba269db269d53d81b1147eed3f4fc26c576c3852e4a"} Mar 18 13:26:05 crc kubenswrapper[4843]: I0318 13:26:05.934969 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564006-phwm8" Mar 18 13:26:06 crc kubenswrapper[4843]: I0318 13:26:06.035095 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9klxf\" (UniqueName: \"kubernetes.io/projected/9b6a6ecc-776a-4f9f-b77f-06df5c0da886-kube-api-access-9klxf\") pod \"9b6a6ecc-776a-4f9f-b77f-06df5c0da886\" (UID: \"9b6a6ecc-776a-4f9f-b77f-06df5c0da886\") " Mar 18 13:26:06 crc kubenswrapper[4843]: I0318 13:26:06.044094 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b6a6ecc-776a-4f9f-b77f-06df5c0da886-kube-api-access-9klxf" (OuterVolumeSpecName: "kube-api-access-9klxf") pod "9b6a6ecc-776a-4f9f-b77f-06df5c0da886" (UID: "9b6a6ecc-776a-4f9f-b77f-06df5c0da886"). InnerVolumeSpecName "kube-api-access-9klxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:26:06 crc kubenswrapper[4843]: I0318 13:26:06.138206 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9klxf\" (UniqueName: \"kubernetes.io/projected/9b6a6ecc-776a-4f9f-b77f-06df5c0da886-kube-api-access-9klxf\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:06 crc kubenswrapper[4843]: I0318 13:26:06.603046 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564006-phwm8" event={"ID":"9b6a6ecc-776a-4f9f-b77f-06df5c0da886","Type":"ContainerDied","Data":"12620ab711ce54359a032841095e3c167998193083d2e125a3614712b14d6717"} Mar 18 13:26:06 crc kubenswrapper[4843]: I0318 13:26:06.603452 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12620ab711ce54359a032841095e3c167998193083d2e125a3614712b14d6717" Mar 18 13:26:06 crc kubenswrapper[4843]: I0318 13:26:06.603579 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564006-phwm8" Mar 18 13:26:07 crc kubenswrapper[4843]: I0318 13:26:07.028310 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564000-dv7ht"] Mar 18 13:26:07 crc kubenswrapper[4843]: I0318 13:26:07.039971 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564000-dv7ht"] Mar 18 13:26:07 crc kubenswrapper[4843]: I0318 13:26:07.983917 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:26:07 crc kubenswrapper[4843]: E0318 13:26:07.984256 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:26:09 crc kubenswrapper[4843]: I0318 13:26:09.111252 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3b556a4-8479-4cc6-bcaa-45e09a55684a" path="/var/lib/kubelet/pods/f3b556a4-8479-4cc6-bcaa-45e09a55684a/volumes" Mar 18 13:26:19 crc kubenswrapper[4843]: I0318 13:26:19.984676 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:26:19 crc kubenswrapper[4843]: E0318 13:26:19.985493 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" 
podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:26:34 crc kubenswrapper[4843]: I0318 13:26:34.998507 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:26:35 crc kubenswrapper[4843]: E0318 13:26:35.000871 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:26:47 crc kubenswrapper[4843]: I0318 13:26:47.983169 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:26:47 crc kubenswrapper[4843]: E0318 13:26:47.983808 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:27:00 crc kubenswrapper[4843]: I0318 13:27:00.984527 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:27:00 crc kubenswrapper[4843]: E0318 13:27:00.985406 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:27:07 crc kubenswrapper[4843]: I0318 13:27:07.726217 4843 scope.go:117] "RemoveContainer" containerID="09e88ad04b5eeefabf359c3900ee5949c8e89644cce1e24a178d32a911d541cb" Mar 18 13:27:11 crc kubenswrapper[4843]: I0318 13:27:11.984466 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:27:11 crc kubenswrapper[4843]: E0318 13:27:11.985273 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:27:27 crc kubenswrapper[4843]: I0318 13:27:27.046098 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:27:27 crc kubenswrapper[4843]: E0318 13:27:27.046954 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:27:37 crc kubenswrapper[4843]: I0318 13:27:37.994948 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:27:37 crc kubenswrapper[4843]: E0318 13:27:37.996076 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:27:48 crc kubenswrapper[4843]: I0318 13:27:48.984882 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:27:48 crc kubenswrapper[4843]: E0318 13:27:48.985713 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:28:00 crc kubenswrapper[4843]: I0318 13:28:00.149797 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564008-mnt7p"] Mar 18 13:28:00 crc kubenswrapper[4843]: E0318 13:28:00.150816 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6a6ecc-776a-4f9f-b77f-06df5c0da886" containerName="oc" Mar 18 13:28:00 crc kubenswrapper[4843]: I0318 13:28:00.150836 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6a6ecc-776a-4f9f-b77f-06df5c0da886" containerName="oc" Mar 18 13:28:00 crc kubenswrapper[4843]: I0318 13:28:00.151205 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b6a6ecc-776a-4f9f-b77f-06df5c0da886" containerName="oc" Mar 18 13:28:00 crc kubenswrapper[4843]: I0318 13:28:00.152291 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564008-mnt7p" Mar 18 13:28:00 crc kubenswrapper[4843]: I0318 13:28:00.156957 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:28:00 crc kubenswrapper[4843]: I0318 13:28:00.157370 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:28:00 crc kubenswrapper[4843]: I0318 13:28:00.158330 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 13:28:00 crc kubenswrapper[4843]: I0318 13:28:00.167071 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564008-mnt7p"] Mar 18 13:28:00 crc kubenswrapper[4843]: I0318 13:28:00.290903 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwzl4\" (UniqueName: \"kubernetes.io/projected/8d261b82-2683-4060-9190-d2f67e530dec-kube-api-access-jwzl4\") pod \"auto-csr-approver-29564008-mnt7p\" (UID: \"8d261b82-2683-4060-9190-d2f67e530dec\") " pod="openshift-infra/auto-csr-approver-29564008-mnt7p" Mar 18 13:28:00 crc kubenswrapper[4843]: I0318 13:28:00.393928 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwzl4\" (UniqueName: \"kubernetes.io/projected/8d261b82-2683-4060-9190-d2f67e530dec-kube-api-access-jwzl4\") pod \"auto-csr-approver-29564008-mnt7p\" (UID: \"8d261b82-2683-4060-9190-d2f67e530dec\") " pod="openshift-infra/auto-csr-approver-29564008-mnt7p" Mar 18 13:28:00 crc kubenswrapper[4843]: I0318 13:28:00.812051 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwzl4\" (UniqueName: \"kubernetes.io/projected/8d261b82-2683-4060-9190-d2f67e530dec-kube-api-access-jwzl4\") pod \"auto-csr-approver-29564008-mnt7p\" (UID: \"8d261b82-2683-4060-9190-d2f67e530dec\") " 
pod="openshift-infra/auto-csr-approver-29564008-mnt7p" Mar 18 13:28:01 crc kubenswrapper[4843]: I0318 13:28:01.080109 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564008-mnt7p" Mar 18 13:28:01 crc kubenswrapper[4843]: I0318 13:28:01.511009 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564008-mnt7p"] Mar 18 13:28:02 crc kubenswrapper[4843]: I0318 13:28:02.062035 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564008-mnt7p" event={"ID":"8d261b82-2683-4060-9190-d2f67e530dec","Type":"ContainerStarted","Data":"9793fdd8456e54efca515f044fb823be8314ded18ed985842e282e6770e8c547"} Mar 18 13:28:03 crc kubenswrapper[4843]: I0318 13:28:03.984036 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:28:03 crc kubenswrapper[4843]: E0318 13:28:03.985629 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:28:04 crc kubenswrapper[4843]: I0318 13:28:04.088530 4843 generic.go:334] "Generic (PLEG): container finished" podID="8d261b82-2683-4060-9190-d2f67e530dec" containerID="86619cf95f3e0ad0922344a0afcd35d19f4da71eaabe58b1bdf8c7c3988fad86" exitCode=0 Mar 18 13:28:04 crc kubenswrapper[4843]: I0318 13:28:04.088579 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564008-mnt7p" event={"ID":"8d261b82-2683-4060-9190-d2f67e530dec","Type":"ContainerDied","Data":"86619cf95f3e0ad0922344a0afcd35d19f4da71eaabe58b1bdf8c7c3988fad86"} 
Mar 18 13:28:05 crc kubenswrapper[4843]: I0318 13:28:05.620433 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564008-mnt7p" Mar 18 13:28:05 crc kubenswrapper[4843]: I0318 13:28:05.765014 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwzl4\" (UniqueName: \"kubernetes.io/projected/8d261b82-2683-4060-9190-d2f67e530dec-kube-api-access-jwzl4\") pod \"8d261b82-2683-4060-9190-d2f67e530dec\" (UID: \"8d261b82-2683-4060-9190-d2f67e530dec\") " Mar 18 13:28:05 crc kubenswrapper[4843]: I0318 13:28:05.782744 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d261b82-2683-4060-9190-d2f67e530dec-kube-api-access-jwzl4" (OuterVolumeSpecName: "kube-api-access-jwzl4") pod "8d261b82-2683-4060-9190-d2f67e530dec" (UID: "8d261b82-2683-4060-9190-d2f67e530dec"). InnerVolumeSpecName "kube-api-access-jwzl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:28:05 crc kubenswrapper[4843]: I0318 13:28:05.868162 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwzl4\" (UniqueName: \"kubernetes.io/projected/8d261b82-2683-4060-9190-d2f67e530dec-kube-api-access-jwzl4\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:06 crc kubenswrapper[4843]: I0318 13:28:06.110195 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564008-mnt7p" event={"ID":"8d261b82-2683-4060-9190-d2f67e530dec","Type":"ContainerDied","Data":"9793fdd8456e54efca515f044fb823be8314ded18ed985842e282e6770e8c547"} Mar 18 13:28:06 crc kubenswrapper[4843]: I0318 13:28:06.110250 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9793fdd8456e54efca515f044fb823be8314ded18ed985842e282e6770e8c547" Mar 18 13:28:06 crc kubenswrapper[4843]: I0318 13:28:06.110295 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564008-mnt7p" Mar 18 13:28:06 crc kubenswrapper[4843]: E0318 13:28:06.162579 4843 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d261b82_2683_4060_9190_d2f67e530dec.slice\": RecentStats: unable to find data in memory cache]" Mar 18 13:28:06 crc kubenswrapper[4843]: I0318 13:28:06.740359 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564002-7z2s5"] Mar 18 13:28:06 crc kubenswrapper[4843]: I0318 13:28:06.751860 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564002-7z2s5"] Mar 18 13:28:06 crc kubenswrapper[4843]: I0318 13:28:06.995013 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e428b728-cfb8-48af-8b76-c731cd0e7489" path="/var/lib/kubelet/pods/e428b728-cfb8-48af-8b76-c731cd0e7489/volumes" Mar 18 13:28:07 crc kubenswrapper[4843]: I0318 13:28:07.907574 4843 scope.go:117] "RemoveContainer" containerID="e666d9075a21ecf67b17d876ead993e8bdb03625c410a39d491687b83e4ca5ba" Mar 18 13:28:15 crc kubenswrapper[4843]: I0318 13:28:15.983997 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:28:15 crc kubenswrapper[4843]: E0318 13:28:15.985059 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:28:26 crc kubenswrapper[4843]: I0318 13:28:26.990617 4843 scope.go:117] "RemoveContainer" 
containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:28:27 crc kubenswrapper[4843]: I0318 13:28:27.326870 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"48d9422627d1f2b320842204ec90814b68898c02c8f99d441d7ad28939b9e238"} Mar 18 13:28:55 crc kubenswrapper[4843]: I0318 13:28:55.166219 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-45tz9"] Mar 18 13:28:55 crc kubenswrapper[4843]: E0318 13:28:55.167336 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d261b82-2683-4060-9190-d2f67e530dec" containerName="oc" Mar 18 13:28:55 crc kubenswrapper[4843]: I0318 13:28:55.167357 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d261b82-2683-4060-9190-d2f67e530dec" containerName="oc" Mar 18 13:28:55 crc kubenswrapper[4843]: I0318 13:28:55.167592 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d261b82-2683-4060-9190-d2f67e530dec" containerName="oc" Mar 18 13:28:55 crc kubenswrapper[4843]: I0318 13:28:55.169477 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-45tz9" Mar 18 13:28:55 crc kubenswrapper[4843]: I0318 13:28:55.184661 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-45tz9"] Mar 18 13:28:55 crc kubenswrapper[4843]: I0318 13:28:55.309496 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct9ph\" (UniqueName: \"kubernetes.io/projected/8c1e9bb5-6165-48cc-89a9-1711c1c13853-kube-api-access-ct9ph\") pod \"redhat-operators-45tz9\" (UID: \"8c1e9bb5-6165-48cc-89a9-1711c1c13853\") " pod="openshift-marketplace/redhat-operators-45tz9" Mar 18 13:28:55 crc kubenswrapper[4843]: I0318 13:28:55.309731 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c1e9bb5-6165-48cc-89a9-1711c1c13853-catalog-content\") pod \"redhat-operators-45tz9\" (UID: \"8c1e9bb5-6165-48cc-89a9-1711c1c13853\") " pod="openshift-marketplace/redhat-operators-45tz9" Mar 18 13:28:55 crc kubenswrapper[4843]: I0318 13:28:55.309828 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c1e9bb5-6165-48cc-89a9-1711c1c13853-utilities\") pod \"redhat-operators-45tz9\" (UID: \"8c1e9bb5-6165-48cc-89a9-1711c1c13853\") " pod="openshift-marketplace/redhat-operators-45tz9" Mar 18 13:28:55 crc kubenswrapper[4843]: I0318 13:28:55.411992 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c1e9bb5-6165-48cc-89a9-1711c1c13853-utilities\") pod \"redhat-operators-45tz9\" (UID: \"8c1e9bb5-6165-48cc-89a9-1711c1c13853\") " pod="openshift-marketplace/redhat-operators-45tz9" Mar 18 13:28:55 crc kubenswrapper[4843]: I0318 13:28:55.412087 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ct9ph\" (UniqueName: \"kubernetes.io/projected/8c1e9bb5-6165-48cc-89a9-1711c1c13853-kube-api-access-ct9ph\") pod \"redhat-operators-45tz9\" (UID: \"8c1e9bb5-6165-48cc-89a9-1711c1c13853\") " pod="openshift-marketplace/redhat-operators-45tz9" Mar 18 13:28:55 crc kubenswrapper[4843]: I0318 13:28:55.412311 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c1e9bb5-6165-48cc-89a9-1711c1c13853-catalog-content\") pod \"redhat-operators-45tz9\" (UID: \"8c1e9bb5-6165-48cc-89a9-1711c1c13853\") " pod="openshift-marketplace/redhat-operators-45tz9" Mar 18 13:28:55 crc kubenswrapper[4843]: I0318 13:28:55.412590 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c1e9bb5-6165-48cc-89a9-1711c1c13853-utilities\") pod \"redhat-operators-45tz9\" (UID: \"8c1e9bb5-6165-48cc-89a9-1711c1c13853\") " pod="openshift-marketplace/redhat-operators-45tz9" Mar 18 13:28:55 crc kubenswrapper[4843]: I0318 13:28:55.412749 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c1e9bb5-6165-48cc-89a9-1711c1c13853-catalog-content\") pod \"redhat-operators-45tz9\" (UID: \"8c1e9bb5-6165-48cc-89a9-1711c1c13853\") " pod="openshift-marketplace/redhat-operators-45tz9" Mar 18 13:28:55 crc kubenswrapper[4843]: I0318 13:28:55.432368 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct9ph\" (UniqueName: \"kubernetes.io/projected/8c1e9bb5-6165-48cc-89a9-1711c1c13853-kube-api-access-ct9ph\") pod \"redhat-operators-45tz9\" (UID: \"8c1e9bb5-6165-48cc-89a9-1711c1c13853\") " pod="openshift-marketplace/redhat-operators-45tz9" Mar 18 13:28:55 crc kubenswrapper[4843]: I0318 13:28:55.499056 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-45tz9" Mar 18 13:28:56 crc kubenswrapper[4843]: I0318 13:28:56.000912 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-45tz9"] Mar 18 13:28:56 crc kubenswrapper[4843]: I0318 13:28:56.665089 4843 generic.go:334] "Generic (PLEG): container finished" podID="8c1e9bb5-6165-48cc-89a9-1711c1c13853" containerID="2607ae20ad7d2556062356b273d5e07e6f8ac44db70d7bd2d747c49783c37b5d" exitCode=0 Mar 18 13:28:56 crc kubenswrapper[4843]: I0318 13:28:56.665180 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45tz9" event={"ID":"8c1e9bb5-6165-48cc-89a9-1711c1c13853","Type":"ContainerDied","Data":"2607ae20ad7d2556062356b273d5e07e6f8ac44db70d7bd2d747c49783c37b5d"} Mar 18 13:28:56 crc kubenswrapper[4843]: I0318 13:28:56.667444 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45tz9" event={"ID":"8c1e9bb5-6165-48cc-89a9-1711c1c13853","Type":"ContainerStarted","Data":"51d527e3ff81e2ab094384314e2e19ea67d3264241962c02fe9244a1da1b0026"} Mar 18 13:28:58 crc kubenswrapper[4843]: I0318 13:28:58.688064 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45tz9" event={"ID":"8c1e9bb5-6165-48cc-89a9-1711c1c13853","Type":"ContainerStarted","Data":"239b60627d54434403b8aa123386b5528cabe83b941c21633e370d73135bbef4"} Mar 18 13:28:59 crc kubenswrapper[4843]: I0318 13:28:59.701396 4843 generic.go:334] "Generic (PLEG): container finished" podID="8c1e9bb5-6165-48cc-89a9-1711c1c13853" containerID="239b60627d54434403b8aa123386b5528cabe83b941c21633e370d73135bbef4" exitCode=0 Mar 18 13:28:59 crc kubenswrapper[4843]: I0318 13:28:59.701517 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45tz9" 
event={"ID":"8c1e9bb5-6165-48cc-89a9-1711c1c13853","Type":"ContainerDied","Data":"239b60627d54434403b8aa123386b5528cabe83b941c21633e370d73135bbef4"} Mar 18 13:29:00 crc kubenswrapper[4843]: I0318 13:29:00.716302 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45tz9" event={"ID":"8c1e9bb5-6165-48cc-89a9-1711c1c13853","Type":"ContainerStarted","Data":"2ae8b9eea81bbdc7ba560722e88e66d7388b0b9396468dc00b2f461b58b3ae0a"} Mar 18 13:29:00 crc kubenswrapper[4843]: I0318 13:29:00.743226 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-45tz9" podStartSLOduration=2.254234711 podStartE2EDuration="5.743180496s" podCreationTimestamp="2026-03-18 13:28:55 +0000 UTC" firstStartedPulling="2026-03-18 13:28:56.667172105 +0000 UTC m=+4770.382997629" lastFinishedPulling="2026-03-18 13:29:00.15611789 +0000 UTC m=+4773.871943414" observedRunningTime="2026-03-18 13:29:00.733358638 +0000 UTC m=+4774.449184162" watchObservedRunningTime="2026-03-18 13:29:00.743180496 +0000 UTC m=+4774.459006020" Mar 18 13:29:05 crc kubenswrapper[4843]: I0318 13:29:05.499148 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-45tz9" Mar 18 13:29:05 crc kubenswrapper[4843]: I0318 13:29:05.499686 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-45tz9" Mar 18 13:29:07 crc kubenswrapper[4843]: I0318 13:29:07.142271 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-45tz9" podUID="8c1e9bb5-6165-48cc-89a9-1711c1c13853" containerName="registry-server" probeResult="failure" output=< Mar 18 13:29:07 crc kubenswrapper[4843]: timeout: failed to connect service ":50051" within 1s Mar 18 13:29:07 crc kubenswrapper[4843]: > Mar 18 13:29:15 crc kubenswrapper[4843]: I0318 13:29:15.548765 4843 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-45tz9" Mar 18 13:29:15 crc kubenswrapper[4843]: I0318 13:29:15.598358 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-45tz9" Mar 18 13:29:16 crc kubenswrapper[4843]: I0318 13:29:16.442306 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-45tz9"] Mar 18 13:29:16 crc kubenswrapper[4843]: I0318 13:29:16.881197 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-45tz9" podUID="8c1e9bb5-6165-48cc-89a9-1711c1c13853" containerName="registry-server" containerID="cri-o://2ae8b9eea81bbdc7ba560722e88e66d7388b0b9396468dc00b2f461b58b3ae0a" gracePeriod=2 Mar 18 13:29:17 crc kubenswrapper[4843]: I0318 13:29:17.323784 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-45tz9" Mar 18 13:29:17 crc kubenswrapper[4843]: I0318 13:29:17.470494 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c1e9bb5-6165-48cc-89a9-1711c1c13853-utilities\") pod \"8c1e9bb5-6165-48cc-89a9-1711c1c13853\" (UID: \"8c1e9bb5-6165-48cc-89a9-1711c1c13853\") " Mar 18 13:29:17 crc kubenswrapper[4843]: I0318 13:29:17.470730 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c1e9bb5-6165-48cc-89a9-1711c1c13853-catalog-content\") pod \"8c1e9bb5-6165-48cc-89a9-1711c1c13853\" (UID: \"8c1e9bb5-6165-48cc-89a9-1711c1c13853\") " Mar 18 13:29:17 crc kubenswrapper[4843]: I0318 13:29:17.470857 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct9ph\" (UniqueName: \"kubernetes.io/projected/8c1e9bb5-6165-48cc-89a9-1711c1c13853-kube-api-access-ct9ph\") pod 
\"8c1e9bb5-6165-48cc-89a9-1711c1c13853\" (UID: \"8c1e9bb5-6165-48cc-89a9-1711c1c13853\") " Mar 18 13:29:17 crc kubenswrapper[4843]: I0318 13:29:17.472068 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c1e9bb5-6165-48cc-89a9-1711c1c13853-utilities" (OuterVolumeSpecName: "utilities") pod "8c1e9bb5-6165-48cc-89a9-1711c1c13853" (UID: "8c1e9bb5-6165-48cc-89a9-1711c1c13853"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:29:17 crc kubenswrapper[4843]: I0318 13:29:17.475634 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c1e9bb5-6165-48cc-89a9-1711c1c13853-kube-api-access-ct9ph" (OuterVolumeSpecName: "kube-api-access-ct9ph") pod "8c1e9bb5-6165-48cc-89a9-1711c1c13853" (UID: "8c1e9bb5-6165-48cc-89a9-1711c1c13853"). InnerVolumeSpecName "kube-api-access-ct9ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:29:17 crc kubenswrapper[4843]: I0318 13:29:17.573046 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct9ph\" (UniqueName: \"kubernetes.io/projected/8c1e9bb5-6165-48cc-89a9-1711c1c13853-kube-api-access-ct9ph\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:17 crc kubenswrapper[4843]: I0318 13:29:17.573091 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c1e9bb5-6165-48cc-89a9-1711c1c13853-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:17 crc kubenswrapper[4843]: I0318 13:29:17.710449 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c1e9bb5-6165-48cc-89a9-1711c1c13853-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c1e9bb5-6165-48cc-89a9-1711c1c13853" (UID: "8c1e9bb5-6165-48cc-89a9-1711c1c13853"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:29:17 crc kubenswrapper[4843]: I0318 13:29:17.803332 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c1e9bb5-6165-48cc-89a9-1711c1c13853-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:17 crc kubenswrapper[4843]: I0318 13:29:17.920100 4843 generic.go:334] "Generic (PLEG): container finished" podID="8c1e9bb5-6165-48cc-89a9-1711c1c13853" containerID="2ae8b9eea81bbdc7ba560722e88e66d7388b0b9396468dc00b2f461b58b3ae0a" exitCode=0 Mar 18 13:29:17 crc kubenswrapper[4843]: I0318 13:29:17.920143 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45tz9" event={"ID":"8c1e9bb5-6165-48cc-89a9-1711c1c13853","Type":"ContainerDied","Data":"2ae8b9eea81bbdc7ba560722e88e66d7388b0b9396468dc00b2f461b58b3ae0a"} Mar 18 13:29:17 crc kubenswrapper[4843]: I0318 13:29:17.920176 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45tz9" event={"ID":"8c1e9bb5-6165-48cc-89a9-1711c1c13853","Type":"ContainerDied","Data":"51d527e3ff81e2ab094384314e2e19ea67d3264241962c02fe9244a1da1b0026"} Mar 18 13:29:17 crc kubenswrapper[4843]: I0318 13:29:17.920196 4843 scope.go:117] "RemoveContainer" containerID="2ae8b9eea81bbdc7ba560722e88e66d7388b0b9396468dc00b2f461b58b3ae0a" Mar 18 13:29:17 crc kubenswrapper[4843]: I0318 13:29:17.920905 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-45tz9" Mar 18 13:29:17 crc kubenswrapper[4843]: I0318 13:29:17.944201 4843 scope.go:117] "RemoveContainer" containerID="239b60627d54434403b8aa123386b5528cabe83b941c21633e370d73135bbef4" Mar 18 13:29:17 crc kubenswrapper[4843]: I0318 13:29:17.968263 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-45tz9"] Mar 18 13:29:17 crc kubenswrapper[4843]: I0318 13:29:17.977205 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-45tz9"] Mar 18 13:29:17 crc kubenswrapper[4843]: I0318 13:29:17.987992 4843 scope.go:117] "RemoveContainer" containerID="2607ae20ad7d2556062356b273d5e07e6f8ac44db70d7bd2d747c49783c37b5d" Mar 18 13:29:18 crc kubenswrapper[4843]: I0318 13:29:18.013328 4843 scope.go:117] "RemoveContainer" containerID="2ae8b9eea81bbdc7ba560722e88e66d7388b0b9396468dc00b2f461b58b3ae0a" Mar 18 13:29:18 crc kubenswrapper[4843]: E0318 13:29:18.013858 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ae8b9eea81bbdc7ba560722e88e66d7388b0b9396468dc00b2f461b58b3ae0a\": container with ID starting with 2ae8b9eea81bbdc7ba560722e88e66d7388b0b9396468dc00b2f461b58b3ae0a not found: ID does not exist" containerID="2ae8b9eea81bbdc7ba560722e88e66d7388b0b9396468dc00b2f461b58b3ae0a" Mar 18 13:29:18 crc kubenswrapper[4843]: I0318 13:29:18.013931 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae8b9eea81bbdc7ba560722e88e66d7388b0b9396468dc00b2f461b58b3ae0a"} err="failed to get container status \"2ae8b9eea81bbdc7ba560722e88e66d7388b0b9396468dc00b2f461b58b3ae0a\": rpc error: code = NotFound desc = could not find container \"2ae8b9eea81bbdc7ba560722e88e66d7388b0b9396468dc00b2f461b58b3ae0a\": container with ID starting with 2ae8b9eea81bbdc7ba560722e88e66d7388b0b9396468dc00b2f461b58b3ae0a not found: ID does 
not exist" Mar 18 13:29:18 crc kubenswrapper[4843]: I0318 13:29:18.013963 4843 scope.go:117] "RemoveContainer" containerID="239b60627d54434403b8aa123386b5528cabe83b941c21633e370d73135bbef4" Mar 18 13:29:18 crc kubenswrapper[4843]: E0318 13:29:18.014259 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"239b60627d54434403b8aa123386b5528cabe83b941c21633e370d73135bbef4\": container with ID starting with 239b60627d54434403b8aa123386b5528cabe83b941c21633e370d73135bbef4 not found: ID does not exist" containerID="239b60627d54434403b8aa123386b5528cabe83b941c21633e370d73135bbef4" Mar 18 13:29:18 crc kubenswrapper[4843]: I0318 13:29:18.014290 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"239b60627d54434403b8aa123386b5528cabe83b941c21633e370d73135bbef4"} err="failed to get container status \"239b60627d54434403b8aa123386b5528cabe83b941c21633e370d73135bbef4\": rpc error: code = NotFound desc = could not find container \"239b60627d54434403b8aa123386b5528cabe83b941c21633e370d73135bbef4\": container with ID starting with 239b60627d54434403b8aa123386b5528cabe83b941c21633e370d73135bbef4 not found: ID does not exist" Mar 18 13:29:18 crc kubenswrapper[4843]: I0318 13:29:18.014308 4843 scope.go:117] "RemoveContainer" containerID="2607ae20ad7d2556062356b273d5e07e6f8ac44db70d7bd2d747c49783c37b5d" Mar 18 13:29:18 crc kubenswrapper[4843]: E0318 13:29:18.014523 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2607ae20ad7d2556062356b273d5e07e6f8ac44db70d7bd2d747c49783c37b5d\": container with ID starting with 2607ae20ad7d2556062356b273d5e07e6f8ac44db70d7bd2d747c49783c37b5d not found: ID does not exist" containerID="2607ae20ad7d2556062356b273d5e07e6f8ac44db70d7bd2d747c49783c37b5d" Mar 18 13:29:18 crc kubenswrapper[4843]: I0318 13:29:18.014547 4843 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2607ae20ad7d2556062356b273d5e07e6f8ac44db70d7bd2d747c49783c37b5d"} err="failed to get container status \"2607ae20ad7d2556062356b273d5e07e6f8ac44db70d7bd2d747c49783c37b5d\": rpc error: code = NotFound desc = could not find container \"2607ae20ad7d2556062356b273d5e07e6f8ac44db70d7bd2d747c49783c37b5d\": container with ID starting with 2607ae20ad7d2556062356b273d5e07e6f8ac44db70d7bd2d747c49783c37b5d not found: ID does not exist" Mar 18 13:29:18 crc kubenswrapper[4843]: I0318 13:29:18.995845 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c1e9bb5-6165-48cc-89a9-1711c1c13853" path="/var/lib/kubelet/pods/8c1e9bb5-6165-48cc-89a9-1711c1c13853/volumes" Mar 18 13:29:40 crc kubenswrapper[4843]: I0318 13:29:40.534954 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zh6dm"] Mar 18 13:29:40 crc kubenswrapper[4843]: E0318 13:29:40.536018 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c1e9bb5-6165-48cc-89a9-1711c1c13853" containerName="extract-content" Mar 18 13:29:40 crc kubenswrapper[4843]: I0318 13:29:40.536035 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c1e9bb5-6165-48cc-89a9-1711c1c13853" containerName="extract-content" Mar 18 13:29:40 crc kubenswrapper[4843]: E0318 13:29:40.536069 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c1e9bb5-6165-48cc-89a9-1711c1c13853" containerName="registry-server" Mar 18 13:29:40 crc kubenswrapper[4843]: I0318 13:29:40.536078 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c1e9bb5-6165-48cc-89a9-1711c1c13853" containerName="registry-server" Mar 18 13:29:40 crc kubenswrapper[4843]: E0318 13:29:40.536091 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c1e9bb5-6165-48cc-89a9-1711c1c13853" containerName="extract-utilities" Mar 18 13:29:40 crc kubenswrapper[4843]: I0318 13:29:40.536102 4843 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8c1e9bb5-6165-48cc-89a9-1711c1c13853" containerName="extract-utilities" Mar 18 13:29:40 crc kubenswrapper[4843]: I0318 13:29:40.536394 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c1e9bb5-6165-48cc-89a9-1711c1c13853" containerName="registry-server" Mar 18 13:29:40 crc kubenswrapper[4843]: I0318 13:29:40.538326 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zh6dm" Mar 18 13:29:40 crc kubenswrapper[4843]: I0318 13:29:40.565208 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zh6dm"] Mar 18 13:29:40 crc kubenswrapper[4843]: I0318 13:29:40.665262 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44e82195-9b77-45ac-a6d8-c45c7a49d1a5-catalog-content\") pod \"certified-operators-zh6dm\" (UID: \"44e82195-9b77-45ac-a6d8-c45c7a49d1a5\") " pod="openshift-marketplace/certified-operators-zh6dm" Mar 18 13:29:40 crc kubenswrapper[4843]: I0318 13:29:40.665396 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt7hq\" (UniqueName: \"kubernetes.io/projected/44e82195-9b77-45ac-a6d8-c45c7a49d1a5-kube-api-access-kt7hq\") pod \"certified-operators-zh6dm\" (UID: \"44e82195-9b77-45ac-a6d8-c45c7a49d1a5\") " pod="openshift-marketplace/certified-operators-zh6dm" Mar 18 13:29:40 crc kubenswrapper[4843]: I0318 13:29:40.666392 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44e82195-9b77-45ac-a6d8-c45c7a49d1a5-utilities\") pod \"certified-operators-zh6dm\" (UID: \"44e82195-9b77-45ac-a6d8-c45c7a49d1a5\") " pod="openshift-marketplace/certified-operators-zh6dm" Mar 18 13:29:40 crc kubenswrapper[4843]: I0318 
13:29:40.768401 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44e82195-9b77-45ac-a6d8-c45c7a49d1a5-utilities\") pod \"certified-operators-zh6dm\" (UID: \"44e82195-9b77-45ac-a6d8-c45c7a49d1a5\") " pod="openshift-marketplace/certified-operators-zh6dm" Mar 18 13:29:40 crc kubenswrapper[4843]: I0318 13:29:40.768527 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44e82195-9b77-45ac-a6d8-c45c7a49d1a5-catalog-content\") pod \"certified-operators-zh6dm\" (UID: \"44e82195-9b77-45ac-a6d8-c45c7a49d1a5\") " pod="openshift-marketplace/certified-operators-zh6dm" Mar 18 13:29:40 crc kubenswrapper[4843]: I0318 13:29:40.768600 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt7hq\" (UniqueName: \"kubernetes.io/projected/44e82195-9b77-45ac-a6d8-c45c7a49d1a5-kube-api-access-kt7hq\") pod \"certified-operators-zh6dm\" (UID: \"44e82195-9b77-45ac-a6d8-c45c7a49d1a5\") " pod="openshift-marketplace/certified-operators-zh6dm" Mar 18 13:29:40 crc kubenswrapper[4843]: I0318 13:29:40.768969 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44e82195-9b77-45ac-a6d8-c45c7a49d1a5-utilities\") pod \"certified-operators-zh6dm\" (UID: \"44e82195-9b77-45ac-a6d8-c45c7a49d1a5\") " pod="openshift-marketplace/certified-operators-zh6dm" Mar 18 13:29:40 crc kubenswrapper[4843]: I0318 13:29:40.769009 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44e82195-9b77-45ac-a6d8-c45c7a49d1a5-catalog-content\") pod \"certified-operators-zh6dm\" (UID: \"44e82195-9b77-45ac-a6d8-c45c7a49d1a5\") " pod="openshift-marketplace/certified-operators-zh6dm" Mar 18 13:29:40 crc kubenswrapper[4843]: I0318 13:29:40.788921 4843 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt7hq\" (UniqueName: \"kubernetes.io/projected/44e82195-9b77-45ac-a6d8-c45c7a49d1a5-kube-api-access-kt7hq\") pod \"certified-operators-zh6dm\" (UID: \"44e82195-9b77-45ac-a6d8-c45c7a49d1a5\") " pod="openshift-marketplace/certified-operators-zh6dm" Mar 18 13:29:40 crc kubenswrapper[4843]: I0318 13:29:40.860839 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zh6dm" Mar 18 13:29:41 crc kubenswrapper[4843]: I0318 13:29:41.325565 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zh6dm"] Mar 18 13:29:41 crc kubenswrapper[4843]: W0318 13:29:41.609938 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44e82195_9b77_45ac_a6d8_c45c7a49d1a5.slice/crio-5a3c63a4b98544418c212ba9b6a3b28ae0fc361a03e8961f8dc4f8a9a45e7e08 WatchSource:0}: Error finding container 5a3c63a4b98544418c212ba9b6a3b28ae0fc361a03e8961f8dc4f8a9a45e7e08: Status 404 returned error can't find the container with id 5a3c63a4b98544418c212ba9b6a3b28ae0fc361a03e8961f8dc4f8a9a45e7e08 Mar 18 13:29:42 crc kubenswrapper[4843]: I0318 13:29:42.128387 4843 generic.go:334] "Generic (PLEG): container finished" podID="44e82195-9b77-45ac-a6d8-c45c7a49d1a5" containerID="0198ca33beb96ee356345f6c4b8468eb5fe9db5c9f70c5cb8d7d9609e8447241" exitCode=0 Mar 18 13:29:42 crc kubenswrapper[4843]: I0318 13:29:42.128437 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zh6dm" event={"ID":"44e82195-9b77-45ac-a6d8-c45c7a49d1a5","Type":"ContainerDied","Data":"0198ca33beb96ee356345f6c4b8468eb5fe9db5c9f70c5cb8d7d9609e8447241"} Mar 18 13:29:42 crc kubenswrapper[4843]: I0318 13:29:42.128469 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zh6dm" 
event={"ID":"44e82195-9b77-45ac-a6d8-c45c7a49d1a5","Type":"ContainerStarted","Data":"5a3c63a4b98544418c212ba9b6a3b28ae0fc361a03e8961f8dc4f8a9a45e7e08"} Mar 18 13:29:44 crc kubenswrapper[4843]: I0318 13:29:44.149155 4843 generic.go:334] "Generic (PLEG): container finished" podID="44e82195-9b77-45ac-a6d8-c45c7a49d1a5" containerID="7fbe9cc3d281b2e2248d206b624af039c78689ae8e310c72123fb2b37139e45f" exitCode=0 Mar 18 13:29:44 crc kubenswrapper[4843]: I0318 13:29:44.149270 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zh6dm" event={"ID":"44e82195-9b77-45ac-a6d8-c45c7a49d1a5","Type":"ContainerDied","Data":"7fbe9cc3d281b2e2248d206b624af039c78689ae8e310c72123fb2b37139e45f"} Mar 18 13:29:45 crc kubenswrapper[4843]: I0318 13:29:45.162854 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zh6dm" event={"ID":"44e82195-9b77-45ac-a6d8-c45c7a49d1a5","Type":"ContainerStarted","Data":"637016cd11896f2a039c75798ab5b3ec29bce5c8f6612897f94756dc75e57592"} Mar 18 13:29:45 crc kubenswrapper[4843]: I0318 13:29:45.194536 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zh6dm" podStartSLOduration=2.592134007 podStartE2EDuration="5.194496008s" podCreationTimestamp="2026-03-18 13:29:40 +0000 UTC" firstStartedPulling="2026-03-18 13:29:42.131281898 +0000 UTC m=+4815.847107442" lastFinishedPulling="2026-03-18 13:29:44.733643919 +0000 UTC m=+4818.449469443" observedRunningTime="2026-03-18 13:29:45.18505714 +0000 UTC m=+4818.900882674" watchObservedRunningTime="2026-03-18 13:29:45.194496008 +0000 UTC m=+4818.910321542" Mar 18 13:29:50 crc kubenswrapper[4843]: I0318 13:29:50.862899 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zh6dm" Mar 18 13:29:50 crc kubenswrapper[4843]: I0318 13:29:50.863522 4843 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-zh6dm" Mar 18 13:29:51 crc kubenswrapper[4843]: I0318 13:29:51.244480 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zh6dm" Mar 18 13:29:51 crc kubenswrapper[4843]: I0318 13:29:51.292134 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zh6dm" Mar 18 13:29:51 crc kubenswrapper[4843]: I0318 13:29:51.485576 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zh6dm"] Mar 18 13:29:53 crc kubenswrapper[4843]: I0318 13:29:53.243010 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zh6dm" podUID="44e82195-9b77-45ac-a6d8-c45c7a49d1a5" containerName="registry-server" containerID="cri-o://637016cd11896f2a039c75798ab5b3ec29bce5c8f6612897f94756dc75e57592" gracePeriod=2 Mar 18 13:29:53 crc kubenswrapper[4843]: I0318 13:29:53.808588 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zh6dm" Mar 18 13:29:54 crc kubenswrapper[4843]: I0318 13:29:54.000223 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44e82195-9b77-45ac-a6d8-c45c7a49d1a5-utilities\") pod \"44e82195-9b77-45ac-a6d8-c45c7a49d1a5\" (UID: \"44e82195-9b77-45ac-a6d8-c45c7a49d1a5\") " Mar 18 13:29:54 crc kubenswrapper[4843]: I0318 13:29:54.000287 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44e82195-9b77-45ac-a6d8-c45c7a49d1a5-catalog-content\") pod \"44e82195-9b77-45ac-a6d8-c45c7a49d1a5\" (UID: \"44e82195-9b77-45ac-a6d8-c45c7a49d1a5\") " Mar 18 13:29:54 crc kubenswrapper[4843]: I0318 13:29:54.000827 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt7hq\" (UniqueName: \"kubernetes.io/projected/44e82195-9b77-45ac-a6d8-c45c7a49d1a5-kube-api-access-kt7hq\") pod \"44e82195-9b77-45ac-a6d8-c45c7a49d1a5\" (UID: \"44e82195-9b77-45ac-a6d8-c45c7a49d1a5\") " Mar 18 13:29:54 crc kubenswrapper[4843]: I0318 13:29:54.001366 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44e82195-9b77-45ac-a6d8-c45c7a49d1a5-utilities" (OuterVolumeSpecName: "utilities") pod "44e82195-9b77-45ac-a6d8-c45c7a49d1a5" (UID: "44e82195-9b77-45ac-a6d8-c45c7a49d1a5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:29:54 crc kubenswrapper[4843]: I0318 13:29:54.001803 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44e82195-9b77-45ac-a6d8-c45c7a49d1a5-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:54 crc kubenswrapper[4843]: I0318 13:29:54.011429 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44e82195-9b77-45ac-a6d8-c45c7a49d1a5-kube-api-access-kt7hq" (OuterVolumeSpecName: "kube-api-access-kt7hq") pod "44e82195-9b77-45ac-a6d8-c45c7a49d1a5" (UID: "44e82195-9b77-45ac-a6d8-c45c7a49d1a5"). InnerVolumeSpecName "kube-api-access-kt7hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:29:54 crc kubenswrapper[4843]: I0318 13:29:54.103445 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt7hq\" (UniqueName: \"kubernetes.io/projected/44e82195-9b77-45ac-a6d8-c45c7a49d1a5-kube-api-access-kt7hq\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:54 crc kubenswrapper[4843]: I0318 13:29:54.255780 4843 generic.go:334] "Generic (PLEG): container finished" podID="44e82195-9b77-45ac-a6d8-c45c7a49d1a5" containerID="637016cd11896f2a039c75798ab5b3ec29bce5c8f6612897f94756dc75e57592" exitCode=0 Mar 18 13:29:54 crc kubenswrapper[4843]: I0318 13:29:54.255831 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zh6dm" event={"ID":"44e82195-9b77-45ac-a6d8-c45c7a49d1a5","Type":"ContainerDied","Data":"637016cd11896f2a039c75798ab5b3ec29bce5c8f6612897f94756dc75e57592"} Mar 18 13:29:54 crc kubenswrapper[4843]: I0318 13:29:54.255861 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zh6dm" event={"ID":"44e82195-9b77-45ac-a6d8-c45c7a49d1a5","Type":"ContainerDied","Data":"5a3c63a4b98544418c212ba9b6a3b28ae0fc361a03e8961f8dc4f8a9a45e7e08"} Mar 18 13:29:54 crc kubenswrapper[4843]: 
I0318 13:29:54.255881 4843 scope.go:117] "RemoveContainer" containerID="637016cd11896f2a039c75798ab5b3ec29bce5c8f6612897f94756dc75e57592" Mar 18 13:29:54 crc kubenswrapper[4843]: I0318 13:29:54.255891 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zh6dm" Mar 18 13:29:54 crc kubenswrapper[4843]: I0318 13:29:54.274937 4843 scope.go:117] "RemoveContainer" containerID="7fbe9cc3d281b2e2248d206b624af039c78689ae8e310c72123fb2b37139e45f" Mar 18 13:29:54 crc kubenswrapper[4843]: I0318 13:29:54.304310 4843 scope.go:117] "RemoveContainer" containerID="0198ca33beb96ee356345f6c4b8468eb5fe9db5c9f70c5cb8d7d9609e8447241" Mar 18 13:29:54 crc kubenswrapper[4843]: I0318 13:29:54.341263 4843 scope.go:117] "RemoveContainer" containerID="637016cd11896f2a039c75798ab5b3ec29bce5c8f6612897f94756dc75e57592" Mar 18 13:29:54 crc kubenswrapper[4843]: E0318 13:29:54.342351 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"637016cd11896f2a039c75798ab5b3ec29bce5c8f6612897f94756dc75e57592\": container with ID starting with 637016cd11896f2a039c75798ab5b3ec29bce5c8f6612897f94756dc75e57592 not found: ID does not exist" containerID="637016cd11896f2a039c75798ab5b3ec29bce5c8f6612897f94756dc75e57592" Mar 18 13:29:54 crc kubenswrapper[4843]: I0318 13:29:54.342411 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"637016cd11896f2a039c75798ab5b3ec29bce5c8f6612897f94756dc75e57592"} err="failed to get container status \"637016cd11896f2a039c75798ab5b3ec29bce5c8f6612897f94756dc75e57592\": rpc error: code = NotFound desc = could not find container \"637016cd11896f2a039c75798ab5b3ec29bce5c8f6612897f94756dc75e57592\": container with ID starting with 637016cd11896f2a039c75798ab5b3ec29bce5c8f6612897f94756dc75e57592 not found: ID does not exist" Mar 18 13:29:54 crc kubenswrapper[4843]: I0318 13:29:54.342447 4843 
scope.go:117] "RemoveContainer" containerID="7fbe9cc3d281b2e2248d206b624af039c78689ae8e310c72123fb2b37139e45f" Mar 18 13:29:54 crc kubenswrapper[4843]: E0318 13:29:54.342923 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fbe9cc3d281b2e2248d206b624af039c78689ae8e310c72123fb2b37139e45f\": container with ID starting with 7fbe9cc3d281b2e2248d206b624af039c78689ae8e310c72123fb2b37139e45f not found: ID does not exist" containerID="7fbe9cc3d281b2e2248d206b624af039c78689ae8e310c72123fb2b37139e45f" Mar 18 13:29:54 crc kubenswrapper[4843]: I0318 13:29:54.342961 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fbe9cc3d281b2e2248d206b624af039c78689ae8e310c72123fb2b37139e45f"} err="failed to get container status \"7fbe9cc3d281b2e2248d206b624af039c78689ae8e310c72123fb2b37139e45f\": rpc error: code = NotFound desc = could not find container \"7fbe9cc3d281b2e2248d206b624af039c78689ae8e310c72123fb2b37139e45f\": container with ID starting with 7fbe9cc3d281b2e2248d206b624af039c78689ae8e310c72123fb2b37139e45f not found: ID does not exist" Mar 18 13:29:54 crc kubenswrapper[4843]: I0318 13:29:54.342986 4843 scope.go:117] "RemoveContainer" containerID="0198ca33beb96ee356345f6c4b8468eb5fe9db5c9f70c5cb8d7d9609e8447241" Mar 18 13:29:54 crc kubenswrapper[4843]: E0318 13:29:54.343391 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0198ca33beb96ee356345f6c4b8468eb5fe9db5c9f70c5cb8d7d9609e8447241\": container with ID starting with 0198ca33beb96ee356345f6c4b8468eb5fe9db5c9f70c5cb8d7d9609e8447241 not found: ID does not exist" containerID="0198ca33beb96ee356345f6c4b8468eb5fe9db5c9f70c5cb8d7d9609e8447241" Mar 18 13:29:54 crc kubenswrapper[4843]: I0318 13:29:54.343410 4843 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0198ca33beb96ee356345f6c4b8468eb5fe9db5c9f70c5cb8d7d9609e8447241"} err="failed to get container status \"0198ca33beb96ee356345f6c4b8468eb5fe9db5c9f70c5cb8d7d9609e8447241\": rpc error: code = NotFound desc = could not find container \"0198ca33beb96ee356345f6c4b8468eb5fe9db5c9f70c5cb8d7d9609e8447241\": container with ID starting with 0198ca33beb96ee356345f6c4b8468eb5fe9db5c9f70c5cb8d7d9609e8447241 not found: ID does not exist" Mar 18 13:29:55 crc kubenswrapper[4843]: I0318 13:29:55.089709 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44e82195-9b77-45ac-a6d8-c45c7a49d1a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44e82195-9b77-45ac-a6d8-c45c7a49d1a5" (UID: "44e82195-9b77-45ac-a6d8-c45c7a49d1a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:29:55 crc kubenswrapper[4843]: I0318 13:29:55.124930 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44e82195-9b77-45ac-a6d8-c45c7a49d1a5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:29:55 crc kubenswrapper[4843]: I0318 13:29:55.196712 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zh6dm"] Mar 18 13:29:55 crc kubenswrapper[4843]: I0318 13:29:55.210637 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zh6dm"] Mar 18 13:29:57 crc kubenswrapper[4843]: I0318 13:29:57.060563 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44e82195-9b77-45ac-a6d8-c45c7a49d1a5" path="/var/lib/kubelet/pods/44e82195-9b77-45ac-a6d8-c45c7a49d1a5/volumes" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.372282 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564010-726q7"] Mar 18 13:30:00 crc kubenswrapper[4843]: E0318 
13:30:00.374202 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44e82195-9b77-45ac-a6d8-c45c7a49d1a5" containerName="extract-utilities" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.374226 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="44e82195-9b77-45ac-a6d8-c45c7a49d1a5" containerName="extract-utilities" Mar 18 13:30:00 crc kubenswrapper[4843]: E0318 13:30:00.374240 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44e82195-9b77-45ac-a6d8-c45c7a49d1a5" containerName="extract-content" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.374249 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="44e82195-9b77-45ac-a6d8-c45c7a49d1a5" containerName="extract-content" Mar 18 13:30:00 crc kubenswrapper[4843]: E0318 13:30:00.374281 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44e82195-9b77-45ac-a6d8-c45c7a49d1a5" containerName="registry-server" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.374290 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="44e82195-9b77-45ac-a6d8-c45c7a49d1a5" containerName="registry-server" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.374586 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="44e82195-9b77-45ac-a6d8-c45c7a49d1a5" containerName="registry-server" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.375690 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564010-726q7" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.377899 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.378216 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.378677 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.388701 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564010-rpclj"] Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.390059 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-rpclj" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.404801 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.405165 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.407820 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564010-726q7"] Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.423259 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564010-rpclj"] Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.442952 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/76083e12-0c8d-4d19-a6bf-94df8de8cba4-secret-volume\") pod \"collect-profiles-29564010-rpclj\" (UID: \"76083e12-0c8d-4d19-a6bf-94df8de8cba4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-rpclj" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.443141 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76083e12-0c8d-4d19-a6bf-94df8de8cba4-config-volume\") pod \"collect-profiles-29564010-rpclj\" (UID: \"76083e12-0c8d-4d19-a6bf-94df8de8cba4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-rpclj" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.443181 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvwqs\" (UniqueName: \"kubernetes.io/projected/d57360d1-f8c3-4af7-958d-f5cf1308d69c-kube-api-access-kvwqs\") pod \"auto-csr-approver-29564010-726q7\" (UID: \"d57360d1-f8c3-4af7-958d-f5cf1308d69c\") " pod="openshift-infra/auto-csr-approver-29564010-726q7" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.443515 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfsls\" (UniqueName: \"kubernetes.io/projected/76083e12-0c8d-4d19-a6bf-94df8de8cba4-kube-api-access-wfsls\") pod \"collect-profiles-29564010-rpclj\" (UID: \"76083e12-0c8d-4d19-a6bf-94df8de8cba4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-rpclj" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.545544 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvwqs\" (UniqueName: \"kubernetes.io/projected/d57360d1-f8c3-4af7-958d-f5cf1308d69c-kube-api-access-kvwqs\") pod \"auto-csr-approver-29564010-726q7\" (UID: \"d57360d1-f8c3-4af7-958d-f5cf1308d69c\") " pod="openshift-infra/auto-csr-approver-29564010-726q7" Mar 18 
13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.547258 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfsls\" (UniqueName: \"kubernetes.io/projected/76083e12-0c8d-4d19-a6bf-94df8de8cba4-kube-api-access-wfsls\") pod \"collect-profiles-29564010-rpclj\" (UID: \"76083e12-0c8d-4d19-a6bf-94df8de8cba4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-rpclj" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.547469 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76083e12-0c8d-4d19-a6bf-94df8de8cba4-secret-volume\") pod \"collect-profiles-29564010-rpclj\" (UID: \"76083e12-0c8d-4d19-a6bf-94df8de8cba4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-rpclj" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.547803 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76083e12-0c8d-4d19-a6bf-94df8de8cba4-config-volume\") pod \"collect-profiles-29564010-rpclj\" (UID: \"76083e12-0c8d-4d19-a6bf-94df8de8cba4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-rpclj" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.549140 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76083e12-0c8d-4d19-a6bf-94df8de8cba4-config-volume\") pod \"collect-profiles-29564010-rpclj\" (UID: \"76083e12-0c8d-4d19-a6bf-94df8de8cba4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-rpclj" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.559746 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76083e12-0c8d-4d19-a6bf-94df8de8cba4-secret-volume\") pod \"collect-profiles-29564010-rpclj\" (UID: 
\"76083e12-0c8d-4d19-a6bf-94df8de8cba4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-rpclj" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.564597 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfsls\" (UniqueName: \"kubernetes.io/projected/76083e12-0c8d-4d19-a6bf-94df8de8cba4-kube-api-access-wfsls\") pod \"collect-profiles-29564010-rpclj\" (UID: \"76083e12-0c8d-4d19-a6bf-94df8de8cba4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-rpclj" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.565141 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvwqs\" (UniqueName: \"kubernetes.io/projected/d57360d1-f8c3-4af7-958d-f5cf1308d69c-kube-api-access-kvwqs\") pod \"auto-csr-approver-29564010-726q7\" (UID: \"d57360d1-f8c3-4af7-958d-f5cf1308d69c\") " pod="openshift-infra/auto-csr-approver-29564010-726q7" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.723531 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564010-726q7" Mar 18 13:30:00 crc kubenswrapper[4843]: I0318 13:30:00.736770 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-rpclj" Mar 18 13:30:01 crc kubenswrapper[4843]: I0318 13:30:01.188391 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564010-726q7"] Mar 18 13:30:01 crc kubenswrapper[4843]: I0318 13:30:01.300148 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564010-rpclj"] Mar 18 13:30:01 crc kubenswrapper[4843]: W0318 13:30:01.301680 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76083e12_0c8d_4d19_a6bf_94df8de8cba4.slice/crio-6848596025b72df2471b027ef8fe3c78c90cf090aeaa46fc10ba7883c9df9150 WatchSource:0}: Error finding container 6848596025b72df2471b027ef8fe3c78c90cf090aeaa46fc10ba7883c9df9150: Status 404 returned error can't find the container with id 6848596025b72df2471b027ef8fe3c78c90cf090aeaa46fc10ba7883c9df9150 Mar 18 13:30:01 crc kubenswrapper[4843]: I0318 13:30:01.372429 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564010-726q7" event={"ID":"d57360d1-f8c3-4af7-958d-f5cf1308d69c","Type":"ContainerStarted","Data":"5c9a88019f307909659095dc08f11c1da6d86a0e33cdb6a362b99ab1a6cd8068"} Mar 18 13:30:01 crc kubenswrapper[4843]: I0318 13:30:01.375750 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-rpclj" event={"ID":"76083e12-0c8d-4d19-a6bf-94df8de8cba4","Type":"ContainerStarted","Data":"6848596025b72df2471b027ef8fe3c78c90cf090aeaa46fc10ba7883c9df9150"} Mar 18 13:30:02 crc kubenswrapper[4843]: I0318 13:30:02.386989 4843 generic.go:334] "Generic (PLEG): container finished" podID="76083e12-0c8d-4d19-a6bf-94df8de8cba4" containerID="39be30efa56ba2a8e5e30e78a8b2a7ba80578eaa6426e420fe71b6ec0569e605" exitCode=0 Mar 18 13:30:02 crc kubenswrapper[4843]: I0318 13:30:02.387082 
4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-rpclj" event={"ID":"76083e12-0c8d-4d19-a6bf-94df8de8cba4","Type":"ContainerDied","Data":"39be30efa56ba2a8e5e30e78a8b2a7ba80578eaa6426e420fe71b6ec0569e605"} Mar 18 13:30:03 crc kubenswrapper[4843]: I0318 13:30:03.708109 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-rpclj" Mar 18 13:30:03 crc kubenswrapper[4843]: I0318 13:30:03.711568 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfsls\" (UniqueName: \"kubernetes.io/projected/76083e12-0c8d-4d19-a6bf-94df8de8cba4-kube-api-access-wfsls\") pod \"76083e12-0c8d-4d19-a6bf-94df8de8cba4\" (UID: \"76083e12-0c8d-4d19-a6bf-94df8de8cba4\") " Mar 18 13:30:03 crc kubenswrapper[4843]: I0318 13:30:03.711613 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76083e12-0c8d-4d19-a6bf-94df8de8cba4-secret-volume\") pod \"76083e12-0c8d-4d19-a6bf-94df8de8cba4\" (UID: \"76083e12-0c8d-4d19-a6bf-94df8de8cba4\") " Mar 18 13:30:03 crc kubenswrapper[4843]: I0318 13:30:03.717495 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76083e12-0c8d-4d19-a6bf-94df8de8cba4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "76083e12-0c8d-4d19-a6bf-94df8de8cba4" (UID: "76083e12-0c8d-4d19-a6bf-94df8de8cba4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:03 crc kubenswrapper[4843]: I0318 13:30:03.717540 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76083e12-0c8d-4d19-a6bf-94df8de8cba4-kube-api-access-wfsls" (OuterVolumeSpecName: "kube-api-access-wfsls") pod "76083e12-0c8d-4d19-a6bf-94df8de8cba4" (UID: "76083e12-0c8d-4d19-a6bf-94df8de8cba4"). InnerVolumeSpecName "kube-api-access-wfsls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:30:03 crc kubenswrapper[4843]: I0318 13:30:03.812794 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76083e12-0c8d-4d19-a6bf-94df8de8cba4-config-volume\") pod \"76083e12-0c8d-4d19-a6bf-94df8de8cba4\" (UID: \"76083e12-0c8d-4d19-a6bf-94df8de8cba4\") " Mar 18 13:30:03 crc kubenswrapper[4843]: I0318 13:30:03.813404 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfsls\" (UniqueName: \"kubernetes.io/projected/76083e12-0c8d-4d19-a6bf-94df8de8cba4-kube-api-access-wfsls\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:03 crc kubenswrapper[4843]: I0318 13:30:03.813443 4843 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76083e12-0c8d-4d19-a6bf-94df8de8cba4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:03 crc kubenswrapper[4843]: I0318 13:30:03.813554 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76083e12-0c8d-4d19-a6bf-94df8de8cba4-config-volume" (OuterVolumeSpecName: "config-volume") pod "76083e12-0c8d-4d19-a6bf-94df8de8cba4" (UID: "76083e12-0c8d-4d19-a6bf-94df8de8cba4"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:30:03 crc kubenswrapper[4843]: I0318 13:30:03.915216 4843 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76083e12-0c8d-4d19-a6bf-94df8de8cba4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:04 crc kubenswrapper[4843]: I0318 13:30:04.411381 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-rpclj" event={"ID":"76083e12-0c8d-4d19-a6bf-94df8de8cba4","Type":"ContainerDied","Data":"6848596025b72df2471b027ef8fe3c78c90cf090aeaa46fc10ba7883c9df9150"} Mar 18 13:30:04 crc kubenswrapper[4843]: I0318 13:30:04.411437 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6848596025b72df2471b027ef8fe3c78c90cf090aeaa46fc10ba7883c9df9150" Mar 18 13:30:04 crc kubenswrapper[4843]: I0318 13:30:04.411489 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-rpclj" Mar 18 13:30:04 crc kubenswrapper[4843]: I0318 13:30:04.792192 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k"] Mar 18 13:30:04 crc kubenswrapper[4843]: I0318 13:30:04.800983 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563965-2mv2k"] Mar 18 13:30:04 crc kubenswrapper[4843]: I0318 13:30:04.995476 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51f70510-4798-4903-89e6-853498d3e42d" path="/var/lib/kubelet/pods/51f70510-4798-4903-89e6-853498d3e42d/volumes" Mar 18 13:30:05 crc kubenswrapper[4843]: I0318 13:30:05.425725 4843 generic.go:334] "Generic (PLEG): container finished" podID="d57360d1-f8c3-4af7-958d-f5cf1308d69c" containerID="ccadf14b5ed7c4b5ab7efe7916330533a2af70fd57e7a91feebf227de07f4fd7" exitCode=0 Mar 18 13:30:05 crc kubenswrapper[4843]: I0318 13:30:05.425887 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564010-726q7" event={"ID":"d57360d1-f8c3-4af7-958d-f5cf1308d69c","Type":"ContainerDied","Data":"ccadf14b5ed7c4b5ab7efe7916330533a2af70fd57e7a91feebf227de07f4fd7"} Mar 18 13:30:06 crc kubenswrapper[4843]: I0318 13:30:06.815065 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564010-726q7" Mar 18 13:30:06 crc kubenswrapper[4843]: I0318 13:30:06.952838 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvwqs\" (UniqueName: \"kubernetes.io/projected/d57360d1-f8c3-4af7-958d-f5cf1308d69c-kube-api-access-kvwqs\") pod \"d57360d1-f8c3-4af7-958d-f5cf1308d69c\" (UID: \"d57360d1-f8c3-4af7-958d-f5cf1308d69c\") " Mar 18 13:30:06 crc kubenswrapper[4843]: I0318 13:30:06.960598 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d57360d1-f8c3-4af7-958d-f5cf1308d69c-kube-api-access-kvwqs" (OuterVolumeSpecName: "kube-api-access-kvwqs") pod "d57360d1-f8c3-4af7-958d-f5cf1308d69c" (UID: "d57360d1-f8c3-4af7-958d-f5cf1308d69c"). InnerVolumeSpecName "kube-api-access-kvwqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:30:07 crc kubenswrapper[4843]: I0318 13:30:07.056397 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvwqs\" (UniqueName: \"kubernetes.io/projected/d57360d1-f8c3-4af7-958d-f5cf1308d69c-kube-api-access-kvwqs\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:07 crc kubenswrapper[4843]: I0318 13:30:07.478235 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564010-726q7" event={"ID":"d57360d1-f8c3-4af7-958d-f5cf1308d69c","Type":"ContainerDied","Data":"5c9a88019f307909659095dc08f11c1da6d86a0e33cdb6a362b99ab1a6cd8068"} Mar 18 13:30:07 crc kubenswrapper[4843]: I0318 13:30:07.478276 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c9a88019f307909659095dc08f11c1da6d86a0e33cdb6a362b99ab1a6cd8068" Mar 18 13:30:07 crc kubenswrapper[4843]: I0318 13:30:07.478340 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564010-726q7" Mar 18 13:30:07 crc kubenswrapper[4843]: I0318 13:30:07.874167 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564004-xtbtf"] Mar 18 13:30:07 crc kubenswrapper[4843]: I0318 13:30:07.883302 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564004-xtbtf"] Mar 18 13:30:08 crc kubenswrapper[4843]: I0318 13:30:08.025211 4843 scope.go:117] "RemoveContainer" containerID="7d5ec4e7f16d9bbb4470d6272da4f4b7db0b9ad0f164d8b2efa526527e5429a9" Mar 18 13:30:08 crc kubenswrapper[4843]: I0318 13:30:08.994720 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c4dcad9-fe20-4189-8b62-c6fd7d613838" path="/var/lib/kubelet/pods/6c4dcad9-fe20-4189-8b62-c6fd7d613838/volumes" Mar 18 13:30:50 crc kubenswrapper[4843]: I0318 13:30:50.034511 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:30:50 crc kubenswrapper[4843]: I0318 13:30:50.035244 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:31:08 crc kubenswrapper[4843]: I0318 13:31:08.106157 4843 scope.go:117] "RemoveContainer" containerID="949b327e89c7408f0c9e4a417ba9404dbd118fd9861d4b282846186f34d1e5e4" Mar 18 13:31:20 crc kubenswrapper[4843]: I0318 13:31:20.035257 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:31:20 crc kubenswrapper[4843]: I0318 13:31:20.035841 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:31:50 crc kubenswrapper[4843]: I0318 13:31:50.035256 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:31:50 crc kubenswrapper[4843]: I0318 13:31:50.035891 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:31:50 crc kubenswrapper[4843]: I0318 13:31:50.035960 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 13:31:50 crc kubenswrapper[4843]: I0318 13:31:50.036869 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"48d9422627d1f2b320842204ec90814b68898c02c8f99d441d7ad28939b9e238"} pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:31:50 crc kubenswrapper[4843]: I0318 13:31:50.036945 4843 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" containerID="cri-o://48d9422627d1f2b320842204ec90814b68898c02c8f99d441d7ad28939b9e238" gracePeriod=600 Mar 18 13:31:50 crc kubenswrapper[4843]: I0318 13:31:50.668147 4843 generic.go:334] "Generic (PLEG): container finished" podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="48d9422627d1f2b320842204ec90814b68898c02c8f99d441d7ad28939b9e238" exitCode=0 Mar 18 13:31:50 crc kubenswrapper[4843]: I0318 13:31:50.668213 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"48d9422627d1f2b320842204ec90814b68898c02c8f99d441d7ad28939b9e238"} Mar 18 13:31:50 crc kubenswrapper[4843]: I0318 13:31:50.668528 4843 scope.go:117] "RemoveContainer" containerID="36eb60065b865005338ad5d55977c92613ee4ada088f5aeeccaf6e6417ef47c0" Mar 18 13:31:51 crc kubenswrapper[4843]: I0318 13:31:51.681780 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f"} Mar 18 13:32:00 crc kubenswrapper[4843]: I0318 13:32:00.163229 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564012-p4gnp"] Mar 18 13:32:00 crc kubenswrapper[4843]: E0318 13:32:00.165194 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d57360d1-f8c3-4af7-958d-f5cf1308d69c" containerName="oc" Mar 18 13:32:00 crc kubenswrapper[4843]: I0318 13:32:00.165219 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="d57360d1-f8c3-4af7-958d-f5cf1308d69c" containerName="oc" Mar 18 13:32:00 crc 
kubenswrapper[4843]: E0318 13:32:00.165255 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76083e12-0c8d-4d19-a6bf-94df8de8cba4" containerName="collect-profiles" Mar 18 13:32:00 crc kubenswrapper[4843]: I0318 13:32:00.165264 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="76083e12-0c8d-4d19-a6bf-94df8de8cba4" containerName="collect-profiles" Mar 18 13:32:00 crc kubenswrapper[4843]: I0318 13:32:00.165534 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="d57360d1-f8c3-4af7-958d-f5cf1308d69c" containerName="oc" Mar 18 13:32:00 crc kubenswrapper[4843]: I0318 13:32:00.165590 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="76083e12-0c8d-4d19-a6bf-94df8de8cba4" containerName="collect-profiles" Mar 18 13:32:00 crc kubenswrapper[4843]: I0318 13:32:00.166777 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564012-p4gnp" Mar 18 13:32:00 crc kubenswrapper[4843]: I0318 13:32:00.170263 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:32:00 crc kubenswrapper[4843]: I0318 13:32:00.170510 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:32:00 crc kubenswrapper[4843]: I0318 13:32:00.170710 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 13:32:00 crc kubenswrapper[4843]: I0318 13:32:00.176103 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564012-p4gnp"] Mar 18 13:32:00 crc kubenswrapper[4843]: I0318 13:32:00.306162 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs2n6\" (UniqueName: \"kubernetes.io/projected/47bf1a53-8b55-491a-82c9-7ce936932faa-kube-api-access-bs2n6\") pod 
\"auto-csr-approver-29564012-p4gnp\" (UID: \"47bf1a53-8b55-491a-82c9-7ce936932faa\") " pod="openshift-infra/auto-csr-approver-29564012-p4gnp" Mar 18 13:32:00 crc kubenswrapper[4843]: I0318 13:32:00.408872 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs2n6\" (UniqueName: \"kubernetes.io/projected/47bf1a53-8b55-491a-82c9-7ce936932faa-kube-api-access-bs2n6\") pod \"auto-csr-approver-29564012-p4gnp\" (UID: \"47bf1a53-8b55-491a-82c9-7ce936932faa\") " pod="openshift-infra/auto-csr-approver-29564012-p4gnp" Mar 18 13:32:00 crc kubenswrapper[4843]: I0318 13:32:00.430629 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs2n6\" (UniqueName: \"kubernetes.io/projected/47bf1a53-8b55-491a-82c9-7ce936932faa-kube-api-access-bs2n6\") pod \"auto-csr-approver-29564012-p4gnp\" (UID: \"47bf1a53-8b55-491a-82c9-7ce936932faa\") " pod="openshift-infra/auto-csr-approver-29564012-p4gnp" Mar 18 13:32:00 crc kubenswrapper[4843]: I0318 13:32:00.492716 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564012-p4gnp" Mar 18 13:32:00 crc kubenswrapper[4843]: I0318 13:32:00.947677 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564012-p4gnp"] Mar 18 13:32:00 crc kubenswrapper[4843]: I0318 13:32:00.965180 4843 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:32:01 crc kubenswrapper[4843]: I0318 13:32:01.922339 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564012-p4gnp" event={"ID":"47bf1a53-8b55-491a-82c9-7ce936932faa","Type":"ContainerStarted","Data":"415dbca55dbc2085ec1be41a329fa71f217c78a1edfbed8b88c0aaa50ce41c10"} Mar 18 13:32:04 crc kubenswrapper[4843]: I0318 13:32:04.997189 4843 generic.go:334] "Generic (PLEG): container finished" podID="47bf1a53-8b55-491a-82c9-7ce936932faa" containerID="ed7686f318c84c76c262c8efdd47cb3cfaded7bfbff3d145876b94aec1b2953b" exitCode=0 Mar 18 13:32:04 crc kubenswrapper[4843]: I0318 13:32:04.997765 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564012-p4gnp" event={"ID":"47bf1a53-8b55-491a-82c9-7ce936932faa","Type":"ContainerDied","Data":"ed7686f318c84c76c262c8efdd47cb3cfaded7bfbff3d145876b94aec1b2953b"} Mar 18 13:32:06 crc kubenswrapper[4843]: I0318 13:32:06.630135 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564012-p4gnp"
Mar 18 13:32:06 crc kubenswrapper[4843]: I0318 13:32:06.730142 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs2n6\" (UniqueName: \"kubernetes.io/projected/47bf1a53-8b55-491a-82c9-7ce936932faa-kube-api-access-bs2n6\") pod \"47bf1a53-8b55-491a-82c9-7ce936932faa\" (UID: \"47bf1a53-8b55-491a-82c9-7ce936932faa\") "
Mar 18 13:32:06 crc kubenswrapper[4843]: I0318 13:32:06.737995 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47bf1a53-8b55-491a-82c9-7ce936932faa-kube-api-access-bs2n6" (OuterVolumeSpecName: "kube-api-access-bs2n6") pod "47bf1a53-8b55-491a-82c9-7ce936932faa" (UID: "47bf1a53-8b55-491a-82c9-7ce936932faa"). InnerVolumeSpecName "kube-api-access-bs2n6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:32:06 crc kubenswrapper[4843]: I0318 13:32:06.833901 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs2n6\" (UniqueName: \"kubernetes.io/projected/47bf1a53-8b55-491a-82c9-7ce936932faa-kube-api-access-bs2n6\") on node \"crc\" DevicePath \"\""
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.017539 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564012-p4gnp" event={"ID":"47bf1a53-8b55-491a-82c9-7ce936932faa","Type":"ContainerDied","Data":"415dbca55dbc2085ec1be41a329fa71f217c78a1edfbed8b88c0aaa50ce41c10"}
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.017867 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="415dbca55dbc2085ec1be41a329fa71f217c78a1edfbed8b88c0aaa50ce41c10"
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.017607 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564012-p4gnp"
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.170185 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4299p"]
Mar 18 13:32:07 crc kubenswrapper[4843]: E0318 13:32:07.170734 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47bf1a53-8b55-491a-82c9-7ce936932faa" containerName="oc"
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.170757 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="47bf1a53-8b55-491a-82c9-7ce936932faa" containerName="oc"
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.171083 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="47bf1a53-8b55-491a-82c9-7ce936932faa" containerName="oc"
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.173077 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4299p"
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.191007 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4299p"]
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.241675 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b6g5\" (UniqueName: \"kubernetes.io/projected/76d62ea9-8c7a-4672-b0ad-d393959b3c09-kube-api-access-2b6g5\") pod \"community-operators-4299p\" (UID: \"76d62ea9-8c7a-4672-b0ad-d393959b3c09\") " pod="openshift-marketplace/community-operators-4299p"
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.242038 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76d62ea9-8c7a-4672-b0ad-d393959b3c09-utilities\") pod \"community-operators-4299p\" (UID: \"76d62ea9-8c7a-4672-b0ad-d393959b3c09\") " pod="openshift-marketplace/community-operators-4299p"
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.242175 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d62ea9-8c7a-4672-b0ad-d393959b3c09-catalog-content\") pod \"community-operators-4299p\" (UID: \"76d62ea9-8c7a-4672-b0ad-d393959b3c09\") " pod="openshift-marketplace/community-operators-4299p"
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.344113 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76d62ea9-8c7a-4672-b0ad-d393959b3c09-utilities\") pod \"community-operators-4299p\" (UID: \"76d62ea9-8c7a-4672-b0ad-d393959b3c09\") " pod="openshift-marketplace/community-operators-4299p"
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.344191 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d62ea9-8c7a-4672-b0ad-d393959b3c09-catalog-content\") pod \"community-operators-4299p\" (UID: \"76d62ea9-8c7a-4672-b0ad-d393959b3c09\") " pod="openshift-marketplace/community-operators-4299p"
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.344306 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b6g5\" (UniqueName: \"kubernetes.io/projected/76d62ea9-8c7a-4672-b0ad-d393959b3c09-kube-api-access-2b6g5\") pod \"community-operators-4299p\" (UID: \"76d62ea9-8c7a-4672-b0ad-d393959b3c09\") " pod="openshift-marketplace/community-operators-4299p"
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.344900 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d62ea9-8c7a-4672-b0ad-d393959b3c09-catalog-content\") pod \"community-operators-4299p\" (UID: \"76d62ea9-8c7a-4672-b0ad-d393959b3c09\") " pod="openshift-marketplace/community-operators-4299p"
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.345070 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76d62ea9-8c7a-4672-b0ad-d393959b3c09-utilities\") pod \"community-operators-4299p\" (UID: \"76d62ea9-8c7a-4672-b0ad-d393959b3c09\") " pod="openshift-marketplace/community-operators-4299p"
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.367084 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b6g5\" (UniqueName: \"kubernetes.io/projected/76d62ea9-8c7a-4672-b0ad-d393959b3c09-kube-api-access-2b6g5\") pod \"community-operators-4299p\" (UID: \"76d62ea9-8c7a-4672-b0ad-d393959b3c09\") " pod="openshift-marketplace/community-operators-4299p"
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.494272 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4299p"
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.795805 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564006-phwm8"]
Mar 18 13:32:07 crc kubenswrapper[4843]: I0318 13:32:07.806542 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564006-phwm8"]
Mar 18 13:32:08 crc kubenswrapper[4843]: I0318 13:32:08.028123 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4299p"]
Mar 18 13:32:09 crc kubenswrapper[4843]: I0318 13:32:09.028471 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b6a6ecc-776a-4f9f-b77f-06df5c0da886" path="/var/lib/kubelet/pods/9b6a6ecc-776a-4f9f-b77f-06df5c0da886/volumes"
Mar 18 13:32:09 crc kubenswrapper[4843]: I0318 13:32:09.037684 4843 generic.go:334] "Generic (PLEG): container finished" podID="76d62ea9-8c7a-4672-b0ad-d393959b3c09" containerID="aaa02113a97a60a2a447c629bdf6a4ba5e6647046cd1d7f78441c6e5281d5305" exitCode=0
Mar 18 13:32:09 crc kubenswrapper[4843]: I0318 13:32:09.037733 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4299p" event={"ID":"76d62ea9-8c7a-4672-b0ad-d393959b3c09","Type":"ContainerDied","Data":"aaa02113a97a60a2a447c629bdf6a4ba5e6647046cd1d7f78441c6e5281d5305"}
Mar 18 13:32:09 crc kubenswrapper[4843]: I0318 13:32:09.037760 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4299p" event={"ID":"76d62ea9-8c7a-4672-b0ad-d393959b3c09","Type":"ContainerStarted","Data":"7b3ee5f337b5184916511d51e1424e9a10a0ad1b1143610227ea3c2c840bf61d"}
Mar 18 13:32:11 crc kubenswrapper[4843]: I0318 13:32:11.063755 4843 generic.go:334] "Generic (PLEG): container finished" podID="76d62ea9-8c7a-4672-b0ad-d393959b3c09" containerID="2c0a5a8ecab8072d05b8db8697bb7c827bc631e446ddd7afec0ec0282d650f59" exitCode=0
Mar 18 13:32:11 crc kubenswrapper[4843]: I0318 13:32:11.063854 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4299p" event={"ID":"76d62ea9-8c7a-4672-b0ad-d393959b3c09","Type":"ContainerDied","Data":"2c0a5a8ecab8072d05b8db8697bb7c827bc631e446ddd7afec0ec0282d650f59"}
Mar 18 13:32:12 crc kubenswrapper[4843]: I0318 13:32:12.077485 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4299p" event={"ID":"76d62ea9-8c7a-4672-b0ad-d393959b3c09","Type":"ContainerStarted","Data":"bc3465d0919e794dd25c87909520140fa27dd4632cbef3e9ccafcaa1b927009b"}
Mar 18 13:32:12 crc kubenswrapper[4843]: I0318 13:32:12.110125 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4299p" podStartSLOduration=2.472840517 podStartE2EDuration="5.110074259s" podCreationTimestamp="2026-03-18 13:32:07 +0000 UTC" firstStartedPulling="2026-03-18 13:32:09.039902689 +0000 UTC m=+4962.755728213" lastFinishedPulling="2026-03-18 13:32:11.677136431 +0000 UTC m=+4965.392961955" observedRunningTime="2026-03-18 13:32:12.097632326 +0000 UTC m=+4965.813457860" watchObservedRunningTime="2026-03-18 13:32:12.110074259 +0000 UTC m=+4965.825899793"
Mar 18 13:32:17 crc kubenswrapper[4843]: I0318 13:32:17.495135 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4299p"
Mar 18 13:32:17 crc kubenswrapper[4843]: I0318 13:32:17.495725 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4299p"
Mar 18 13:32:17 crc kubenswrapper[4843]: I0318 13:32:17.542238 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4299p"
Mar 18 13:32:18 crc kubenswrapper[4843]: I0318 13:32:18.212947 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4299p"
Mar 18 13:32:18 crc kubenswrapper[4843]: I0318 13:32:18.274333 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4299p"]
Mar 18 13:32:20 crc kubenswrapper[4843]: I0318 13:32:20.184722 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4299p" podUID="76d62ea9-8c7a-4672-b0ad-d393959b3c09" containerName="registry-server" containerID="cri-o://bc3465d0919e794dd25c87909520140fa27dd4632cbef3e9ccafcaa1b927009b" gracePeriod=2
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.613913 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4299p"
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.615694 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4299p" event={"ID":"76d62ea9-8c7a-4672-b0ad-d393959b3c09","Type":"ContainerDied","Data":"bc3465d0919e794dd25c87909520140fa27dd4632cbef3e9ccafcaa1b927009b"}
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.615760 4843 scope.go:117] "RemoveContainer" containerID="bc3465d0919e794dd25c87909520140fa27dd4632cbef3e9ccafcaa1b927009b"
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.615803 4843 generic.go:334] "Generic (PLEG): container finished" podID="76d62ea9-8c7a-4672-b0ad-d393959b3c09" containerID="bc3465d0919e794dd25c87909520140fa27dd4632cbef3e9ccafcaa1b927009b" exitCode=0
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.616775 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4299p" event={"ID":"76d62ea9-8c7a-4672-b0ad-d393959b3c09","Type":"ContainerDied","Data":"7b3ee5f337b5184916511d51e1424e9a10a0ad1b1143610227ea3c2c840bf61d"}
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.652871 4843 scope.go:117] "RemoveContainer" containerID="2c0a5a8ecab8072d05b8db8697bb7c827bc631e446ddd7afec0ec0282d650f59"
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.683422 4843 scope.go:117] "RemoveContainer" containerID="aaa02113a97a60a2a447c629bdf6a4ba5e6647046cd1d7f78441c6e5281d5305"
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.713643 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76d62ea9-8c7a-4672-b0ad-d393959b3c09-utilities\") pod \"76d62ea9-8c7a-4672-b0ad-d393959b3c09\" (UID: \"76d62ea9-8c7a-4672-b0ad-d393959b3c09\") "
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.713925 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b6g5\" (UniqueName: \"kubernetes.io/projected/76d62ea9-8c7a-4672-b0ad-d393959b3c09-kube-api-access-2b6g5\") pod \"76d62ea9-8c7a-4672-b0ad-d393959b3c09\" (UID: \"76d62ea9-8c7a-4672-b0ad-d393959b3c09\") "
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.714008 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d62ea9-8c7a-4672-b0ad-d393959b3c09-catalog-content\") pod \"76d62ea9-8c7a-4672-b0ad-d393959b3c09\" (UID: \"76d62ea9-8c7a-4672-b0ad-d393959b3c09\") "
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.714732 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76d62ea9-8c7a-4672-b0ad-d393959b3c09-utilities" (OuterVolumeSpecName: "utilities") pod "76d62ea9-8c7a-4672-b0ad-d393959b3c09" (UID: "76d62ea9-8c7a-4672-b0ad-d393959b3c09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.715685 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76d62ea9-8c7a-4672-b0ad-d393959b3c09-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.728018 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76d62ea9-8c7a-4672-b0ad-d393959b3c09-kube-api-access-2b6g5" (OuterVolumeSpecName: "kube-api-access-2b6g5") pod "76d62ea9-8c7a-4672-b0ad-d393959b3c09" (UID: "76d62ea9-8c7a-4672-b0ad-d393959b3c09"). InnerVolumeSpecName "kube-api-access-2b6g5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.775507 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76d62ea9-8c7a-4672-b0ad-d393959b3c09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76d62ea9-8c7a-4672-b0ad-d393959b3c09" (UID: "76d62ea9-8c7a-4672-b0ad-d393959b3c09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.808598 4843 scope.go:117] "RemoveContainer" containerID="bc3465d0919e794dd25c87909520140fa27dd4632cbef3e9ccafcaa1b927009b"
Mar 18 13:32:22 crc kubenswrapper[4843]: E0318 13:32:22.808843 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc3465d0919e794dd25c87909520140fa27dd4632cbef3e9ccafcaa1b927009b\": container with ID starting with bc3465d0919e794dd25c87909520140fa27dd4632cbef3e9ccafcaa1b927009b not found: ID does not exist" containerID="bc3465d0919e794dd25c87909520140fa27dd4632cbef3e9ccafcaa1b927009b"
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.808871 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc3465d0919e794dd25c87909520140fa27dd4632cbef3e9ccafcaa1b927009b"} err="failed to get container status \"bc3465d0919e794dd25c87909520140fa27dd4632cbef3e9ccafcaa1b927009b\": rpc error: code = NotFound desc = could not find container \"bc3465d0919e794dd25c87909520140fa27dd4632cbef3e9ccafcaa1b927009b\": container with ID starting with bc3465d0919e794dd25c87909520140fa27dd4632cbef3e9ccafcaa1b927009b not found: ID does not exist"
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.808895 4843 scope.go:117] "RemoveContainer" containerID="2c0a5a8ecab8072d05b8db8697bb7c827bc631e446ddd7afec0ec0282d650f59"
Mar 18 13:32:22 crc kubenswrapper[4843]: E0318 13:32:22.809253 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c0a5a8ecab8072d05b8db8697bb7c827bc631e446ddd7afec0ec0282d650f59\": container with ID starting with 2c0a5a8ecab8072d05b8db8697bb7c827bc631e446ddd7afec0ec0282d650f59 not found: ID does not exist" containerID="2c0a5a8ecab8072d05b8db8697bb7c827bc631e446ddd7afec0ec0282d650f59"
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.809269 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c0a5a8ecab8072d05b8db8697bb7c827bc631e446ddd7afec0ec0282d650f59"} err="failed to get container status \"2c0a5a8ecab8072d05b8db8697bb7c827bc631e446ddd7afec0ec0282d650f59\": rpc error: code = NotFound desc = could not find container \"2c0a5a8ecab8072d05b8db8697bb7c827bc631e446ddd7afec0ec0282d650f59\": container with ID starting with 2c0a5a8ecab8072d05b8db8697bb7c827bc631e446ddd7afec0ec0282d650f59 not found: ID does not exist"
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.809282 4843 scope.go:117] "RemoveContainer" containerID="aaa02113a97a60a2a447c629bdf6a4ba5e6647046cd1d7f78441c6e5281d5305"
Mar 18 13:32:22 crc kubenswrapper[4843]: E0318 13:32:22.809610 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaa02113a97a60a2a447c629bdf6a4ba5e6647046cd1d7f78441c6e5281d5305\": container with ID starting with aaa02113a97a60a2a447c629bdf6a4ba5e6647046cd1d7f78441c6e5281d5305 not found: ID does not exist" containerID="aaa02113a97a60a2a447c629bdf6a4ba5e6647046cd1d7f78441c6e5281d5305"
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.809639 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaa02113a97a60a2a447c629bdf6a4ba5e6647046cd1d7f78441c6e5281d5305"} err="failed to get container status \"aaa02113a97a60a2a447c629bdf6a4ba5e6647046cd1d7f78441c6e5281d5305\": rpc error: code = NotFound desc = could not find container \"aaa02113a97a60a2a447c629bdf6a4ba5e6647046cd1d7f78441c6e5281d5305\": container with ID starting with aaa02113a97a60a2a447c629bdf6a4ba5e6647046cd1d7f78441c6e5281d5305 not found: ID does not exist"
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.817131 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b6g5\" (UniqueName: \"kubernetes.io/projected/76d62ea9-8c7a-4672-b0ad-d393959b3c09-kube-api-access-2b6g5\") on node \"crc\" DevicePath \"\""
Mar 18 13:32:22 crc kubenswrapper[4843]: I0318 13:32:22.817168 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d62ea9-8c7a-4672-b0ad-d393959b3c09-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 13:32:23 crc kubenswrapper[4843]: I0318 13:32:23.627389 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4299p"
Mar 18 13:32:23 crc kubenswrapper[4843]: I0318 13:32:23.654716 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4299p"]
Mar 18 13:32:23 crc kubenswrapper[4843]: I0318 13:32:23.663757 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4299p"]
Mar 18 13:32:24 crc kubenswrapper[4843]: I0318 13:32:24.999459 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76d62ea9-8c7a-4672-b0ad-d393959b3c09" path="/var/lib/kubelet/pods/76d62ea9-8c7a-4672-b0ad-d393959b3c09/volumes"
Mar 18 13:33:08 crc kubenswrapper[4843]: I0318 13:33:08.225284 4843 scope.go:117] "RemoveContainer" containerID="cecdb8b0cafadf56aba4cba269db269d53d81b1147eed3f4fc26c576c3852e4a"
Mar 18 13:33:47 crc kubenswrapper[4843]: I0318 13:33:47.674453 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k8x7b"]
Mar 18 13:33:47 crc kubenswrapper[4843]: E0318 13:33:47.675558 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d62ea9-8c7a-4672-b0ad-d393959b3c09" containerName="registry-server"
Mar 18 13:33:47 crc kubenswrapper[4843]: I0318 13:33:47.675575 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d62ea9-8c7a-4672-b0ad-d393959b3c09" containerName="registry-server"
Mar 18 13:33:47 crc kubenswrapper[4843]: E0318 13:33:47.675591 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d62ea9-8c7a-4672-b0ad-d393959b3c09" containerName="extract-content"
Mar 18 13:33:47 crc kubenswrapper[4843]: I0318 13:33:47.675600 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d62ea9-8c7a-4672-b0ad-d393959b3c09" containerName="extract-content"
Mar 18 13:33:47 crc kubenswrapper[4843]: E0318 13:33:47.675837 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d62ea9-8c7a-4672-b0ad-d393959b3c09" containerName="extract-utilities"
Mar 18 13:33:47 crc kubenswrapper[4843]: I0318 13:33:47.675852 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d62ea9-8c7a-4672-b0ad-d393959b3c09" containerName="extract-utilities"
Mar 18 13:33:47 crc kubenswrapper[4843]: I0318 13:33:47.676206 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="76d62ea9-8c7a-4672-b0ad-d393959b3c09" containerName="registry-server"
Mar 18 13:33:47 crc kubenswrapper[4843]: I0318 13:33:47.683713 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k8x7b"
Mar 18 13:33:47 crc kubenswrapper[4843]: I0318 13:33:47.684917 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k8x7b"]
Mar 18 13:33:47 crc kubenswrapper[4843]: I0318 13:33:47.861900 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9f66\" (UniqueName: \"kubernetes.io/projected/6555b595-97d4-4712-92d6-d820574a0d36-kube-api-access-t9f66\") pod \"redhat-marketplace-k8x7b\" (UID: \"6555b595-97d4-4712-92d6-d820574a0d36\") " pod="openshift-marketplace/redhat-marketplace-k8x7b"
Mar 18 13:33:47 crc kubenswrapper[4843]: I0318 13:33:47.863773 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6555b595-97d4-4712-92d6-d820574a0d36-utilities\") pod \"redhat-marketplace-k8x7b\" (UID: \"6555b595-97d4-4712-92d6-d820574a0d36\") " pod="openshift-marketplace/redhat-marketplace-k8x7b"
Mar 18 13:33:47 crc kubenswrapper[4843]: I0318 13:33:47.864026 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6555b595-97d4-4712-92d6-d820574a0d36-catalog-content\") pod \"redhat-marketplace-k8x7b\" (UID: \"6555b595-97d4-4712-92d6-d820574a0d36\") " pod="openshift-marketplace/redhat-marketplace-k8x7b"
Mar 18 13:33:47 crc kubenswrapper[4843]: I0318 13:33:47.966266 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6555b595-97d4-4712-92d6-d820574a0d36-utilities\") pod \"redhat-marketplace-k8x7b\" (UID: \"6555b595-97d4-4712-92d6-d820574a0d36\") " pod="openshift-marketplace/redhat-marketplace-k8x7b"
Mar 18 13:33:47 crc kubenswrapper[4843]: I0318 13:33:47.966629 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6555b595-97d4-4712-92d6-d820574a0d36-catalog-content\") pod \"redhat-marketplace-k8x7b\" (UID: \"6555b595-97d4-4712-92d6-d820574a0d36\") " pod="openshift-marketplace/redhat-marketplace-k8x7b"
Mar 18 13:33:47 crc kubenswrapper[4843]: I0318 13:33:47.967159 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9f66\" (UniqueName: \"kubernetes.io/projected/6555b595-97d4-4712-92d6-d820574a0d36-kube-api-access-t9f66\") pod \"redhat-marketplace-k8x7b\" (UID: \"6555b595-97d4-4712-92d6-d820574a0d36\") " pod="openshift-marketplace/redhat-marketplace-k8x7b"
Mar 18 13:33:47 crc kubenswrapper[4843]: I0318 13:33:47.967256 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6555b595-97d4-4712-92d6-d820574a0d36-catalog-content\") pod \"redhat-marketplace-k8x7b\" (UID: \"6555b595-97d4-4712-92d6-d820574a0d36\") " pod="openshift-marketplace/redhat-marketplace-k8x7b"
Mar 18 13:33:47 crc kubenswrapper[4843]: I0318 13:33:47.966985 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6555b595-97d4-4712-92d6-d820574a0d36-utilities\") pod \"redhat-marketplace-k8x7b\" (UID: \"6555b595-97d4-4712-92d6-d820574a0d36\") " pod="openshift-marketplace/redhat-marketplace-k8x7b"
Mar 18 13:33:47 crc kubenswrapper[4843]: I0318 13:33:47.995787 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9f66\" (UniqueName: \"kubernetes.io/projected/6555b595-97d4-4712-92d6-d820574a0d36-kube-api-access-t9f66\") pod \"redhat-marketplace-k8x7b\" (UID: \"6555b595-97d4-4712-92d6-d820574a0d36\") " pod="openshift-marketplace/redhat-marketplace-k8x7b"
Mar 18 13:33:48 crc kubenswrapper[4843]: I0318 13:33:48.064228 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k8x7b"
Mar 18 13:33:49 crc kubenswrapper[4843]: I0318 13:33:49.174720 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k8x7b"]
Mar 18 13:33:49 crc kubenswrapper[4843]: I0318 13:33:49.829703 4843 generic.go:334] "Generic (PLEG): container finished" podID="6555b595-97d4-4712-92d6-d820574a0d36" containerID="75eb6ce6fa38433f9cf0b2b5b9170ff9a9f327cb340b7d86cff245948b89f873" exitCode=0
Mar 18 13:33:49 crc kubenswrapper[4843]: I0318 13:33:49.829763 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8x7b" event={"ID":"6555b595-97d4-4712-92d6-d820574a0d36","Type":"ContainerDied","Data":"75eb6ce6fa38433f9cf0b2b5b9170ff9a9f327cb340b7d86cff245948b89f873"}
Mar 18 13:33:49 crc kubenswrapper[4843]: I0318 13:33:49.829966 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8x7b" event={"ID":"6555b595-97d4-4712-92d6-d820574a0d36","Type":"ContainerStarted","Data":"50687601a9810219874b4443237453aea5e8592bf1b3a42e7875f76bbd49749d"}
Mar 18 13:33:50 crc kubenswrapper[4843]: I0318 13:33:50.035183 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:33:50 crc kubenswrapper[4843]: I0318 13:33:50.035309 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:33:51 crc kubenswrapper[4843]: I0318 13:33:51.847726 4843 generic.go:334] "Generic (PLEG): container finished" podID="6555b595-97d4-4712-92d6-d820574a0d36" containerID="47986d57bd2c37a7dca78b8d2835332255da8e844b9eb57f48a0ec8d3e8d1b9b" exitCode=0
Mar 18 13:33:51 crc kubenswrapper[4843]: I0318 13:33:51.847940 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8x7b" event={"ID":"6555b595-97d4-4712-92d6-d820574a0d36","Type":"ContainerDied","Data":"47986d57bd2c37a7dca78b8d2835332255da8e844b9eb57f48a0ec8d3e8d1b9b"}
Mar 18 13:33:52 crc kubenswrapper[4843]: I0318 13:33:52.860098 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8x7b" event={"ID":"6555b595-97d4-4712-92d6-d820574a0d36","Type":"ContainerStarted","Data":"6f65cda50a79173bda1d4f7512797da06cbd612e8cdfecc5ff3e50cf3e7dcfaa"}
Mar 18 13:33:52 crc kubenswrapper[4843]: I0318 13:33:52.902116 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k8x7b" podStartSLOduration=3.264823687 podStartE2EDuration="5.902063129s" podCreationTimestamp="2026-03-18 13:33:47 +0000 UTC" firstStartedPulling="2026-03-18 13:33:49.831950482 +0000 UTC m=+5063.547776006" lastFinishedPulling="2026-03-18 13:33:52.469189924 +0000 UTC m=+5066.185015448" observedRunningTime="2026-03-18 13:33:52.894295209 +0000 UTC m=+5066.610120733" watchObservedRunningTime="2026-03-18 13:33:52.902063129 +0000 UTC m=+5066.617888653"
Mar 18 13:33:58 crc kubenswrapper[4843]: I0318 13:33:58.065976 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k8x7b"
Mar 18 13:33:58 crc kubenswrapper[4843]: I0318 13:33:58.066492 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k8x7b"
Mar 18 13:33:58 crc kubenswrapper[4843]: I0318 13:33:58.127596 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k8x7b"
Mar 18 13:33:59 crc kubenswrapper[4843]: I0318 13:33:59.438032 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k8x7b"
Mar 18 13:34:00 crc kubenswrapper[4843]: I0318 13:34:00.153351 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564014-kmr7n"]
Mar 18 13:34:00 crc kubenswrapper[4843]: I0318 13:34:00.155410 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564014-kmr7n"
Mar 18 13:34:00 crc kubenswrapper[4843]: I0318 13:34:00.157823 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:34:00 crc kubenswrapper[4843]: I0318 13:34:00.158923 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:34:00 crc kubenswrapper[4843]: I0318 13:34:00.162842 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk"
Mar 18 13:34:00 crc kubenswrapper[4843]: I0318 13:34:00.167408 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564014-kmr7n"]
Mar 18 13:34:00 crc kubenswrapper[4843]: I0318 13:34:00.251329 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k8x7b"]
Mar 18 13:34:00 crc kubenswrapper[4843]: I0318 13:34:00.316019 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm55r\" (UniqueName: \"kubernetes.io/projected/9f8b955e-7dbb-422e-9924-a4bbf836a579-kube-api-access-lm55r\") pod \"auto-csr-approver-29564014-kmr7n\" (UID: \"9f8b955e-7dbb-422e-9924-a4bbf836a579\") " pod="openshift-infra/auto-csr-approver-29564014-kmr7n"
Mar 18 13:34:00 crc kubenswrapper[4843]: I0318 13:34:00.418232 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm55r\" (UniqueName: \"kubernetes.io/projected/9f8b955e-7dbb-422e-9924-a4bbf836a579-kube-api-access-lm55r\") pod \"auto-csr-approver-29564014-kmr7n\" (UID: \"9f8b955e-7dbb-422e-9924-a4bbf836a579\") " pod="openshift-infra/auto-csr-approver-29564014-kmr7n"
Mar 18 13:34:00 crc kubenswrapper[4843]: I0318 13:34:00.437110 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm55r\" (UniqueName: \"kubernetes.io/projected/9f8b955e-7dbb-422e-9924-a4bbf836a579-kube-api-access-lm55r\") pod \"auto-csr-approver-29564014-kmr7n\" (UID: \"9f8b955e-7dbb-422e-9924-a4bbf836a579\") " pod="openshift-infra/auto-csr-approver-29564014-kmr7n"
Mar 18 13:34:00 crc kubenswrapper[4843]: I0318 13:34:00.499613 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564014-kmr7n"
Mar 18 13:34:00 crc kubenswrapper[4843]: I0318 13:34:00.938342 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k8x7b" podUID="6555b595-97d4-4712-92d6-d820574a0d36" containerName="registry-server" containerID="cri-o://6f65cda50a79173bda1d4f7512797da06cbd612e8cdfecc5ff3e50cf3e7dcfaa" gracePeriod=2
Mar 18 13:34:00 crc kubenswrapper[4843]: I0318 13:34:00.994365 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564014-kmr7n"]
Mar 18 13:34:01 crc kubenswrapper[4843]: I0318 13:34:01.319710 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k8x7b"
Mar 18 13:34:01 crc kubenswrapper[4843]: I0318 13:34:01.449240 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9f66\" (UniqueName: \"kubernetes.io/projected/6555b595-97d4-4712-92d6-d820574a0d36-kube-api-access-t9f66\") pod \"6555b595-97d4-4712-92d6-d820574a0d36\" (UID: \"6555b595-97d4-4712-92d6-d820574a0d36\") "
Mar 18 13:34:01 crc kubenswrapper[4843]: I0318 13:34:01.449335 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6555b595-97d4-4712-92d6-d820574a0d36-catalog-content\") pod \"6555b595-97d4-4712-92d6-d820574a0d36\" (UID: \"6555b595-97d4-4712-92d6-d820574a0d36\") "
Mar 18 13:34:01 crc kubenswrapper[4843]: I0318 13:34:01.449678 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6555b595-97d4-4712-92d6-d820574a0d36-utilities\") pod \"6555b595-97d4-4712-92d6-d820574a0d36\" (UID: \"6555b595-97d4-4712-92d6-d820574a0d36\") "
Mar 18 13:34:01 crc kubenswrapper[4843]: I0318 13:34:01.450837 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6555b595-97d4-4712-92d6-d820574a0d36-utilities" (OuterVolumeSpecName: "utilities") pod "6555b595-97d4-4712-92d6-d820574a0d36" (UID: "6555b595-97d4-4712-92d6-d820574a0d36"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:34:01 crc kubenswrapper[4843]: I0318 13:34:01.457064 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6555b595-97d4-4712-92d6-d820574a0d36-kube-api-access-t9f66" (OuterVolumeSpecName: "kube-api-access-t9f66") pod "6555b595-97d4-4712-92d6-d820574a0d36" (UID: "6555b595-97d4-4712-92d6-d820574a0d36"). InnerVolumeSpecName "kube-api-access-t9f66". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:34:01 crc kubenswrapper[4843]: I0318 13:34:01.484643 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6555b595-97d4-4712-92d6-d820574a0d36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6555b595-97d4-4712-92d6-d820574a0d36" (UID: "6555b595-97d4-4712-92d6-d820574a0d36"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:34:01 crc kubenswrapper[4843]: I0318 13:34:01.625108 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9f66\" (UniqueName: \"kubernetes.io/projected/6555b595-97d4-4712-92d6-d820574a0d36-kube-api-access-t9f66\") on node \"crc\" DevicePath \"\""
Mar 18 13:34:01 crc kubenswrapper[4843]: I0318 13:34:01.625139 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6555b595-97d4-4712-92d6-d820574a0d36-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 13:34:01 crc kubenswrapper[4843]: I0318 13:34:01.625152 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6555b595-97d4-4712-92d6-d820574a0d36-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 13:34:01 crc kubenswrapper[4843]: I0318 13:34:01.947987 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564014-kmr7n" event={"ID":"9f8b955e-7dbb-422e-9924-a4bbf836a579","Type":"ContainerStarted","Data":"20e727245659a90df5d7060d372c1c54d2b624c97a5578c59894fed91f5824db"}
Mar 18 13:34:01 crc kubenswrapper[4843]: I0318 13:34:01.950219 4843 generic.go:334] "Generic (PLEG): container finished" podID="6555b595-97d4-4712-92d6-d820574a0d36" containerID="6f65cda50a79173bda1d4f7512797da06cbd612e8cdfecc5ff3e50cf3e7dcfaa" exitCode=0
Mar 18 13:34:01 crc kubenswrapper[4843]: I0318 13:34:01.950241 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8x7b" event={"ID":"6555b595-97d4-4712-92d6-d820574a0d36","Type":"ContainerDied","Data":"6f65cda50a79173bda1d4f7512797da06cbd612e8cdfecc5ff3e50cf3e7dcfaa"}
Mar 18 13:34:01 crc kubenswrapper[4843]: I0318 13:34:01.950386 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8x7b" event={"ID":"6555b595-97d4-4712-92d6-d820574a0d36","Type":"ContainerDied","Data":"50687601a9810219874b4443237453aea5e8592bf1b3a42e7875f76bbd49749d"}
Mar 18 13:34:01 crc kubenswrapper[4843]: I0318 13:34:01.950375 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k8x7b"
Mar 18 13:34:01 crc kubenswrapper[4843]: I0318 13:34:01.950406 4843 scope.go:117] "RemoveContainer" containerID="6f65cda50a79173bda1d4f7512797da06cbd612e8cdfecc5ff3e50cf3e7dcfaa"
Mar 18 13:34:01 crc kubenswrapper[4843]: I0318 13:34:01.971619 4843 scope.go:117] "RemoveContainer" containerID="47986d57bd2c37a7dca78b8d2835332255da8e844b9eb57f48a0ec8d3e8d1b9b"
Mar 18 13:34:01 crc kubenswrapper[4843]: I0318 13:34:01.996790 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k8x7b"]
Mar 18 13:34:02 crc kubenswrapper[4843]: I0318 13:34:02.004905 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k8x7b"]
Mar 18 13:34:02 crc kubenswrapper[4843]: I0318 13:34:02.012328 4843 scope.go:117] "RemoveContainer" containerID="75eb6ce6fa38433f9cf0b2b5b9170ff9a9f327cb340b7d86cff245948b89f873"
Mar 18 13:34:02 crc kubenswrapper[4843]: I0318 13:34:02.052194 4843 scope.go:117] "RemoveContainer" containerID="6f65cda50a79173bda1d4f7512797da06cbd612e8cdfecc5ff3e50cf3e7dcfaa"
Mar 18 13:34:02 crc kubenswrapper[4843]: E0318 13:34:02.052801 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container
\"6f65cda50a79173bda1d4f7512797da06cbd612e8cdfecc5ff3e50cf3e7dcfaa\": container with ID starting with 6f65cda50a79173bda1d4f7512797da06cbd612e8cdfecc5ff3e50cf3e7dcfaa not found: ID does not exist" containerID="6f65cda50a79173bda1d4f7512797da06cbd612e8cdfecc5ff3e50cf3e7dcfaa" Mar 18 13:34:02 crc kubenswrapper[4843]: I0318 13:34:02.052950 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f65cda50a79173bda1d4f7512797da06cbd612e8cdfecc5ff3e50cf3e7dcfaa"} err="failed to get container status \"6f65cda50a79173bda1d4f7512797da06cbd612e8cdfecc5ff3e50cf3e7dcfaa\": rpc error: code = NotFound desc = could not find container \"6f65cda50a79173bda1d4f7512797da06cbd612e8cdfecc5ff3e50cf3e7dcfaa\": container with ID starting with 6f65cda50a79173bda1d4f7512797da06cbd612e8cdfecc5ff3e50cf3e7dcfaa not found: ID does not exist" Mar 18 13:34:02 crc kubenswrapper[4843]: I0318 13:34:02.053057 4843 scope.go:117] "RemoveContainer" containerID="47986d57bd2c37a7dca78b8d2835332255da8e844b9eb57f48a0ec8d3e8d1b9b" Mar 18 13:34:02 crc kubenswrapper[4843]: E0318 13:34:02.053434 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47986d57bd2c37a7dca78b8d2835332255da8e844b9eb57f48a0ec8d3e8d1b9b\": container with ID starting with 47986d57bd2c37a7dca78b8d2835332255da8e844b9eb57f48a0ec8d3e8d1b9b not found: ID does not exist" containerID="47986d57bd2c37a7dca78b8d2835332255da8e844b9eb57f48a0ec8d3e8d1b9b" Mar 18 13:34:02 crc kubenswrapper[4843]: I0318 13:34:02.053462 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47986d57bd2c37a7dca78b8d2835332255da8e844b9eb57f48a0ec8d3e8d1b9b"} err="failed to get container status \"47986d57bd2c37a7dca78b8d2835332255da8e844b9eb57f48a0ec8d3e8d1b9b\": rpc error: code = NotFound desc = could not find container \"47986d57bd2c37a7dca78b8d2835332255da8e844b9eb57f48a0ec8d3e8d1b9b\": container with ID 
starting with 47986d57bd2c37a7dca78b8d2835332255da8e844b9eb57f48a0ec8d3e8d1b9b not found: ID does not exist" Mar 18 13:34:02 crc kubenswrapper[4843]: I0318 13:34:02.053484 4843 scope.go:117] "RemoveContainer" containerID="75eb6ce6fa38433f9cf0b2b5b9170ff9a9f327cb340b7d86cff245948b89f873" Mar 18 13:34:02 crc kubenswrapper[4843]: E0318 13:34:02.053779 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75eb6ce6fa38433f9cf0b2b5b9170ff9a9f327cb340b7d86cff245948b89f873\": container with ID starting with 75eb6ce6fa38433f9cf0b2b5b9170ff9a9f327cb340b7d86cff245948b89f873 not found: ID does not exist" containerID="75eb6ce6fa38433f9cf0b2b5b9170ff9a9f327cb340b7d86cff245948b89f873" Mar 18 13:34:02 crc kubenswrapper[4843]: I0318 13:34:02.053811 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75eb6ce6fa38433f9cf0b2b5b9170ff9a9f327cb340b7d86cff245948b89f873"} err="failed to get container status \"75eb6ce6fa38433f9cf0b2b5b9170ff9a9f327cb340b7d86cff245948b89f873\": rpc error: code = NotFound desc = could not find container \"75eb6ce6fa38433f9cf0b2b5b9170ff9a9f327cb340b7d86cff245948b89f873\": container with ID starting with 75eb6ce6fa38433f9cf0b2b5b9170ff9a9f327cb340b7d86cff245948b89f873 not found: ID does not exist" Mar 18 13:34:02 crc kubenswrapper[4843]: I0318 13:34:02.959209 4843 generic.go:334] "Generic (PLEG): container finished" podID="9f8b955e-7dbb-422e-9924-a4bbf836a579" containerID="ec9706175d8e9936451caa65c278bce56918d4a65e792237e27c640f66485d60" exitCode=0 Mar 18 13:34:02 crc kubenswrapper[4843]: I0318 13:34:02.959282 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564014-kmr7n" event={"ID":"9f8b955e-7dbb-422e-9924-a4bbf836a579","Type":"ContainerDied","Data":"ec9706175d8e9936451caa65c278bce56918d4a65e792237e27c640f66485d60"} Mar 18 13:34:02 crc kubenswrapper[4843]: I0318 13:34:02.996164 4843 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6555b595-97d4-4712-92d6-d820574a0d36" path="/var/lib/kubelet/pods/6555b595-97d4-4712-92d6-d820574a0d36/volumes" Mar 18 13:34:04 crc kubenswrapper[4843]: I0318 13:34:04.296210 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564014-kmr7n" Mar 18 13:34:04 crc kubenswrapper[4843]: I0318 13:34:04.405205 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm55r\" (UniqueName: \"kubernetes.io/projected/9f8b955e-7dbb-422e-9924-a4bbf836a579-kube-api-access-lm55r\") pod \"9f8b955e-7dbb-422e-9924-a4bbf836a579\" (UID: \"9f8b955e-7dbb-422e-9924-a4bbf836a579\") " Mar 18 13:34:04 crc kubenswrapper[4843]: I0318 13:34:04.412501 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f8b955e-7dbb-422e-9924-a4bbf836a579-kube-api-access-lm55r" (OuterVolumeSpecName: "kube-api-access-lm55r") pod "9f8b955e-7dbb-422e-9924-a4bbf836a579" (UID: "9f8b955e-7dbb-422e-9924-a4bbf836a579"). InnerVolumeSpecName "kube-api-access-lm55r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:34:04 crc kubenswrapper[4843]: I0318 13:34:04.508048 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm55r\" (UniqueName: \"kubernetes.io/projected/9f8b955e-7dbb-422e-9924-a4bbf836a579-kube-api-access-lm55r\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:04 crc kubenswrapper[4843]: I0318 13:34:04.980871 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564014-kmr7n" event={"ID":"9f8b955e-7dbb-422e-9924-a4bbf836a579","Type":"ContainerDied","Data":"20e727245659a90df5d7060d372c1c54d2b624c97a5578c59894fed91f5824db"} Mar 18 13:34:04 crc kubenswrapper[4843]: I0318 13:34:04.980943 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564014-kmr7n" Mar 18 13:34:04 crc kubenswrapper[4843]: I0318 13:34:04.980948 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20e727245659a90df5d7060d372c1c54d2b624c97a5578c59894fed91f5824db" Mar 18 13:34:05 crc kubenswrapper[4843]: I0318 13:34:05.374595 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564008-mnt7p"] Mar 18 13:34:05 crc kubenswrapper[4843]: I0318 13:34:05.388454 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564008-mnt7p"] Mar 18 13:34:06 crc kubenswrapper[4843]: I0318 13:34:06.995777 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d261b82-2683-4060-9190-d2f67e530dec" path="/var/lib/kubelet/pods/8d261b82-2683-4060-9190-d2f67e530dec/volumes" Mar 18 13:34:08 crc kubenswrapper[4843]: I0318 13:34:08.318766 4843 scope.go:117] "RemoveContainer" containerID="86619cf95f3e0ad0922344a0afcd35d19f4da71eaabe58b1bdf8c7c3988fad86" Mar 18 13:34:20 crc kubenswrapper[4843]: I0318 13:34:20.035517 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:34:20 crc kubenswrapper[4843]: I0318 13:34:20.036361 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:34:50 crc kubenswrapper[4843]: I0318 13:34:50.035001 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:34:50 crc kubenswrapper[4843]: I0318 13:34:50.035610 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:34:50 crc kubenswrapper[4843]: I0318 13:34:50.035693 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 13:34:50 crc kubenswrapper[4843]: I0318 13:34:50.036620 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f"} pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:34:50 crc kubenswrapper[4843]: I0318 13:34:50.036745 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" containerID="cri-o://ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f" gracePeriod=600 Mar 18 13:34:50 crc kubenswrapper[4843]: E0318 13:34:50.162745 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:34:50 crc kubenswrapper[4843]: I0318 13:34:50.446616 4843 generic.go:334] "Generic (PLEG): container finished" podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f" exitCode=0 Mar 18 13:34:50 crc kubenswrapper[4843]: I0318 13:34:50.446696 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f"} Mar 18 13:34:50 crc kubenswrapper[4843]: I0318 13:34:50.446755 4843 scope.go:117] "RemoveContainer" containerID="48d9422627d1f2b320842204ec90814b68898c02c8f99d441d7ad28939b9e238" Mar 18 13:34:50 crc kubenswrapper[4843]: I0318 13:34:50.447420 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f" Mar 18 13:34:50 crc kubenswrapper[4843]: E0318 13:34:50.447728 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:35:00 crc kubenswrapper[4843]: I0318 13:35:00.984601 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f" Mar 18 13:35:00 crc kubenswrapper[4843]: E0318 13:35:00.985614 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:35:15 crc kubenswrapper[4843]: I0318 13:35:15.984268 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f" Mar 18 13:35:15 crc kubenswrapper[4843]: E0318 13:35:15.985216 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:35:29 crc kubenswrapper[4843]: I0318 13:35:29.984615 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f" Mar 18 13:35:29 crc kubenswrapper[4843]: E0318 13:35:29.986848 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:35:43 crc kubenswrapper[4843]: I0318 13:35:43.984259 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f" Mar 18 13:35:43 crc kubenswrapper[4843]: E0318 13:35:43.985284 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:35:57 crc kubenswrapper[4843]: I0318 13:35:56.999797 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f" Mar 18 13:35:57 crc kubenswrapper[4843]: E0318 13:35:57.000582 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:36:00 crc kubenswrapper[4843]: I0318 13:36:00.167492 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564016-qmxll"] Mar 18 13:36:00 crc kubenswrapper[4843]: E0318 13:36:00.168276 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6555b595-97d4-4712-92d6-d820574a0d36" containerName="extract-content" Mar 18 13:36:00 crc kubenswrapper[4843]: I0318 13:36:00.168290 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="6555b595-97d4-4712-92d6-d820574a0d36" containerName="extract-content" Mar 18 13:36:00 crc kubenswrapper[4843]: E0318 13:36:00.168329 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f8b955e-7dbb-422e-9924-a4bbf836a579" containerName="oc" Mar 18 13:36:00 crc kubenswrapper[4843]: I0318 13:36:00.168335 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f8b955e-7dbb-422e-9924-a4bbf836a579" containerName="oc" Mar 18 13:36:00 crc kubenswrapper[4843]: E0318 13:36:00.168350 4843 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6555b595-97d4-4712-92d6-d820574a0d36" containerName="registry-server" Mar 18 13:36:00 crc kubenswrapper[4843]: I0318 13:36:00.168358 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="6555b595-97d4-4712-92d6-d820574a0d36" containerName="registry-server" Mar 18 13:36:00 crc kubenswrapper[4843]: E0318 13:36:00.168375 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6555b595-97d4-4712-92d6-d820574a0d36" containerName="extract-utilities" Mar 18 13:36:00 crc kubenswrapper[4843]: I0318 13:36:00.168382 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="6555b595-97d4-4712-92d6-d820574a0d36" containerName="extract-utilities" Mar 18 13:36:00 crc kubenswrapper[4843]: I0318 13:36:00.168609 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="6555b595-97d4-4712-92d6-d820574a0d36" containerName="registry-server" Mar 18 13:36:00 crc kubenswrapper[4843]: I0318 13:36:00.168625 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f8b955e-7dbb-422e-9924-a4bbf836a579" containerName="oc" Mar 18 13:36:00 crc kubenswrapper[4843]: I0318 13:36:00.169502 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564016-qmxll" Mar 18 13:36:00 crc kubenswrapper[4843]: I0318 13:36:00.172985 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 13:36:00 crc kubenswrapper[4843]: I0318 13:36:00.172994 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:36:00 crc kubenswrapper[4843]: I0318 13:36:00.173036 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:36:00 crc kubenswrapper[4843]: I0318 13:36:00.176137 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtvp6\" (UniqueName: \"kubernetes.io/projected/975fe054-dd8f-4634-b17d-d3e9c218c439-kube-api-access-jtvp6\") pod \"auto-csr-approver-29564016-qmxll\" (UID: \"975fe054-dd8f-4634-b17d-d3e9c218c439\") " pod="openshift-infra/auto-csr-approver-29564016-qmxll" Mar 18 13:36:00 crc kubenswrapper[4843]: I0318 13:36:00.178139 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564016-qmxll"] Mar 18 13:36:00 crc kubenswrapper[4843]: I0318 13:36:00.288410 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtvp6\" (UniqueName: \"kubernetes.io/projected/975fe054-dd8f-4634-b17d-d3e9c218c439-kube-api-access-jtvp6\") pod \"auto-csr-approver-29564016-qmxll\" (UID: \"975fe054-dd8f-4634-b17d-d3e9c218c439\") " pod="openshift-infra/auto-csr-approver-29564016-qmxll" Mar 18 13:36:00 crc kubenswrapper[4843]: I0318 13:36:00.311772 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtvp6\" (UniqueName: \"kubernetes.io/projected/975fe054-dd8f-4634-b17d-d3e9c218c439-kube-api-access-jtvp6\") pod \"auto-csr-approver-29564016-qmxll\" (UID: \"975fe054-dd8f-4634-b17d-d3e9c218c439\") " 
pod="openshift-infra/auto-csr-approver-29564016-qmxll" Mar 18 13:36:00 crc kubenswrapper[4843]: I0318 13:36:00.498166 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564016-qmxll" Mar 18 13:36:00 crc kubenswrapper[4843]: I0318 13:36:00.949106 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564016-qmxll"] Mar 18 13:36:01 crc kubenswrapper[4843]: I0318 13:36:01.830977 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564016-qmxll" event={"ID":"975fe054-dd8f-4634-b17d-d3e9c218c439","Type":"ContainerStarted","Data":"3012d7a8c086b838f5aa4072a4fe41948244c3fe0c61a67e3ab21f8ffeb97347"} Mar 18 13:36:03 crc kubenswrapper[4843]: I0318 13:36:03.951552 4843 generic.go:334] "Generic (PLEG): container finished" podID="975fe054-dd8f-4634-b17d-d3e9c218c439" containerID="72175606280b423fde784ff48f482f8d6617f6edc7f3597598fc1eb57efdcebb" exitCode=0 Mar 18 13:36:03 crc kubenswrapper[4843]: I0318 13:36:03.951573 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564016-qmxll" event={"ID":"975fe054-dd8f-4634-b17d-d3e9c218c439","Type":"ContainerDied","Data":"72175606280b423fde784ff48f482f8d6617f6edc7f3597598fc1eb57efdcebb"} Mar 18 13:36:05 crc kubenswrapper[4843]: I0318 13:36:05.304698 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564016-qmxll" Mar 18 13:36:05 crc kubenswrapper[4843]: I0318 13:36:05.490595 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtvp6\" (UniqueName: \"kubernetes.io/projected/975fe054-dd8f-4634-b17d-d3e9c218c439-kube-api-access-jtvp6\") pod \"975fe054-dd8f-4634-b17d-d3e9c218c439\" (UID: \"975fe054-dd8f-4634-b17d-d3e9c218c439\") " Mar 18 13:36:05 crc kubenswrapper[4843]: I0318 13:36:05.497004 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/975fe054-dd8f-4634-b17d-d3e9c218c439-kube-api-access-jtvp6" (OuterVolumeSpecName: "kube-api-access-jtvp6") pod "975fe054-dd8f-4634-b17d-d3e9c218c439" (UID: "975fe054-dd8f-4634-b17d-d3e9c218c439"). InnerVolumeSpecName "kube-api-access-jtvp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:36:05 crc kubenswrapper[4843]: I0318 13:36:05.593238 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtvp6\" (UniqueName: \"kubernetes.io/projected/975fe054-dd8f-4634-b17d-d3e9c218c439-kube-api-access-jtvp6\") on node \"crc\" DevicePath \"\"" Mar 18 13:36:05 crc kubenswrapper[4843]: I0318 13:36:05.974956 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564016-qmxll" event={"ID":"975fe054-dd8f-4634-b17d-d3e9c218c439","Type":"ContainerDied","Data":"3012d7a8c086b838f5aa4072a4fe41948244c3fe0c61a67e3ab21f8ffeb97347"} Mar 18 13:36:05 crc kubenswrapper[4843]: I0318 13:36:05.975000 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3012d7a8c086b838f5aa4072a4fe41948244c3fe0c61a67e3ab21f8ffeb97347" Mar 18 13:36:05 crc kubenswrapper[4843]: I0318 13:36:05.975032 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564016-qmxll" Mar 18 13:36:06 crc kubenswrapper[4843]: I0318 13:36:06.378768 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564010-726q7"] Mar 18 13:36:06 crc kubenswrapper[4843]: I0318 13:36:06.389242 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564010-726q7"] Mar 18 13:36:06 crc kubenswrapper[4843]: I0318 13:36:06.998691 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d57360d1-f8c3-4af7-958d-f5cf1308d69c" path="/var/lib/kubelet/pods/d57360d1-f8c3-4af7-958d-f5cf1308d69c/volumes" Mar 18 13:36:08 crc kubenswrapper[4843]: I0318 13:36:08.435462 4843 scope.go:117] "RemoveContainer" containerID="ccadf14b5ed7c4b5ab7efe7916330533a2af70fd57e7a91feebf227de07f4fd7" Mar 18 13:36:11 crc kubenswrapper[4843]: I0318 13:36:11.983978 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f" Mar 18 13:36:11 crc kubenswrapper[4843]: E0318 13:36:11.984544 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:36:25 crc kubenswrapper[4843]: I0318 13:36:25.984969 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f" Mar 18 13:36:25 crc kubenswrapper[4843]: E0318 13:36:25.985964 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:36:38 crc kubenswrapper[4843]: I0318 13:36:38.984729 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f"
Mar 18 13:36:38 crc kubenswrapper[4843]: E0318 13:36:38.985455 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:36:52 crc kubenswrapper[4843]: I0318 13:36:52.984204 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f"
Mar 18 13:36:52 crc kubenswrapper[4843]: E0318 13:36:52.985171 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:37:03 crc kubenswrapper[4843]: I0318 13:37:03.983921 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f"
Mar 18 13:37:03 crc kubenswrapper[4843]: E0318 13:37:03.984813 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:37:14 crc kubenswrapper[4843]: I0318 13:37:14.984986 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f"
Mar 18 13:37:14 crc kubenswrapper[4843]: E0318 13:37:14.985872 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:37:28 crc kubenswrapper[4843]: I0318 13:37:28.983817 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f"
Mar 18 13:37:28 crc kubenswrapper[4843]: E0318 13:37:28.984645 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:37:39 crc kubenswrapper[4843]: I0318 13:37:39.550076 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mz66c/must-gather-5ctdc"]
Mar 18 13:37:39 crc kubenswrapper[4843]: E0318 13:37:39.555296 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="975fe054-dd8f-4634-b17d-d3e9c218c439" containerName="oc"
Mar 18 13:37:39 crc kubenswrapper[4843]: I0318 13:37:39.555328 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="975fe054-dd8f-4634-b17d-d3e9c218c439" containerName="oc"
Mar 18 13:37:39 crc kubenswrapper[4843]: I0318 13:37:39.555677 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="975fe054-dd8f-4634-b17d-d3e9c218c439" containerName="oc"
Mar 18 13:37:39 crc kubenswrapper[4843]: I0318 13:37:39.557064 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mz66c/must-gather-5ctdc"
Mar 18 13:37:39 crc kubenswrapper[4843]: I0318 13:37:39.559190 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mz66c"/"openshift-service-ca.crt"
Mar 18 13:37:39 crc kubenswrapper[4843]: I0318 13:37:39.559274 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mz66c"/"default-dockercfg-zg4wz"
Mar 18 13:37:39 crc kubenswrapper[4843]: I0318 13:37:39.561153 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mz66c"/"kube-root-ca.crt"
Mar 18 13:37:39 crc kubenswrapper[4843]: I0318 13:37:39.564860 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mz66c/must-gather-5ctdc"]
Mar 18 13:37:39 crc kubenswrapper[4843]: I0318 13:37:39.700166 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgm4c\" (UniqueName: \"kubernetes.io/projected/7ae7a0fc-1a94-4175-a9f6-403f501cc985-kube-api-access-pgm4c\") pod \"must-gather-5ctdc\" (UID: \"7ae7a0fc-1a94-4175-a9f6-403f501cc985\") " pod="openshift-must-gather-mz66c/must-gather-5ctdc"
Mar 18 13:37:39 crc kubenswrapper[4843]: I0318 13:37:39.700254 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7ae7a0fc-1a94-4175-a9f6-403f501cc985-must-gather-output\") pod \"must-gather-5ctdc\" (UID: \"7ae7a0fc-1a94-4175-a9f6-403f501cc985\") " pod="openshift-must-gather-mz66c/must-gather-5ctdc"
Mar 18 13:37:39 crc kubenswrapper[4843]: I0318 13:37:39.802416 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgm4c\" (UniqueName: \"kubernetes.io/projected/7ae7a0fc-1a94-4175-a9f6-403f501cc985-kube-api-access-pgm4c\") pod \"must-gather-5ctdc\" (UID: \"7ae7a0fc-1a94-4175-a9f6-403f501cc985\") " pod="openshift-must-gather-mz66c/must-gather-5ctdc"
Mar 18 13:37:39 crc kubenswrapper[4843]: I0318 13:37:39.802473 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7ae7a0fc-1a94-4175-a9f6-403f501cc985-must-gather-output\") pod \"must-gather-5ctdc\" (UID: \"7ae7a0fc-1a94-4175-a9f6-403f501cc985\") " pod="openshift-must-gather-mz66c/must-gather-5ctdc"
Mar 18 13:37:39 crc kubenswrapper[4843]: I0318 13:37:39.803074 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7ae7a0fc-1a94-4175-a9f6-403f501cc985-must-gather-output\") pod \"must-gather-5ctdc\" (UID: \"7ae7a0fc-1a94-4175-a9f6-403f501cc985\") " pod="openshift-must-gather-mz66c/must-gather-5ctdc"
Mar 18 13:37:39 crc kubenswrapper[4843]: I0318 13:37:39.821531 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgm4c\" (UniqueName: \"kubernetes.io/projected/7ae7a0fc-1a94-4175-a9f6-403f501cc985-kube-api-access-pgm4c\") pod \"must-gather-5ctdc\" (UID: \"7ae7a0fc-1a94-4175-a9f6-403f501cc985\") " pod="openshift-must-gather-mz66c/must-gather-5ctdc"
Mar 18 13:37:39 crc kubenswrapper[4843]: I0318 13:37:39.879629 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mz66c/must-gather-5ctdc"
Mar 18 13:37:40 crc kubenswrapper[4843]: I0318 13:37:40.432788 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mz66c/must-gather-5ctdc"]
Mar 18 13:37:40 crc kubenswrapper[4843]: I0318 13:37:40.439863 4843 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 13:37:41 crc kubenswrapper[4843]: I0318 13:37:41.047635 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mz66c/must-gather-5ctdc" event={"ID":"7ae7a0fc-1a94-4175-a9f6-403f501cc985","Type":"ContainerStarted","Data":"eaaa7e9cdb66a86716497cead17419532246c0ae8955c2c1c779f8eecbf14f98"}
Mar 18 13:37:43 crc kubenswrapper[4843]: I0318 13:37:43.094548 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f"
Mar 18 13:37:43 crc kubenswrapper[4843]: E0318 13:37:43.095066 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:37:49 crc kubenswrapper[4843]: I0318 13:37:49.387684 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mz66c/must-gather-5ctdc" event={"ID":"7ae7a0fc-1a94-4175-a9f6-403f501cc985","Type":"ContainerStarted","Data":"794d88c7798f240dea4919f5f5ef306db3ae62cb7a42005ef87a23034ab79f55"}
Mar 18 13:37:49 crc kubenswrapper[4843]: I0318 13:37:49.388066 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mz66c/must-gather-5ctdc" event={"ID":"7ae7a0fc-1a94-4175-a9f6-403f501cc985","Type":"ContainerStarted","Data":"b5bac0dbccbf6dcde11d1c22059fedec1192d4b528d4e38832def92cb621d905"}
Mar 18 13:37:49 crc kubenswrapper[4843]: I0318 13:37:49.474838 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mz66c/must-gather-5ctdc" podStartSLOduration=2.80622311 podStartE2EDuration="10.474787073s" podCreationTimestamp="2026-03-18 13:37:39 +0000 UTC" firstStartedPulling="2026-03-18 13:37:40.43941756 +0000 UTC m=+5294.155243084" lastFinishedPulling="2026-03-18 13:37:48.107981523 +0000 UTC m=+5301.823807047" observedRunningTime="2026-03-18 13:37:49.403574975 +0000 UTC m=+5303.119400509" watchObservedRunningTime="2026-03-18 13:37:49.474787073 +0000 UTC m=+5303.190612597"
Mar 18 13:37:52 crc kubenswrapper[4843]: I0318 13:37:52.626944 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mz66c/crc-debug-k4g8l"]
Mar 18 13:37:52 crc kubenswrapper[4843]: I0318 13:37:52.628810 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mz66c/crc-debug-k4g8l"
Mar 18 13:37:52 crc kubenswrapper[4843]: I0318 13:37:52.808360 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e029802-6a38-4bca-89a9-6cf8bfb0b614-host\") pod \"crc-debug-k4g8l\" (UID: \"5e029802-6a38-4bca-89a9-6cf8bfb0b614\") " pod="openshift-must-gather-mz66c/crc-debug-k4g8l"
Mar 18 13:37:52 crc kubenswrapper[4843]: I0318 13:37:52.808494 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz4nn\" (UniqueName: \"kubernetes.io/projected/5e029802-6a38-4bca-89a9-6cf8bfb0b614-kube-api-access-fz4nn\") pod \"crc-debug-k4g8l\" (UID: \"5e029802-6a38-4bca-89a9-6cf8bfb0b614\") " pod="openshift-must-gather-mz66c/crc-debug-k4g8l"
Mar 18 13:37:52 crc kubenswrapper[4843]: I0318 13:37:52.909826 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e029802-6a38-4bca-89a9-6cf8bfb0b614-host\") pod \"crc-debug-k4g8l\" (UID: \"5e029802-6a38-4bca-89a9-6cf8bfb0b614\") " pod="openshift-must-gather-mz66c/crc-debug-k4g8l"
Mar 18 13:37:52 crc kubenswrapper[4843]: I0318 13:37:52.909979 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz4nn\" (UniqueName: \"kubernetes.io/projected/5e029802-6a38-4bca-89a9-6cf8bfb0b614-kube-api-access-fz4nn\") pod \"crc-debug-k4g8l\" (UID: \"5e029802-6a38-4bca-89a9-6cf8bfb0b614\") " pod="openshift-must-gather-mz66c/crc-debug-k4g8l"
Mar 18 13:37:52 crc kubenswrapper[4843]: I0318 13:37:52.910333 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e029802-6a38-4bca-89a9-6cf8bfb0b614-host\") pod \"crc-debug-k4g8l\" (UID: \"5e029802-6a38-4bca-89a9-6cf8bfb0b614\") " pod="openshift-must-gather-mz66c/crc-debug-k4g8l"
Mar 18 13:37:52 crc kubenswrapper[4843]: I0318 13:37:52.932602 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz4nn\" (UniqueName: \"kubernetes.io/projected/5e029802-6a38-4bca-89a9-6cf8bfb0b614-kube-api-access-fz4nn\") pod \"crc-debug-k4g8l\" (UID: \"5e029802-6a38-4bca-89a9-6cf8bfb0b614\") " pod="openshift-must-gather-mz66c/crc-debug-k4g8l"
Mar 18 13:37:52 crc kubenswrapper[4843]: I0318 13:37:52.949207 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mz66c/crc-debug-k4g8l"
Mar 18 13:37:52 crc kubenswrapper[4843]: W0318 13:37:52.978029 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e029802_6a38_4bca_89a9_6cf8bfb0b614.slice/crio-b6294d8962b793b012c6f32c6934f686cb7a5e8a6a41bf09815b2ff61da7cca5 WatchSource:0}: Error finding container b6294d8962b793b012c6f32c6934f686cb7a5e8a6a41bf09815b2ff61da7cca5: Status 404 returned error can't find the container with id b6294d8962b793b012c6f32c6934f686cb7a5e8a6a41bf09815b2ff61da7cca5
Mar 18 13:37:53 crc kubenswrapper[4843]: I0318 13:37:53.427482 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mz66c/crc-debug-k4g8l" event={"ID":"5e029802-6a38-4bca-89a9-6cf8bfb0b614","Type":"ContainerStarted","Data":"b6294d8962b793b012c6f32c6934f686cb7a5e8a6a41bf09815b2ff61da7cca5"}
Mar 18 13:37:54 crc kubenswrapper[4843]: I0318 13:37:54.983875 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f"
Mar 18 13:37:54 crc kubenswrapper[4843]: E0318 13:37:54.984402 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:38:00 crc kubenswrapper[4843]: I0318 13:38:00.154688 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564018-gf48x"]
Mar 18 13:38:00 crc kubenswrapper[4843]: I0318 13:38:00.157211 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564018-gf48x"
Mar 18 13:38:00 crc kubenswrapper[4843]: I0318 13:38:00.161200 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:38:00 crc kubenswrapper[4843]: I0318 13:38:00.161471 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk"
Mar 18 13:38:00 crc kubenswrapper[4843]: I0318 13:38:00.163926 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:38:00 crc kubenswrapper[4843]: I0318 13:38:00.165796 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564018-gf48x"]
Mar 18 13:38:00 crc kubenswrapper[4843]: I0318 13:38:00.240376 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74w4k\" (UniqueName: \"kubernetes.io/projected/98150f00-d4aa-469a-9914-6e5358bd94b2-kube-api-access-74w4k\") pod \"auto-csr-approver-29564018-gf48x\" (UID: \"98150f00-d4aa-469a-9914-6e5358bd94b2\") " pod="openshift-infra/auto-csr-approver-29564018-gf48x"
Mar 18 13:38:00 crc kubenswrapper[4843]: I0318 13:38:00.371133 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74w4k\" (UniqueName: \"kubernetes.io/projected/98150f00-d4aa-469a-9914-6e5358bd94b2-kube-api-access-74w4k\") pod \"auto-csr-approver-29564018-gf48x\" (UID: \"98150f00-d4aa-469a-9914-6e5358bd94b2\") " pod="openshift-infra/auto-csr-approver-29564018-gf48x"
Mar 18 13:38:00 crc kubenswrapper[4843]: I0318 13:38:00.412090 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74w4k\" (UniqueName: \"kubernetes.io/projected/98150f00-d4aa-469a-9914-6e5358bd94b2-kube-api-access-74w4k\") pod \"auto-csr-approver-29564018-gf48x\" (UID: \"98150f00-d4aa-469a-9914-6e5358bd94b2\") " pod="openshift-infra/auto-csr-approver-29564018-gf48x"
Mar 18 13:38:00 crc kubenswrapper[4843]: I0318 13:38:00.499550 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564018-gf48x"
Mar 18 13:38:08 crc kubenswrapper[4843]: E0318 13:38:08.335671 4843 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296"
Mar 18 13:38:08 crc kubenswrapper[4843]: E0318 13:38:08.336684 4843 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fz4nn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-k4g8l_openshift-must-gather-mz66c(5e029802-6a38-4bca-89a9-6cf8bfb0b614): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 18 13:38:08 crc kubenswrapper[4843]: E0318 13:38:08.338168 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-mz66c/crc-debug-k4g8l" podUID="5e029802-6a38-4bca-89a9-6cf8bfb0b614"
Mar 18 13:38:08 crc kubenswrapper[4843]: E0318 13:38:08.581358 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-mz66c/crc-debug-k4g8l" podUID="5e029802-6a38-4bca-89a9-6cf8bfb0b614"
Mar 18 13:38:08 crc kubenswrapper[4843]: I0318 13:38:08.848917 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564018-gf48x"]
Mar 18 13:38:08 crc kubenswrapper[4843]: I0318 13:38:08.984555 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f"
Mar 18 13:38:08 crc kubenswrapper[4843]: E0318 13:38:08.984921 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:38:09 crc kubenswrapper[4843]: I0318 13:38:09.587769 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564018-gf48x" event={"ID":"98150f00-d4aa-469a-9914-6e5358bd94b2","Type":"ContainerStarted","Data":"004355d30d48f096388828a5f7b3e367a92710b7cdd87ed70489c1cc6c296c64"}
Mar 18 13:38:11 crc kubenswrapper[4843]: I0318 13:38:11.610863 4843 generic.go:334] "Generic (PLEG): container finished" podID="98150f00-d4aa-469a-9914-6e5358bd94b2" containerID="7303e6e13660702edbdc4e8599706b580c3aa8fe0d3dd5c928b75a98c65d8751" exitCode=0
Mar 18 13:38:11 crc kubenswrapper[4843]: I0318 13:38:11.610966 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564018-gf48x" event={"ID":"98150f00-d4aa-469a-9914-6e5358bd94b2","Type":"ContainerDied","Data":"7303e6e13660702edbdc4e8599706b580c3aa8fe0d3dd5c928b75a98c65d8751"}
Mar 18 13:38:13 crc kubenswrapper[4843]: I0318 13:38:13.045180 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564018-gf48x"
Mar 18 13:38:13 crc kubenswrapper[4843]: I0318 13:38:13.194062 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74w4k\" (UniqueName: \"kubernetes.io/projected/98150f00-d4aa-469a-9914-6e5358bd94b2-kube-api-access-74w4k\") pod \"98150f00-d4aa-469a-9914-6e5358bd94b2\" (UID: \"98150f00-d4aa-469a-9914-6e5358bd94b2\") "
Mar 18 13:38:13 crc kubenswrapper[4843]: I0318 13:38:13.200934 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98150f00-d4aa-469a-9914-6e5358bd94b2-kube-api-access-74w4k" (OuterVolumeSpecName: "kube-api-access-74w4k") pod "98150f00-d4aa-469a-9914-6e5358bd94b2" (UID: "98150f00-d4aa-469a-9914-6e5358bd94b2"). InnerVolumeSpecName "kube-api-access-74w4k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:38:13 crc kubenswrapper[4843]: I0318 13:38:13.303196 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74w4k\" (UniqueName: \"kubernetes.io/projected/98150f00-d4aa-469a-9914-6e5358bd94b2-kube-api-access-74w4k\") on node \"crc\" DevicePath \"\""
Mar 18 13:38:13 crc kubenswrapper[4843]: I0318 13:38:13.713268 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564018-gf48x" event={"ID":"98150f00-d4aa-469a-9914-6e5358bd94b2","Type":"ContainerDied","Data":"004355d30d48f096388828a5f7b3e367a92710b7cdd87ed70489c1cc6c296c64"}
Mar 18 13:38:13 crc kubenswrapper[4843]: I0318 13:38:13.713323 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564018-gf48x"
Mar 18 13:38:13 crc kubenswrapper[4843]: I0318 13:38:13.713328 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="004355d30d48f096388828a5f7b3e367a92710b7cdd87ed70489c1cc6c296c64"
Mar 18 13:38:14 crc kubenswrapper[4843]: I0318 13:38:14.130404 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564012-p4gnp"]
Mar 18 13:38:14 crc kubenswrapper[4843]: I0318 13:38:14.141016 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564012-p4gnp"]
Mar 18 13:38:14 crc kubenswrapper[4843]: I0318 13:38:14.993818 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47bf1a53-8b55-491a-82c9-7ce936932faa" path="/var/lib/kubelet/pods/47bf1a53-8b55-491a-82c9-7ce936932faa/volumes"
Mar 18 13:38:23 crc kubenswrapper[4843]: I0318 13:38:23.940073 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mz66c/crc-debug-k4g8l" event={"ID":"5e029802-6a38-4bca-89a9-6cf8bfb0b614","Type":"ContainerStarted","Data":"e1e40c44ad39a3e984ca5deb406909b2a99f08e373c9b66a86997273e0932f0c"}
Mar 18 13:38:23 crc kubenswrapper[4843]: I0318 13:38:23.958140 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mz66c/crc-debug-k4g8l" podStartSLOduration=1.3946803110000001 podStartE2EDuration="31.958115861s" podCreationTimestamp="2026-03-18 13:37:52 +0000 UTC" firstStartedPulling="2026-03-18 13:37:52.980116253 +0000 UTC m=+5306.695941787" lastFinishedPulling="2026-03-18 13:38:23.543551803 +0000 UTC m=+5337.259377337" observedRunningTime="2026-03-18 13:38:23.954643922 +0000 UTC m=+5337.670469446" watchObservedRunningTime="2026-03-18 13:38:23.958115861 +0000 UTC m=+5337.673941385"
Mar 18 13:38:23 crc kubenswrapper[4843]: I0318 13:38:23.984617 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f"
Mar 18 13:38:23 crc kubenswrapper[4843]: E0318 13:38:23.984898 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:38:37 crc kubenswrapper[4843]: I0318 13:38:37.984414 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f"
Mar 18 13:38:37 crc kubenswrapper[4843]: E0318 13:38:37.985293 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:38:41 crc kubenswrapper[4843]: I0318 13:38:41.852143 4843 generic.go:334] "Generic (PLEG): container finished" podID="5e029802-6a38-4bca-89a9-6cf8bfb0b614" containerID="e1e40c44ad39a3e984ca5deb406909b2a99f08e373c9b66a86997273e0932f0c" exitCode=0
Mar 18 13:38:41 crc kubenswrapper[4843]: I0318 13:38:41.852249 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mz66c/crc-debug-k4g8l" event={"ID":"5e029802-6a38-4bca-89a9-6cf8bfb0b614","Type":"ContainerDied","Data":"e1e40c44ad39a3e984ca5deb406909b2a99f08e373c9b66a86997273e0932f0c"}
Mar 18 13:38:43 crc kubenswrapper[4843]: I0318 13:38:43.059135 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mz66c/crc-debug-k4g8l"
Mar 18 13:38:43 crc kubenswrapper[4843]: I0318 13:38:43.094566 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mz66c/crc-debug-k4g8l"]
Mar 18 13:38:43 crc kubenswrapper[4843]: I0318 13:38:43.104269 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mz66c/crc-debug-k4g8l"]
Mar 18 13:38:43 crc kubenswrapper[4843]: I0318 13:38:43.221079 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e029802-6a38-4bca-89a9-6cf8bfb0b614-host\") pod \"5e029802-6a38-4bca-89a9-6cf8bfb0b614\" (UID: \"5e029802-6a38-4bca-89a9-6cf8bfb0b614\") "
Mar 18 13:38:43 crc kubenswrapper[4843]: I0318 13:38:43.221252 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz4nn\" (UniqueName: \"kubernetes.io/projected/5e029802-6a38-4bca-89a9-6cf8bfb0b614-kube-api-access-fz4nn\") pod \"5e029802-6a38-4bca-89a9-6cf8bfb0b614\" (UID: \"5e029802-6a38-4bca-89a9-6cf8bfb0b614\") "
Mar 18 13:38:43 crc kubenswrapper[4843]: I0318 13:38:43.221259 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e029802-6a38-4bca-89a9-6cf8bfb0b614-host" (OuterVolumeSpecName: "host") pod "5e029802-6a38-4bca-89a9-6cf8bfb0b614" (UID: "5e029802-6a38-4bca-89a9-6cf8bfb0b614"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 13:38:43 crc kubenswrapper[4843]: I0318 13:38:43.221816 4843 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e029802-6a38-4bca-89a9-6cf8bfb0b614-host\") on node \"crc\" DevicePath \"\""
Mar 18 13:38:43 crc kubenswrapper[4843]: I0318 13:38:43.227039 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e029802-6a38-4bca-89a9-6cf8bfb0b614-kube-api-access-fz4nn" (OuterVolumeSpecName: "kube-api-access-fz4nn") pod "5e029802-6a38-4bca-89a9-6cf8bfb0b614" (UID: "5e029802-6a38-4bca-89a9-6cf8bfb0b614"). InnerVolumeSpecName "kube-api-access-fz4nn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:38:43 crc kubenswrapper[4843]: I0318 13:38:43.324529 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz4nn\" (UniqueName: \"kubernetes.io/projected/5e029802-6a38-4bca-89a9-6cf8bfb0b614-kube-api-access-fz4nn\") on node \"crc\" DevicePath \"\""
Mar 18 13:38:43 crc kubenswrapper[4843]: I0318 13:38:43.871360 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6294d8962b793b012c6f32c6934f686cb7a5e8a6a41bf09815b2ff61da7cca5"
Mar 18 13:38:43 crc kubenswrapper[4843]: I0318 13:38:43.871477 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mz66c/crc-debug-k4g8l"
Mar 18 13:38:44 crc kubenswrapper[4843]: I0318 13:38:44.293174 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mz66c/crc-debug-9g9sk"]
Mar 18 13:38:44 crc kubenswrapper[4843]: E0318 13:38:44.293724 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e029802-6a38-4bca-89a9-6cf8bfb0b614" containerName="container-00"
Mar 18 13:38:44 crc kubenswrapper[4843]: I0318 13:38:44.293742 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e029802-6a38-4bca-89a9-6cf8bfb0b614" containerName="container-00"
Mar 18 13:38:44 crc kubenswrapper[4843]: E0318 13:38:44.293766 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98150f00-d4aa-469a-9914-6e5358bd94b2" containerName="oc"
Mar 18 13:38:44 crc kubenswrapper[4843]: I0318 13:38:44.293773 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="98150f00-d4aa-469a-9914-6e5358bd94b2" containerName="oc"
Mar 18 13:38:44 crc kubenswrapper[4843]: I0318 13:38:44.293967 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e029802-6a38-4bca-89a9-6cf8bfb0b614" containerName="container-00"
Mar 18 13:38:44 crc kubenswrapper[4843]: I0318 13:38:44.293997 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="98150f00-d4aa-469a-9914-6e5358bd94b2" containerName="oc"
Mar 18 13:38:44 crc kubenswrapper[4843]: I0318 13:38:44.294838 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mz66c/crc-debug-9g9sk"
Mar 18 13:38:44 crc kubenswrapper[4843]: I0318 13:38:44.346978 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ef70c06-65c9-45ff-8485-0b60d1baae55-host\") pod \"crc-debug-9g9sk\" (UID: \"1ef70c06-65c9-45ff-8485-0b60d1baae55\") " pod="openshift-must-gather-mz66c/crc-debug-9g9sk"
Mar 18 13:38:44 crc kubenswrapper[4843]: I0318 13:38:44.347294 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn82d\" (UniqueName: \"kubernetes.io/projected/1ef70c06-65c9-45ff-8485-0b60d1baae55-kube-api-access-wn82d\") pod \"crc-debug-9g9sk\" (UID: \"1ef70c06-65c9-45ff-8485-0b60d1baae55\") " pod="openshift-must-gather-mz66c/crc-debug-9g9sk"
Mar 18 13:38:44 crc kubenswrapper[4843]: I0318 13:38:44.449292 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn82d\" (UniqueName: \"kubernetes.io/projected/1ef70c06-65c9-45ff-8485-0b60d1baae55-kube-api-access-wn82d\") pod \"crc-debug-9g9sk\" (UID: \"1ef70c06-65c9-45ff-8485-0b60d1baae55\") " pod="openshift-must-gather-mz66c/crc-debug-9g9sk"
Mar 18 13:38:44 crc kubenswrapper[4843]: I0318 13:38:44.449769 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ef70c06-65c9-45ff-8485-0b60d1baae55-host\") pod \"crc-debug-9g9sk\" (UID: \"1ef70c06-65c9-45ff-8485-0b60d1baae55\") " pod="openshift-must-gather-mz66c/crc-debug-9g9sk"
Mar 18 13:38:44 crc kubenswrapper[4843]: I0318 13:38:44.449907 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ef70c06-65c9-45ff-8485-0b60d1baae55-host\") pod \"crc-debug-9g9sk\" (UID: \"1ef70c06-65c9-45ff-8485-0b60d1baae55\") " pod="openshift-must-gather-mz66c/crc-debug-9g9sk"
Mar 18 13:38:44 crc kubenswrapper[4843]: I0318 13:38:44.473373 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn82d\" (UniqueName: \"kubernetes.io/projected/1ef70c06-65c9-45ff-8485-0b60d1baae55-kube-api-access-wn82d\") pod \"crc-debug-9g9sk\" (UID: \"1ef70c06-65c9-45ff-8485-0b60d1baae55\") " pod="openshift-must-gather-mz66c/crc-debug-9g9sk"
Mar 18 13:38:44 crc kubenswrapper[4843]: I0318 13:38:44.610964 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mz66c/crc-debug-9g9sk"
Mar 18 13:38:44 crc kubenswrapper[4843]: I0318 13:38:44.884499 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mz66c/crc-debug-9g9sk" event={"ID":"1ef70c06-65c9-45ff-8485-0b60d1baae55","Type":"ContainerStarted","Data":"a1c3a9d5ff56dcdd66e1475f6e2d3906863274bd31c886ab3fa810a96d789e1c"}
Mar 18 13:38:44 crc kubenswrapper[4843]: I0318 13:38:44.996809 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e029802-6a38-4bca-89a9-6cf8bfb0b614" path="/var/lib/kubelet/pods/5e029802-6a38-4bca-89a9-6cf8bfb0b614/volumes"
Mar 18 13:38:45 crc kubenswrapper[4843]: I0318 13:38:45.984983 4843 generic.go:334] "Generic (PLEG): container finished" podID="1ef70c06-65c9-45ff-8485-0b60d1baae55" containerID="f7044135d22c2d37429f1a05e4bbc7510fae3071ab3851769e45db5adf328fd5" exitCode=1
Mar 18 13:38:45 crc kubenswrapper[4843]: I0318 13:38:45.985135 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mz66c/crc-debug-9g9sk" event={"ID":"1ef70c06-65c9-45ff-8485-0b60d1baae55","Type":"ContainerDied","Data":"f7044135d22c2d37429f1a05e4bbc7510fae3071ab3851769e45db5adf328fd5"}
Mar 18 13:38:46 crc kubenswrapper[4843]: I0318 13:38:46.038036 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mz66c/crc-debug-9g9sk"]
Mar 18 13:38:46 crc kubenswrapper[4843]: I0318 13:38:46.052636 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mz66c/crc-debug-9g9sk"]
Mar 18 13:38:47 crc kubenswrapper[4843]: I0318 13:38:47.253666 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mz66c/crc-debug-9g9sk"
Mar 18 13:38:47 crc kubenswrapper[4843]: I0318 13:38:47.341961 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn82d\" (UniqueName: \"kubernetes.io/projected/1ef70c06-65c9-45ff-8485-0b60d1baae55-kube-api-access-wn82d\") pod \"1ef70c06-65c9-45ff-8485-0b60d1baae55\" (UID: \"1ef70c06-65c9-45ff-8485-0b60d1baae55\") "
Mar 18 13:38:47 crc kubenswrapper[4843]: I0318 13:38:47.342064 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ef70c06-65c9-45ff-8485-0b60d1baae55-host\") pod \"1ef70c06-65c9-45ff-8485-0b60d1baae55\" (UID: \"1ef70c06-65c9-45ff-8485-0b60d1baae55\") "
Mar 18 13:38:47 crc kubenswrapper[4843]: I0318 13:38:47.342212 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ef70c06-65c9-45ff-8485-0b60d1baae55-host" (OuterVolumeSpecName: "host") pod "1ef70c06-65c9-45ff-8485-0b60d1baae55" (UID: "1ef70c06-65c9-45ff-8485-0b60d1baae55"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 13:38:47 crc kubenswrapper[4843]: I0318 13:38:47.342628 4843 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ef70c06-65c9-45ff-8485-0b60d1baae55-host\") on node \"crc\" DevicePath \"\""
Mar 18 13:38:47 crc kubenswrapper[4843]: I0318 13:38:47.349295 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ef70c06-65c9-45ff-8485-0b60d1baae55-kube-api-access-wn82d" (OuterVolumeSpecName: "kube-api-access-wn82d") pod "1ef70c06-65c9-45ff-8485-0b60d1baae55" (UID: "1ef70c06-65c9-45ff-8485-0b60d1baae55"). InnerVolumeSpecName "kube-api-access-wn82d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:38:47 crc kubenswrapper[4843]: I0318 13:38:47.444093 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn82d\" (UniqueName: \"kubernetes.io/projected/1ef70c06-65c9-45ff-8485-0b60d1baae55-kube-api-access-wn82d\") on node \"crc\" DevicePath \"\""
Mar 18 13:38:48 crc kubenswrapper[4843]: I0318 13:38:48.162973 4843 scope.go:117] "RemoveContainer" containerID="f7044135d22c2d37429f1a05e4bbc7510fae3071ab3851769e45db5adf328fd5"
Mar 18 13:38:48 crc kubenswrapper[4843]: I0318 13:38:48.163181 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mz66c/crc-debug-9g9sk"
Mar 18 13:38:49 crc kubenswrapper[4843]: I0318 13:38:49.152539 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ef70c06-65c9-45ff-8485-0b60d1baae55" path="/var/lib/kubelet/pods/1ef70c06-65c9-45ff-8485-0b60d1baae55/volumes"
Mar 18 13:38:50 crc kubenswrapper[4843]: I0318 13:38:50.987596 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f"
Mar 18 13:38:50 crc kubenswrapper[4843]: E0318 13:38:50.988332 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:39:05 crc kubenswrapper[4843]: I0318 13:39:05.984855 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f"
Mar 18 13:39:05 crc kubenswrapper[4843]: E0318 13:39:05.985789 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:39:08 crc kubenswrapper[4843]: I0318 13:39:08.543776 4843 scope.go:117] "RemoveContainer" containerID="ed7686f318c84c76c262c8efdd47cb3cfaded7bfbff3d145876b94aec1b2953b" Mar 18 13:39:13 crc kubenswrapper[4843]: I0318 13:39:13.450906 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8schj"] Mar 18 13:39:13 crc kubenswrapper[4843]: E0318 13:39:13.453565 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef70c06-65c9-45ff-8485-0b60d1baae55" containerName="container-00" Mar 18 13:39:13 crc kubenswrapper[4843]: I0318 13:39:13.453593 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef70c06-65c9-45ff-8485-0b60d1baae55" containerName="container-00" Mar 18 13:39:13 crc kubenswrapper[4843]: I0318 13:39:13.453874 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ef70c06-65c9-45ff-8485-0b60d1baae55" containerName="container-00" Mar 18 13:39:13 crc kubenswrapper[4843]: I0318 13:39:13.455877 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8schj" Mar 18 13:39:13 crc kubenswrapper[4843]: I0318 13:39:13.463717 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8schj"] Mar 18 13:39:13 crc kubenswrapper[4843]: I0318 13:39:13.510321 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhpss\" (UniqueName: \"kubernetes.io/projected/b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb-kube-api-access-bhpss\") pod \"redhat-operators-8schj\" (UID: \"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb\") " pod="openshift-marketplace/redhat-operators-8schj" Mar 18 13:39:13 crc kubenswrapper[4843]: I0318 13:39:13.510457 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb-catalog-content\") pod \"redhat-operators-8schj\" (UID: \"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb\") " pod="openshift-marketplace/redhat-operators-8schj" Mar 18 13:39:13 crc kubenswrapper[4843]: I0318 13:39:13.510572 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb-utilities\") pod \"redhat-operators-8schj\" (UID: \"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb\") " pod="openshift-marketplace/redhat-operators-8schj" Mar 18 13:39:13 crc kubenswrapper[4843]: I0318 13:39:13.613147 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb-catalog-content\") pod \"redhat-operators-8schj\" (UID: \"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb\") " pod="openshift-marketplace/redhat-operators-8schj" Mar 18 13:39:13 crc kubenswrapper[4843]: I0318 13:39:13.613251 4843 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb-utilities\") pod \"redhat-operators-8schj\" (UID: \"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb\") " pod="openshift-marketplace/redhat-operators-8schj" Mar 18 13:39:13 crc kubenswrapper[4843]: I0318 13:39:13.613428 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhpss\" (UniqueName: \"kubernetes.io/projected/b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb-kube-api-access-bhpss\") pod \"redhat-operators-8schj\" (UID: \"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb\") " pod="openshift-marketplace/redhat-operators-8schj" Mar 18 13:39:13 crc kubenswrapper[4843]: I0318 13:39:13.613924 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb-utilities\") pod \"redhat-operators-8schj\" (UID: \"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb\") " pod="openshift-marketplace/redhat-operators-8schj" Mar 18 13:39:13 crc kubenswrapper[4843]: I0318 13:39:13.613981 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb-catalog-content\") pod \"redhat-operators-8schj\" (UID: \"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb\") " pod="openshift-marketplace/redhat-operators-8schj" Mar 18 13:39:13 crc kubenswrapper[4843]: I0318 13:39:13.641849 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhpss\" (UniqueName: \"kubernetes.io/projected/b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb-kube-api-access-bhpss\") pod \"redhat-operators-8schj\" (UID: \"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb\") " pod="openshift-marketplace/redhat-operators-8schj" Mar 18 13:39:13 crc kubenswrapper[4843]: I0318 13:39:13.793808 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8schj" Mar 18 13:39:14 crc kubenswrapper[4843]: I0318 13:39:14.529247 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8schj"] Mar 18 13:39:14 crc kubenswrapper[4843]: I0318 13:39:14.750400 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8schj" event={"ID":"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb","Type":"ContainerStarted","Data":"c1b46610f5920cce99159cd734f9e46b96f7f2485ce96e654c54a876472ba9d6"} Mar 18 13:39:15 crc kubenswrapper[4843]: I0318 13:39:15.761205 4843 generic.go:334] "Generic (PLEG): container finished" podID="b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb" containerID="1769b13afd8ce6e44beed52c6e0d2c995e89df63a855098887fc72c9fee7feef" exitCode=0 Mar 18 13:39:15 crc kubenswrapper[4843]: I0318 13:39:15.761306 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8schj" event={"ID":"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb","Type":"ContainerDied","Data":"1769b13afd8ce6e44beed52c6e0d2c995e89df63a855098887fc72c9fee7feef"} Mar 18 13:39:17 crc kubenswrapper[4843]: I0318 13:39:17.783488 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8schj" event={"ID":"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb","Type":"ContainerStarted","Data":"3f4791407bd14a80ce0b5fe3a5937254c955f791bf0cf5e65f96d74cb135ca06"} Mar 18 13:39:19 crc kubenswrapper[4843]: I0318 13:39:19.804917 4843 generic.go:334] "Generic (PLEG): container finished" podID="b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb" containerID="3f4791407bd14a80ce0b5fe3a5937254c955f791bf0cf5e65f96d74cb135ca06" exitCode=0 Mar 18 13:39:19 crc kubenswrapper[4843]: I0318 13:39:19.806248 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8schj" 
event={"ID":"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb","Type":"ContainerDied","Data":"3f4791407bd14a80ce0b5fe3a5937254c955f791bf0cf5e65f96d74cb135ca06"} Mar 18 13:39:20 crc kubenswrapper[4843]: I0318 13:39:20.984733 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f" Mar 18 13:39:20 crc kubenswrapper[4843]: E0318 13:39:20.985506 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:39:21 crc kubenswrapper[4843]: I0318 13:39:21.827333 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8schj" event={"ID":"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb","Type":"ContainerStarted","Data":"3114db1e9e228132ebe7c719d55a0da08aa043e20d903037cbbd0b1d33b5db7e"} Mar 18 13:39:22 crc kubenswrapper[4843]: I0318 13:39:22.726114 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8schj" podStartSLOduration=4.626891448 podStartE2EDuration="9.726070701s" podCreationTimestamp="2026-03-18 13:39:13 +0000 UTC" firstStartedPulling="2026-03-18 13:39:15.763836177 +0000 UTC m=+5389.479661701" lastFinishedPulling="2026-03-18 13:39:20.86301543 +0000 UTC m=+5394.578840954" observedRunningTime="2026-03-18 13:39:22.714629157 +0000 UTC m=+5396.430454681" watchObservedRunningTime="2026-03-18 13:39:22.726070701 +0000 UTC m=+5396.441896225" Mar 18 13:39:22 crc kubenswrapper[4843]: I0318 13:39:22.735826 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55c9766bb-kdvz7_c05a32ac-ec1d-497c-81be-4160787c43b3/barbican-api/0.log" Mar 
18 13:39:23 crc kubenswrapper[4843]: I0318 13:39:23.436462 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-656c8c855b-nmnnt_1d1350ab-77e3-446d-a13d-152395262970/barbican-keystone-listener-log/0.log" Mar 18 13:39:23 crc kubenswrapper[4843]: I0318 13:39:23.500514 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55c9766bb-kdvz7_c05a32ac-ec1d-497c-81be-4160787c43b3/barbican-api-log/0.log" Mar 18 13:39:23 crc kubenswrapper[4843]: I0318 13:39:23.585726 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-656c8c855b-nmnnt_1d1350ab-77e3-446d-a13d-152395262970/barbican-keystone-listener/0.log" Mar 18 13:39:23 crc kubenswrapper[4843]: I0318 13:39:23.724618 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-94c7cb559-snwrc_599f937c-69d8-4281-b545-e97d4678bc9b/barbican-worker/0.log" Mar 18 13:39:23 crc kubenswrapper[4843]: I0318 13:39:23.772058 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-94c7cb559-snwrc_599f937c-69d8-4281-b545-e97d4678bc9b/barbican-worker-log/0.log" Mar 18 13:39:23 crc kubenswrapper[4843]: I0318 13:39:23.794313 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8schj" Mar 18 13:39:23 crc kubenswrapper[4843]: I0318 13:39:23.794713 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8schj" Mar 18 13:39:24 crc kubenswrapper[4843]: I0318 13:39:24.054752 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-wzxx7_7882cd3e-a6a4-4be5-8a49-ce2e8610f5ab/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:24 crc kubenswrapper[4843]: I0318 13:39:24.055461 4843 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_948691c5-0a85-451f-931d-2ab2108c1736/ceilometer-central-agent/0.log" Mar 18 13:39:24 crc kubenswrapper[4843]: I0318 13:39:24.157809 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_948691c5-0a85-451f-931d-2ab2108c1736/ceilometer-notification-agent/0.log" Mar 18 13:39:24 crc kubenswrapper[4843]: I0318 13:39:24.278596 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_948691c5-0a85-451f-931d-2ab2108c1736/proxy-httpd/0.log" Mar 18 13:39:24 crc kubenswrapper[4843]: I0318 13:39:24.476709 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a/cinder-api/0.log" Mar 18 13:39:24 crc kubenswrapper[4843]: I0318 13:39:24.495189 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_948691c5-0a85-451f-931d-2ab2108c1736/sg-core/0.log" Mar 18 13:39:24 crc kubenswrapper[4843]: I0318 13:39:24.535994 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_91af6bbc-b1be-4753-9a9b-bdae0cf4ed1a/cinder-api-log/0.log" Mar 18 13:39:24 crc kubenswrapper[4843]: I0318 13:39:24.757124 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_480953e1-6024-4f9e-9f0b-a14f95a047cc/cinder-scheduler/0.log" Mar 18 13:39:24 crc kubenswrapper[4843]: I0318 13:39:24.834977 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_480953e1-6024-4f9e-9f0b-a14f95a047cc/probe/0.log" Mar 18 13:39:24 crc kubenswrapper[4843]: I0318 13:39:24.850087 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8schj" podUID="b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb" containerName="registry-server" probeResult="failure" output=< Mar 18 13:39:24 crc kubenswrapper[4843]: timeout: failed to connect service ":50051" within 1s Mar 18 13:39:24 crc kubenswrapper[4843]: > 
Mar 18 13:39:25 crc kubenswrapper[4843]: I0318 13:39:25.104011 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-5lfmx_bf8d8fe5-8550-45de-9087-a47fb8695b53/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:25 crc kubenswrapper[4843]: I0318 13:39:25.214288 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-wxkmr_6b700040-447e-4328-a5ac-f2a5ee4fc818/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:25 crc kubenswrapper[4843]: I0318 13:39:25.358631 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-br7qm_275e8234-1b33-40c3-ade6-1c75519ca5c2/init/0.log" Mar 18 13:39:25 crc kubenswrapper[4843]: I0318 13:39:25.569389 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-br7qm_275e8234-1b33-40c3-ade6-1c75519ca5c2/dnsmasq-dns/0.log" Mar 18 13:39:25 crc kubenswrapper[4843]: I0318 13:39:25.639273 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-br7qm_275e8234-1b33-40c3-ade6-1c75519ca5c2/init/0.log" Mar 18 13:39:25 crc kubenswrapper[4843]: I0318 13:39:25.713151 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-xgrmm_433cb022-0613-4bf3-81ed-8b4239e48629/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:26 crc kubenswrapper[4843]: I0318 13:39:26.123919 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f0845b03-51e4-4a41-9b69-895e588930be/glance-httpd/0.log" Mar 18 13:39:26 crc kubenswrapper[4843]: I0318 13:39:26.246379 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f0845b03-51e4-4a41-9b69-895e588930be/glance-log/0.log" Mar 18 13:39:26 crc 
kubenswrapper[4843]: I0318 13:39:26.360519 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3d9ce903-7368-4024-bb16-5563248684cc/glance-httpd/0.log" Mar 18 13:39:26 crc kubenswrapper[4843]: I0318 13:39:26.384379 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3d9ce903-7368-4024-bb16-5563248684cc/glance-log/0.log" Mar 18 13:39:26 crc kubenswrapper[4843]: I0318 13:39:26.602111 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5bf5d4bdcb-8xfkn_0199f761-6d2f-4921-8060-6960a0141f0a/horizon/0.log" Mar 18 13:39:26 crc kubenswrapper[4843]: I0318 13:39:26.911688 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-g26vs_86207772-fe8f-4753-a658-3827b5cc18b2/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:27 crc kubenswrapper[4843]: I0318 13:39:27.294996 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5bf5d4bdcb-8xfkn_0199f761-6d2f-4921-8060-6960a0141f0a/horizon-log/0.log" Mar 18 13:39:27 crc kubenswrapper[4843]: I0318 13:39:27.300342 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29563981-r8ktq_ce4c6c09-97bc-4b96-9014-279327ba52b6/keystone-cron/0.log" Mar 18 13:39:27 crc kubenswrapper[4843]: I0318 13:39:27.340192 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-5x8ns_31c490be-9979-4c2c-b49a-191985508c24/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:27 crc kubenswrapper[4843]: I0318 13:39:27.544430 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_c8ecee00-1ed4-4fce-9705-e5e513a922cc/kube-state-metrics/0.log" Mar 18 13:39:27 crc kubenswrapper[4843]: I0318 13:39:27.667287 4843 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-f4bfb58c4-dvx2n_c8cb3725-0d98-427b-9c3f-4ae277b032c4/keystone-api/0.log" Mar 18 13:39:28 crc kubenswrapper[4843]: I0318 13:39:28.414574 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-5jzrk_b908e440-89c5-462b-bab0-861853b924d3/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:28 crc kubenswrapper[4843]: I0318 13:39:28.602573 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-5vc5p_d6c0ebf8-f243-40ce-bd7b-8c6a5e8fd0e7/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:28 crc kubenswrapper[4843]: I0318 13:39:28.733216 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-csdpx_d8417d5f-c42c-4aa5-bd01-97c74ab650c0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:28 crc kubenswrapper[4843]: I0318 13:39:28.834890 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-ftv47_17aa29d2-9988-4fa7-86b4-e62e6f879817/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:29 crc kubenswrapper[4843]: I0318 13:39:29.378608 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-gvmj8_9c9dbb1c-ba27-4b96-a1f6-5bce89ce06c3/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:29 crc kubenswrapper[4843]: I0318 13:39:29.672403 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7555f69bf7-z6jmw_ebe913d5-4e12-4fdc-9968-5caaa8aa1271/neutron-api/0.log" Mar 18 13:39:29 crc kubenswrapper[4843]: I0318 13:39:29.673381 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7555f69bf7-z6jmw_ebe913d5-4e12-4fdc-9968-5caaa8aa1271/neutron-httpd/0.log" Mar 18 13:39:29 crc kubenswrapper[4843]: I0318 13:39:29.747109 4843 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-kng5q_98e7a8b0-08df-4140-94f7-135db9497789/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:29 crc kubenswrapper[4843]: I0318 13:39:29.760414 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-hp6t8_bbe53df4-b4eb-4ff6-ac59-f6532974af67/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:29 crc kubenswrapper[4843]: I0318 13:39:29.978491 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-wxptw_8894c42c-6e17-4c87-84e1-1888c1800b04/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:30 crc kubenswrapper[4843]: I0318 13:39:30.556752 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8d1869df-ee8c-406d-99e1-2c63b2d2c7f3/nova-api-log/0.log" Mar 18 13:39:30 crc kubenswrapper[4843]: I0318 13:39:30.607553 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_e4769626-cdcb-498c-a5e1-0743378e318e/nova-cell0-conductor-conductor/0.log" Mar 18 13:39:30 crc kubenswrapper[4843]: I0318 13:39:30.898055 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_285d4562-f78e-4e15-8802-ad07d22a1e95/nova-cell1-conductor-conductor/0.log" Mar 18 13:39:31 crc kubenswrapper[4843]: I0318 13:39:31.028370 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8d1869df-ee8c-406d-99e1-2c63b2d2c7f3/nova-api-api/0.log" Mar 18 13:39:31 crc kubenswrapper[4843]: I0318 13:39:31.086348 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_205bbfd1-7bde-4af6-a335-3bc3b8338143/nova-cell1-novncproxy-novncproxy/0.log" Mar 18 13:39:31 crc kubenswrapper[4843]: I0318 13:39:31.211069 4843 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_9465d55f-c883-49a5-b007-68821f953f6a/nova-metadata-log/0.log" Mar 18 13:39:31 crc kubenswrapper[4843]: I0318 13:39:31.540789 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_ca3b8254-f0ce-4a83-9c8b-616800d7565f/nova-scheduler-scheduler/0.log" Mar 18 13:39:31 crc kubenswrapper[4843]: I0318 13:39:31.579142 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9f1dd599-3b73-4b6c-8f80-0fbb1ce13520/mysql-bootstrap/0.log" Mar 18 13:39:31 crc kubenswrapper[4843]: I0318 13:39:31.739126 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9f1dd599-3b73-4b6c-8f80-0fbb1ce13520/mysql-bootstrap/0.log" Mar 18 13:39:31 crc kubenswrapper[4843]: I0318 13:39:31.749187 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9f1dd599-3b73-4b6c-8f80-0fbb1ce13520/galera/0.log" Mar 18 13:39:31 crc kubenswrapper[4843]: I0318 13:39:31.891311 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9465d55f-c883-49a5-b007-68821f953f6a/nova-metadata-metadata/0.log" Mar 18 13:39:31 crc kubenswrapper[4843]: I0318 13:39:31.976791 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a384e3a7-e6ad-4832-8218-ba3f11df2c2f/mysql-bootstrap/0.log" Mar 18 13:39:32 crc kubenswrapper[4843]: I0318 13:39:32.151269 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a384e3a7-e6ad-4832-8218-ba3f11df2c2f/mysql-bootstrap/0.log" Mar 18 13:39:32 crc kubenswrapper[4843]: I0318 13:39:32.224797 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a384e3a7-e6ad-4832-8218-ba3f11df2c2f/galera/0.log" Mar 18 13:39:32 crc kubenswrapper[4843]: I0318 13:39:32.234251 4843 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_b7b95404-5989-4c88-969b-4e9afaaebe8e/openstackclient/0.log" Mar 18 13:39:32 crc kubenswrapper[4843]: I0318 13:39:32.975364 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-bpdmm_da7f4dbf-af62-45f2-a204-578e20011760/openstack-network-exporter/0.log" Mar 18 13:39:32 crc kubenswrapper[4843]: I0318 13:39:32.993397 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-5x5rj_a9cc1cd2-018b-40fc-9434-97d649bdd2a8/ovn-controller/0.log" Mar 18 13:39:33 crc kubenswrapper[4843]: I0318 13:39:33.212969 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-s65qf_a8d4e76a-b337-4cdf-a4fe-929389ba6e8c/ovsdb-server-init/0.log" Mar 18 13:39:33 crc kubenswrapper[4843]: I0318 13:39:33.406222 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-s65qf_a8d4e76a-b337-4cdf-a4fe-929389ba6e8c/ovs-vswitchd/0.log" Mar 18 13:39:33 crc kubenswrapper[4843]: I0318 13:39:33.452080 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-s65qf_a8d4e76a-b337-4cdf-a4fe-929389ba6e8c/ovsdb-server-init/0.log" Mar 18 13:39:33 crc kubenswrapper[4843]: I0318 13:39:33.499287 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-s65qf_a8d4e76a-b337-4cdf-a4fe-929389ba6e8c/ovsdb-server/0.log" Mar 18 13:39:33 crc kubenswrapper[4843]: I0318 13:39:33.695433 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_301b1485-ef36-42c7-a6a8-4c3619416072/openstack-network-exporter/0.log" Mar 18 13:39:33 crc kubenswrapper[4843]: I0318 13:39:33.765644 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-rhwls_d49d6abb-b38e-42a3-a374-054ddbd3d2f7/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:33 crc kubenswrapper[4843]: I0318 13:39:33.786759 
4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_301b1485-ef36-42c7-a6a8-4c3619416072/ovn-northd/0.log" Mar 18 13:39:33 crc kubenswrapper[4843]: I0318 13:39:33.942929 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2e6bed4c-30f0-4088-9ac8-2a818cd781d6/openstack-network-exporter/0.log" Mar 18 13:39:34 crc kubenswrapper[4843]: I0318 13:39:34.033359 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2e6bed4c-30f0-4088-9ac8-2a818cd781d6/ovsdbserver-nb/0.log" Mar 18 13:39:34 crc kubenswrapper[4843]: I0318 13:39:34.160939 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7e2f8130-0164-41ca-aa4a-5e206b21bef2/openstack-network-exporter/0.log" Mar 18 13:39:34 crc kubenswrapper[4843]: I0318 13:39:34.800148 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7e2f8130-0164-41ca-aa4a-5e206b21bef2/ovsdbserver-sb/0.log" Mar 18 13:39:34 crc kubenswrapper[4843]: I0318 13:39:34.845230 4843 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8schj" podUID="b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb" containerName="registry-server" probeResult="failure" output=< Mar 18 13:39:34 crc kubenswrapper[4843]: timeout: failed to connect service ":50051" within 1s Mar 18 13:39:34 crc kubenswrapper[4843]: > Mar 18 13:39:34 crc kubenswrapper[4843]: I0318 13:39:34.996417 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-64d556c464-829sb_0bb677d0-a346-4663-b989-13b846766c47/placement-api/0.log" Mar 18 13:39:35 crc kubenswrapper[4843]: I0318 13:39:35.043777 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-64d556c464-829sb_0bb677d0-a346-4663-b989-13b846766c47/placement-log/0.log" Mar 18 13:39:35 crc kubenswrapper[4843]: I0318 13:39:35.204581 4843 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3aaa37b6-550a-4bd8-a166-af337e05defd/setup-container/0.log" Mar 18 13:39:35 crc kubenswrapper[4843]: I0318 13:39:35.425843 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3aaa37b6-550a-4bd8-a166-af337e05defd/setup-container/0.log" Mar 18 13:39:35 crc kubenswrapper[4843]: I0318 13:39:35.461287 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3aaa37b6-550a-4bd8-a166-af337e05defd/rabbitmq/0.log" Mar 18 13:39:35 crc kubenswrapper[4843]: I0318 13:39:35.470041 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63d21391-4df5-4d15-a12d-7ac03c66194c/setup-container/0.log" Mar 18 13:39:35 crc kubenswrapper[4843]: I0318 13:39:35.730581 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63d21391-4df5-4d15-a12d-7ac03c66194c/setup-container/0.log" Mar 18 13:39:35 crc kubenswrapper[4843]: I0318 13:39:35.772910 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63d21391-4df5-4d15-a12d-7ac03c66194c/rabbitmq/0.log" Mar 18 13:39:35 crc kubenswrapper[4843]: I0318 13:39:35.783560 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-2btbj_6a80f3a2-c501-4f2f-8304-41fd472da368/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:35 crc kubenswrapper[4843]: I0318 13:39:35.983893 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f" Mar 18 13:39:35 crc kubenswrapper[4843]: E0318 13:39:35.984232 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:39:36 crc kubenswrapper[4843]: I0318 13:39:36.010480 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-2shvr_c4d5f3c0-524f-472b-b3f0-bcb8f735420f/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:36 crc kubenswrapper[4843]: I0318 13:39:36.011506 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-6gm9x_76ab23d5-a490-4e61-b3ae-27e991303a9c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:36 crc kubenswrapper[4843]: I0318 13:39:36.341420 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-kwfgn_c769322f-958a-43ca-b4f6-7379465f6276/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:36 crc kubenswrapper[4843]: I0318 13:39:36.356627 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-mxj5b_f545b271-e719-4624-9fdf-c8e26500c8ab/ssh-known-hosts-edpm-deployment/0.log" Mar 18 13:39:36 crc kubenswrapper[4843]: I0318 13:39:36.601197 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76c6d69747-h572n_5001539a-ee9d-44c9-bcab-58e3720323ae/proxy-server/0.log" Mar 18 13:39:36 crc kubenswrapper[4843]: I0318 13:39:36.713275 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-9bkwq_570c96ec-626f-41e2-bf9b-5da8f8d65fa2/swift-ring-rebalance/0.log" Mar 18 13:39:36 crc kubenswrapper[4843]: I0318 13:39:36.755818 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76c6d69747-h572n_5001539a-ee9d-44c9-bcab-58e3720323ae/proxy-httpd/0.log" Mar 18 13:39:36 crc 
kubenswrapper[4843]: I0318 13:39:36.909829 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4e07598e-c70f-4beb-a828-b58cb64c38c0/account-auditor/0.log" Mar 18 13:39:36 crc kubenswrapper[4843]: I0318 13:39:36.980257 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4e07598e-c70f-4beb-a828-b58cb64c38c0/account-reaper/0.log" Mar 18 13:39:37 crc kubenswrapper[4843]: I0318 13:39:37.058702 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4e07598e-c70f-4beb-a828-b58cb64c38c0/account-replicator/0.log" Mar 18 13:39:37 crc kubenswrapper[4843]: I0318 13:39:37.196109 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4e07598e-c70f-4beb-a828-b58cb64c38c0/container-auditor/0.log" Mar 18 13:39:37 crc kubenswrapper[4843]: I0318 13:39:37.274353 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4e07598e-c70f-4beb-a828-b58cb64c38c0/account-server/0.log" Mar 18 13:39:37 crc kubenswrapper[4843]: I0318 13:39:37.368217 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4e07598e-c70f-4beb-a828-b58cb64c38c0/container-replicator/0.log" Mar 18 13:39:37 crc kubenswrapper[4843]: I0318 13:39:37.389482 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4e07598e-c70f-4beb-a828-b58cb64c38c0/container-server/0.log" Mar 18 13:39:37 crc kubenswrapper[4843]: I0318 13:39:37.511635 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4e07598e-c70f-4beb-a828-b58cb64c38c0/container-updater/0.log" Mar 18 13:39:37 crc kubenswrapper[4843]: I0318 13:39:37.515557 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4e07598e-c70f-4beb-a828-b58cb64c38c0/object-auditor/0.log" Mar 18 13:39:37 crc kubenswrapper[4843]: I0318 13:39:37.628690 4843 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-storage-0_4e07598e-c70f-4beb-a828-b58cb64c38c0/object-expirer/0.log" Mar 18 13:39:37 crc kubenswrapper[4843]: I0318 13:39:37.656985 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4e07598e-c70f-4beb-a828-b58cb64c38c0/object-replicator/0.log" Mar 18 13:39:37 crc kubenswrapper[4843]: I0318 13:39:37.720180 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4e07598e-c70f-4beb-a828-b58cb64c38c0/object-server/0.log" Mar 18 13:39:37 crc kubenswrapper[4843]: I0318 13:39:37.807841 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4e07598e-c70f-4beb-a828-b58cb64c38c0/object-updater/0.log" Mar 18 13:39:37 crc kubenswrapper[4843]: I0318 13:39:37.821486 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4e07598e-c70f-4beb-a828-b58cb64c38c0/rsync/0.log" Mar 18 13:39:37 crc kubenswrapper[4843]: I0318 13:39:37.881613 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4e07598e-c70f-4beb-a828-b58cb64c38c0/swift-recon-cron/0.log" Mar 18 13:39:38 crc kubenswrapper[4843]: I0318 13:39:38.085044 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-78z9w_4ed71bc4-df8e-47f7-87f0-bd7173061676/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:39:43 crc kubenswrapper[4843]: I0318 13:39:43.752306 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_4c0d669e-b241-4702-96c3-2de893c52987/memcached/0.log" Mar 18 13:39:43 crc kubenswrapper[4843]: I0318 13:39:43.861601 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8schj" Mar 18 13:39:43 crc kubenswrapper[4843]: I0318 13:39:43.932257 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-8schj" Mar 18 13:39:44 crc kubenswrapper[4843]: I0318 13:39:44.643899 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8schj"] Mar 18 13:39:45 crc kubenswrapper[4843]: I0318 13:39:45.074257 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8schj" podUID="b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb" containerName="registry-server" containerID="cri-o://3114db1e9e228132ebe7c719d55a0da08aa043e20d903037cbbd0b1d33b5db7e" gracePeriod=2 Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.087960 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8schj" Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.089231 4843 generic.go:334] "Generic (PLEG): container finished" podID="b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb" containerID="3114db1e9e228132ebe7c719d55a0da08aa043e20d903037cbbd0b1d33b5db7e" exitCode=0 Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.089272 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8schj" event={"ID":"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb","Type":"ContainerDied","Data":"3114db1e9e228132ebe7c719d55a0da08aa043e20d903037cbbd0b1d33b5db7e"} Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.089302 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8schj" event={"ID":"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb","Type":"ContainerDied","Data":"c1b46610f5920cce99159cd734f9e46b96f7f2485ce96e654c54a876472ba9d6"} Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.089323 4843 scope.go:117] "RemoveContainer" containerID="3114db1e9e228132ebe7c719d55a0da08aa043e20d903037cbbd0b1d33b5db7e" Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.123685 4843 scope.go:117] "RemoveContainer" 
containerID="3f4791407bd14a80ce0b5fe3a5937254c955f791bf0cf5e65f96d74cb135ca06" Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.176945 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb-catalog-content\") pod \"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb\" (UID: \"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb\") " Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.177053 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhpss\" (UniqueName: \"kubernetes.io/projected/b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb-kube-api-access-bhpss\") pod \"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb\" (UID: \"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb\") " Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.177291 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb-utilities\") pod \"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb\" (UID: \"b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb\") " Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.178323 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb-utilities" (OuterVolumeSpecName: "utilities") pod "b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb" (UID: "b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.186988 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb-kube-api-access-bhpss" (OuterVolumeSpecName: "kube-api-access-bhpss") pod "b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb" (UID: "b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb"). InnerVolumeSpecName "kube-api-access-bhpss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.194246 4843 scope.go:117] "RemoveContainer" containerID="1769b13afd8ce6e44beed52c6e0d2c995e89df63a855098887fc72c9fee7feef" Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.278531 4843 scope.go:117] "RemoveContainer" containerID="3114db1e9e228132ebe7c719d55a0da08aa043e20d903037cbbd0b1d33b5db7e" Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.280380 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.280410 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhpss\" (UniqueName: \"kubernetes.io/projected/b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb-kube-api-access-bhpss\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:46 crc kubenswrapper[4843]: E0318 13:39:46.280536 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3114db1e9e228132ebe7c719d55a0da08aa043e20d903037cbbd0b1d33b5db7e\": container with ID starting with 3114db1e9e228132ebe7c719d55a0da08aa043e20d903037cbbd0b1d33b5db7e not found: ID does not exist" containerID="3114db1e9e228132ebe7c719d55a0da08aa043e20d903037cbbd0b1d33b5db7e" Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.280566 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3114db1e9e228132ebe7c719d55a0da08aa043e20d903037cbbd0b1d33b5db7e"} err="failed to get container status \"3114db1e9e228132ebe7c719d55a0da08aa043e20d903037cbbd0b1d33b5db7e\": rpc error: code = NotFound desc = could not find container \"3114db1e9e228132ebe7c719d55a0da08aa043e20d903037cbbd0b1d33b5db7e\": container with ID starting with 3114db1e9e228132ebe7c719d55a0da08aa043e20d903037cbbd0b1d33b5db7e not found: ID 
does not exist" Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.280592 4843 scope.go:117] "RemoveContainer" containerID="3f4791407bd14a80ce0b5fe3a5937254c955f791bf0cf5e65f96d74cb135ca06" Mar 18 13:39:46 crc kubenswrapper[4843]: E0318 13:39:46.281250 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f4791407bd14a80ce0b5fe3a5937254c955f791bf0cf5e65f96d74cb135ca06\": container with ID starting with 3f4791407bd14a80ce0b5fe3a5937254c955f791bf0cf5e65f96d74cb135ca06 not found: ID does not exist" containerID="3f4791407bd14a80ce0b5fe3a5937254c955f791bf0cf5e65f96d74cb135ca06" Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.281310 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4791407bd14a80ce0b5fe3a5937254c955f791bf0cf5e65f96d74cb135ca06"} err="failed to get container status \"3f4791407bd14a80ce0b5fe3a5937254c955f791bf0cf5e65f96d74cb135ca06\": rpc error: code = NotFound desc = could not find container \"3f4791407bd14a80ce0b5fe3a5937254c955f791bf0cf5e65f96d74cb135ca06\": container with ID starting with 3f4791407bd14a80ce0b5fe3a5937254c955f791bf0cf5e65f96d74cb135ca06 not found: ID does not exist" Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.281342 4843 scope.go:117] "RemoveContainer" containerID="1769b13afd8ce6e44beed52c6e0d2c995e89df63a855098887fc72c9fee7feef" Mar 18 13:39:46 crc kubenswrapper[4843]: E0318 13:39:46.285352 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1769b13afd8ce6e44beed52c6e0d2c995e89df63a855098887fc72c9fee7feef\": container with ID starting with 1769b13afd8ce6e44beed52c6e0d2c995e89df63a855098887fc72c9fee7feef not found: ID does not exist" containerID="1769b13afd8ce6e44beed52c6e0d2c995e89df63a855098887fc72c9fee7feef" Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.285417 4843 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1769b13afd8ce6e44beed52c6e0d2c995e89df63a855098887fc72c9fee7feef"} err="failed to get container status \"1769b13afd8ce6e44beed52c6e0d2c995e89df63a855098887fc72c9fee7feef\": rpc error: code = NotFound desc = could not find container \"1769b13afd8ce6e44beed52c6e0d2c995e89df63a855098887fc72c9fee7feef\": container with ID starting with 1769b13afd8ce6e44beed52c6e0d2c995e89df63a855098887fc72c9fee7feef not found: ID does not exist" Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.377931 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb" (UID: "b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:39:46 crc kubenswrapper[4843]: I0318 13:39:46.382029 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:47 crc kubenswrapper[4843]: I0318 13:39:47.101011 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8schj" Mar 18 13:39:47 crc kubenswrapper[4843]: I0318 13:39:47.126760 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8schj"] Mar 18 13:39:47 crc kubenswrapper[4843]: I0318 13:39:47.136019 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8schj"] Mar 18 13:39:48 crc kubenswrapper[4843]: I0318 13:39:48.996463 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb" path="/var/lib/kubelet/pods/b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb/volumes" Mar 18 13:39:49 crc kubenswrapper[4843]: I0318 13:39:49.984617 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f" Mar 18 13:39:49 crc kubenswrapper[4843]: E0318 13:39:49.985194 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:39:57 crc kubenswrapper[4843]: I0318 13:39:57.723895 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wjqzj"] Mar 18 13:39:57 crc kubenswrapper[4843]: E0318 13:39:57.724915 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb" containerName="extract-content" Mar 18 13:39:57 crc kubenswrapper[4843]: I0318 13:39:57.724935 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb" containerName="extract-content" Mar 18 13:39:57 crc kubenswrapper[4843]: E0318 13:39:57.724965 4843 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb" containerName="extract-utilities" Mar 18 13:39:57 crc kubenswrapper[4843]: I0318 13:39:57.724973 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb" containerName="extract-utilities" Mar 18 13:39:57 crc kubenswrapper[4843]: E0318 13:39:57.725006 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb" containerName="registry-server" Mar 18 13:39:57 crc kubenswrapper[4843]: I0318 13:39:57.725016 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb" containerName="registry-server" Mar 18 13:39:57 crc kubenswrapper[4843]: I0318 13:39:57.725227 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="b53ac867-59ad-4f5d-ae2a-1bd70edb1bcb" containerName="registry-server" Mar 18 13:39:57 crc kubenswrapper[4843]: I0318 13:39:57.726700 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wjqzj" Mar 18 13:39:57 crc kubenswrapper[4843]: I0318 13:39:57.747498 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wjqzj"] Mar 18 13:39:57 crc kubenswrapper[4843]: I0318 13:39:57.810976 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc2tr\" (UniqueName: \"kubernetes.io/projected/75dc0699-c72c-4f38-bdf5-594eed550a2b-kube-api-access-dc2tr\") pod \"certified-operators-wjqzj\" (UID: \"75dc0699-c72c-4f38-bdf5-594eed550a2b\") " pod="openshift-marketplace/certified-operators-wjqzj" Mar 18 13:39:57 crc kubenswrapper[4843]: I0318 13:39:57.811055 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dc0699-c72c-4f38-bdf5-594eed550a2b-utilities\") pod \"certified-operators-wjqzj\" (UID: \"75dc0699-c72c-4f38-bdf5-594eed550a2b\") " pod="openshift-marketplace/certified-operators-wjqzj" Mar 18 13:39:57 crc kubenswrapper[4843]: I0318 13:39:57.811328 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dc0699-c72c-4f38-bdf5-594eed550a2b-catalog-content\") pod \"certified-operators-wjqzj\" (UID: \"75dc0699-c72c-4f38-bdf5-594eed550a2b\") " pod="openshift-marketplace/certified-operators-wjqzj" Mar 18 13:39:57 crc kubenswrapper[4843]: I0318 13:39:57.961463 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dc0699-c72c-4f38-bdf5-594eed550a2b-utilities\") pod \"certified-operators-wjqzj\" (UID: \"75dc0699-c72c-4f38-bdf5-594eed550a2b\") " pod="openshift-marketplace/certified-operators-wjqzj" Mar 18 13:39:57 crc kubenswrapper[4843]: I0318 13:39:57.961579 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dc0699-c72c-4f38-bdf5-594eed550a2b-catalog-content\") pod \"certified-operators-wjqzj\" (UID: \"75dc0699-c72c-4f38-bdf5-594eed550a2b\") " pod="openshift-marketplace/certified-operators-wjqzj" Mar 18 13:39:57 crc kubenswrapper[4843]: I0318 13:39:57.961753 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc2tr\" (UniqueName: \"kubernetes.io/projected/75dc0699-c72c-4f38-bdf5-594eed550a2b-kube-api-access-dc2tr\") pod \"certified-operators-wjqzj\" (UID: \"75dc0699-c72c-4f38-bdf5-594eed550a2b\") " pod="openshift-marketplace/certified-operators-wjqzj" Mar 18 13:39:57 crc kubenswrapper[4843]: I0318 13:39:57.962065 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dc0699-c72c-4f38-bdf5-594eed550a2b-utilities\") pod \"certified-operators-wjqzj\" (UID: \"75dc0699-c72c-4f38-bdf5-594eed550a2b\") " pod="openshift-marketplace/certified-operators-wjqzj" Mar 18 13:39:57 crc kubenswrapper[4843]: I0318 13:39:57.962081 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dc0699-c72c-4f38-bdf5-594eed550a2b-catalog-content\") pod \"certified-operators-wjqzj\" (UID: \"75dc0699-c72c-4f38-bdf5-594eed550a2b\") " pod="openshift-marketplace/certified-operators-wjqzj" Mar 18 13:39:57 crc kubenswrapper[4843]: I0318 13:39:57.999415 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc2tr\" (UniqueName: \"kubernetes.io/projected/75dc0699-c72c-4f38-bdf5-594eed550a2b-kube-api-access-dc2tr\") pod \"certified-operators-wjqzj\" (UID: \"75dc0699-c72c-4f38-bdf5-594eed550a2b\") " pod="openshift-marketplace/certified-operators-wjqzj" Mar 18 13:39:58 crc kubenswrapper[4843]: I0318 13:39:58.052132 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wjqzj" Mar 18 13:39:58 crc kubenswrapper[4843]: I0318 13:39:58.752486 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wjqzj"] Mar 18 13:39:59 crc kubenswrapper[4843]: I0318 13:39:59.229180 4843 generic.go:334] "Generic (PLEG): container finished" podID="75dc0699-c72c-4f38-bdf5-594eed550a2b" containerID="74b55781e2f11de157f1a9906e69ff33f0582fc52d8ebf9b3ae49d3749e3c9ee" exitCode=0 Mar 18 13:39:59 crc kubenswrapper[4843]: I0318 13:39:59.229236 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjqzj" event={"ID":"75dc0699-c72c-4f38-bdf5-594eed550a2b","Type":"ContainerDied","Data":"74b55781e2f11de157f1a9906e69ff33f0582fc52d8ebf9b3ae49d3749e3c9ee"} Mar 18 13:39:59 crc kubenswrapper[4843]: I0318 13:39:59.229267 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjqzj" event={"ID":"75dc0699-c72c-4f38-bdf5-594eed550a2b","Type":"ContainerStarted","Data":"7298e182c6fcc71d1f3427df4feea79500a56db89227e6f14437bb4d4f3830a1"} Mar 18 13:40:00 crc kubenswrapper[4843]: I0318 13:40:00.162629 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564020-gpzl4"] Mar 18 13:40:00 crc kubenswrapper[4843]: I0318 13:40:00.165671 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564020-gpzl4" Mar 18 13:40:00 crc kubenswrapper[4843]: I0318 13:40:00.168566 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:40:00 crc kubenswrapper[4843]: I0318 13:40:00.168800 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:40:00 crc kubenswrapper[4843]: I0318 13:40:00.169052 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 13:40:00 crc kubenswrapper[4843]: I0318 13:40:00.173160 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564020-gpzl4"] Mar 18 13:40:00 crc kubenswrapper[4843]: I0318 13:40:00.243103 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjqzj" event={"ID":"75dc0699-c72c-4f38-bdf5-594eed550a2b","Type":"ContainerStarted","Data":"17bb5f418865c935f5917bbb86d9732bbb14b634fdadc865416dc1cab6a0bd08"} Mar 18 13:40:00 crc kubenswrapper[4843]: I0318 13:40:00.361284 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mfkm\" (UniqueName: \"kubernetes.io/projected/60e10d34-3d48-4bb0-af2f-802e0e3fa75e-kube-api-access-5mfkm\") pod \"auto-csr-approver-29564020-gpzl4\" (UID: \"60e10d34-3d48-4bb0-af2f-802e0e3fa75e\") " pod="openshift-infra/auto-csr-approver-29564020-gpzl4" Mar 18 13:40:00 crc kubenswrapper[4843]: I0318 13:40:00.463464 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mfkm\" (UniqueName: \"kubernetes.io/projected/60e10d34-3d48-4bb0-af2f-802e0e3fa75e-kube-api-access-5mfkm\") pod \"auto-csr-approver-29564020-gpzl4\" (UID: \"60e10d34-3d48-4bb0-af2f-802e0e3fa75e\") " pod="openshift-infra/auto-csr-approver-29564020-gpzl4" Mar 18 13:40:00 crc 
kubenswrapper[4843]: I0318 13:40:00.484105 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mfkm\" (UniqueName: \"kubernetes.io/projected/60e10d34-3d48-4bb0-af2f-802e0e3fa75e-kube-api-access-5mfkm\") pod \"auto-csr-approver-29564020-gpzl4\" (UID: \"60e10d34-3d48-4bb0-af2f-802e0e3fa75e\") " pod="openshift-infra/auto-csr-approver-29564020-gpzl4" Mar 18 13:40:00 crc kubenswrapper[4843]: I0318 13:40:00.525591 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564020-gpzl4" Mar 18 13:40:00 crc kubenswrapper[4843]: I0318 13:40:00.984821 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f" Mar 18 13:40:00 crc kubenswrapper[4843]: W0318 13:40:00.990314 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60e10d34_3d48_4bb0_af2f_802e0e3fa75e.slice/crio-a92ae3800d51dae1191528403e258ff0e0976650c3619f5ce435421db8f89c0e WatchSource:0}: Error finding container a92ae3800d51dae1191528403e258ff0e0976650c3619f5ce435421db8f89c0e: Status 404 returned error can't find the container with id a92ae3800d51dae1191528403e258ff0e0976650c3619f5ce435421db8f89c0e Mar 18 13:40:00 crc kubenswrapper[4843]: I0318 13:40:00.997505 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564020-gpzl4"] Mar 18 13:40:01 crc kubenswrapper[4843]: I0318 13:40:01.266219 4843 generic.go:334] "Generic (PLEG): container finished" podID="75dc0699-c72c-4f38-bdf5-594eed550a2b" containerID="17bb5f418865c935f5917bbb86d9732bbb14b634fdadc865416dc1cab6a0bd08" exitCode=0 Mar 18 13:40:01 crc kubenswrapper[4843]: I0318 13:40:01.266876 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjqzj" 
event={"ID":"75dc0699-c72c-4f38-bdf5-594eed550a2b","Type":"ContainerDied","Data":"17bb5f418865c935f5917bbb86d9732bbb14b634fdadc865416dc1cab6a0bd08"} Mar 18 13:40:01 crc kubenswrapper[4843]: I0318 13:40:01.271822 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564020-gpzl4" event={"ID":"60e10d34-3d48-4bb0-af2f-802e0e3fa75e","Type":"ContainerStarted","Data":"a92ae3800d51dae1191528403e258ff0e0976650c3619f5ce435421db8f89c0e"} Mar 18 13:40:02 crc kubenswrapper[4843]: I0318 13:40:02.284719 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"23bfc8498d903deeb680a9817a6086e184f5afd90d3ed161f929039e9a0b59b1"} Mar 18 13:40:02 crc kubenswrapper[4843]: I0318 13:40:02.287140 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjqzj" event={"ID":"75dc0699-c72c-4f38-bdf5-594eed550a2b","Type":"ContainerStarted","Data":"a509c65b059c1abadbc10f8f6d960e4329204682a9516c8a6fdf57773e33f459"} Mar 18 13:40:02 crc kubenswrapper[4843]: I0318 13:40:02.385972 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wjqzj" podStartSLOduration=2.667710552 podStartE2EDuration="5.385945956s" podCreationTimestamp="2026-03-18 13:39:57 +0000 UTC" firstStartedPulling="2026-03-18 13:39:59.233123943 +0000 UTC m=+5432.948949467" lastFinishedPulling="2026-03-18 13:40:01.951359347 +0000 UTC m=+5435.667184871" observedRunningTime="2026-03-18 13:40:02.332317296 +0000 UTC m=+5436.048142820" watchObservedRunningTime="2026-03-18 13:40:02.385945956 +0000 UTC m=+5436.101771480" Mar 18 13:40:03 crc kubenswrapper[4843]: I0318 13:40:03.299644 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564020-gpzl4" 
event={"ID":"60e10d34-3d48-4bb0-af2f-802e0e3fa75e","Type":"ContainerStarted","Data":"c76e50f7758c1c88ffd9888832810567c1e7286f3d8bb89b489deb24fac65c13"}
Mar 18 13:40:03 crc kubenswrapper[4843]: I0318 13:40:03.316360 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564020-gpzl4" podStartSLOduration=1.62795139 podStartE2EDuration="3.31633512s" podCreationTimestamp="2026-03-18 13:40:00 +0000 UTC" firstStartedPulling="2026-03-18 13:40:00.99332596 +0000 UTC m=+5434.709151484" lastFinishedPulling="2026-03-18 13:40:02.68170969 +0000 UTC m=+5436.397535214" observedRunningTime="2026-03-18 13:40:03.313053307 +0000 UTC m=+5437.028878831" watchObservedRunningTime="2026-03-18 13:40:03.31633512 +0000 UTC m=+5437.032160644"
Mar 18 13:40:04 crc kubenswrapper[4843]: I0318 13:40:04.453645 4843 generic.go:334] "Generic (PLEG): container finished" podID="60e10d34-3d48-4bb0-af2f-802e0e3fa75e" containerID="c76e50f7758c1c88ffd9888832810567c1e7286f3d8bb89b489deb24fac65c13" exitCode=0
Mar 18 13:40:04 crc kubenswrapper[4843]: I0318 13:40:04.453976 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564020-gpzl4" event={"ID":"60e10d34-3d48-4bb0-af2f-802e0e3fa75e","Type":"ContainerDied","Data":"c76e50f7758c1c88ffd9888832810567c1e7286f3d8bb89b489deb24fac65c13"}
Mar 18 13:40:06 crc kubenswrapper[4843]: I0318 13:40:06.002630 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564020-gpzl4"
Mar 18 13:40:06 crc kubenswrapper[4843]: I0318 13:40:06.136866 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mfkm\" (UniqueName: \"kubernetes.io/projected/60e10d34-3d48-4bb0-af2f-802e0e3fa75e-kube-api-access-5mfkm\") pod \"60e10d34-3d48-4bb0-af2f-802e0e3fa75e\" (UID: \"60e10d34-3d48-4bb0-af2f-802e0e3fa75e\") "
Mar 18 13:40:06 crc kubenswrapper[4843]: I0318 13:40:06.146926 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60e10d34-3d48-4bb0-af2f-802e0e3fa75e-kube-api-access-5mfkm" (OuterVolumeSpecName: "kube-api-access-5mfkm") pod "60e10d34-3d48-4bb0-af2f-802e0e3fa75e" (UID: "60e10d34-3d48-4bb0-af2f-802e0e3fa75e"). InnerVolumeSpecName "kube-api-access-5mfkm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:40:06 crc kubenswrapper[4843]: I0318 13:40:06.238876 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mfkm\" (UniqueName: \"kubernetes.io/projected/60e10d34-3d48-4bb0-af2f-802e0e3fa75e-kube-api-access-5mfkm\") on node \"crc\" DevicePath \"\""
Mar 18 13:40:06 crc kubenswrapper[4843]: I0318 13:40:06.482566 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564020-gpzl4" event={"ID":"60e10d34-3d48-4bb0-af2f-802e0e3fa75e","Type":"ContainerDied","Data":"a92ae3800d51dae1191528403e258ff0e0976650c3619f5ce435421db8f89c0e"}
Mar 18 13:40:06 crc kubenswrapper[4843]: I0318 13:40:06.482614 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a92ae3800d51dae1191528403e258ff0e0976650c3619f5ce435421db8f89c0e"
Mar 18 13:40:06 crc kubenswrapper[4843]: I0318 13:40:06.482713 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564020-gpzl4"
Mar 18 13:40:06 crc kubenswrapper[4843]: I0318 13:40:06.536254 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564014-kmr7n"]
Mar 18 13:40:06 crc kubenswrapper[4843]: I0318 13:40:06.544722 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564014-kmr7n"]
Mar 18 13:40:07 crc kubenswrapper[4843]: I0318 13:40:07.038341 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f8b955e-7dbb-422e-9924-a4bbf836a579" path="/var/lib/kubelet/pods/9f8b955e-7dbb-422e-9924-a4bbf836a579/volumes"
Mar 18 13:40:08 crc kubenswrapper[4843]: I0318 13:40:08.147379 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wjqzj"
Mar 18 13:40:08 crc kubenswrapper[4843]: I0318 13:40:08.147666 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wjqzj"
Mar 18 13:40:08 crc kubenswrapper[4843]: I0318 13:40:08.212925 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wjqzj"
Mar 18 13:40:08 crc kubenswrapper[4843]: I0318 13:40:08.552130 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wjqzj"
Mar 18 13:40:08 crc kubenswrapper[4843]: I0318 13:40:08.612217 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wjqzj"]
Mar 18 13:40:08 crc kubenswrapper[4843]: I0318 13:40:08.758693 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb_f4c2f3d7-acdb-4867-88ea-4c27f043a32b/util/0.log"
Mar 18 13:40:08 crc kubenswrapper[4843]: I0318 13:40:08.914069 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb_f4c2f3d7-acdb-4867-88ea-4c27f043a32b/util/0.log"
Mar 18 13:40:08 crc kubenswrapper[4843]: I0318 13:40:08.944318 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb_f4c2f3d7-acdb-4867-88ea-4c27f043a32b/pull/0.log"
Mar 18 13:40:08 crc kubenswrapper[4843]: I0318 13:40:08.944543 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb_f4c2f3d7-acdb-4867-88ea-4c27f043a32b/pull/0.log"
Mar 18 13:40:09 crc kubenswrapper[4843]: I0318 13:40:09.124462 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb_f4c2f3d7-acdb-4867-88ea-4c27f043a32b/util/0.log"
Mar 18 13:40:09 crc kubenswrapper[4843]: I0318 13:40:09.145477 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb_f4c2f3d7-acdb-4867-88ea-4c27f043a32b/pull/0.log"
Mar 18 13:40:09 crc kubenswrapper[4843]: I0318 13:40:09.152220 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e95dbb_f4c2f3d7-acdb-4867-88ea-4c27f043a32b/extract/0.log"
Mar 18 13:40:09 crc kubenswrapper[4843]: I0318 13:40:09.441755 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-27wfc_e8181c6a-2135-4961-8b68-410a69e807f2/manager/0.log"
Mar 18 13:40:09 crc kubenswrapper[4843]: I0318 13:40:09.576792 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-hdzfh_d375d53d-379d-43a3-ae17-3b4853b69f08/manager/0.log"
Mar 18 13:40:09 crc kubenswrapper[4843]: I0318 13:40:09.862593 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-qfshp_e55d0291-8b77-48ba-921f-ae8f2b3f7289/manager/0.log"
Mar 18 13:40:10 crc kubenswrapper[4843]: I0318 13:40:10.530840 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wjqzj" podUID="75dc0699-c72c-4f38-bdf5-594eed550a2b" containerName="registry-server" containerID="cri-o://a509c65b059c1abadbc10f8f6d960e4329204682a9516c8a6fdf57773e33f459" gracePeriod=2
Mar 18 13:40:10 crc kubenswrapper[4843]: I0318 13:40:10.542896 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-m49jb_7676b45d-b133-49b7-80ce-b8cfd4bb27bc/manager/0.log"
Mar 18 13:40:10 crc kubenswrapper[4843]: I0318 13:40:10.581890 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-z9q56_65ec012e-03c7-44d2-a280-4349c82db2b9/manager/0.log"
Mar 18 13:40:10 crc kubenswrapper[4843]: I0318 13:40:10.790944 4843 scope.go:117] "RemoveContainer" containerID="ec9706175d8e9936451caa65c278bce56918d4a65e792237e27c640f66485d60"
Mar 18 13:40:11 crc kubenswrapper[4843]: I0318 13:40:11.123854 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-vrstp_36803f94-ff88-4cf3-ad7b-6fd7e6222a96/manager/0.log"
Mar 18 13:40:11 crc kubenswrapper[4843]: I0318 13:40:11.379372 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-92l6c_78bea798-f276-4bec-9b9f-32148b813f3e/manager/0.log"
Mar 18 13:40:11 crc kubenswrapper[4843]: I0318 13:40:11.448463 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-mr8bb_5d1a187a-26d1-4e3f-84b2-54e7c5c596f9/manager/0.log"
Mar 18 13:40:11 crc kubenswrapper[4843]: I0318 13:40:11.488700 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-bs4lw_3d98bd17-e9d9-414f-a4b6-95975ab5ba2f/manager/0.log"
Mar 18 13:40:11 crc kubenswrapper[4843]: I0318 13:40:11.543115 4843 generic.go:334] "Generic (PLEG): container finished" podID="75dc0699-c72c-4f38-bdf5-594eed550a2b" containerID="a509c65b059c1abadbc10f8f6d960e4329204682a9516c8a6fdf57773e33f459" exitCode=0
Mar 18 13:40:11 crc kubenswrapper[4843]: I0318 13:40:11.543169 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjqzj" event={"ID":"75dc0699-c72c-4f38-bdf5-594eed550a2b","Type":"ContainerDied","Data":"a509c65b059c1abadbc10f8f6d960e4329204682a9516c8a6fdf57773e33f459"}
Mar 18 13:40:11 crc kubenswrapper[4843]: I0318 13:40:11.564246 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-sg29f_c4ef047f-6e31-4338-8710-556d04c03f41/manager/0.log"
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.315251 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wjqzj"
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.381251 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dc0699-c72c-4f38-bdf5-594eed550a2b-utilities\") pod \"75dc0699-c72c-4f38-bdf5-594eed550a2b\" (UID: \"75dc0699-c72c-4f38-bdf5-594eed550a2b\") "
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.381391 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc2tr\" (UniqueName: \"kubernetes.io/projected/75dc0699-c72c-4f38-bdf5-594eed550a2b-kube-api-access-dc2tr\") pod \"75dc0699-c72c-4f38-bdf5-594eed550a2b\" (UID: \"75dc0699-c72c-4f38-bdf5-594eed550a2b\") "
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.381688 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dc0699-c72c-4f38-bdf5-594eed550a2b-catalog-content\") pod \"75dc0699-c72c-4f38-bdf5-594eed550a2b\" (UID: \"75dc0699-c72c-4f38-bdf5-594eed550a2b\") "
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.394325 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75dc0699-c72c-4f38-bdf5-594eed550a2b-utilities" (OuterVolumeSpecName: "utilities") pod "75dc0699-c72c-4f38-bdf5-594eed550a2b" (UID: "75dc0699-c72c-4f38-bdf5-594eed550a2b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.417035 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75dc0699-c72c-4f38-bdf5-594eed550a2b-kube-api-access-dc2tr" (OuterVolumeSpecName: "kube-api-access-dc2tr") pod "75dc0699-c72c-4f38-bdf5-594eed550a2b" (UID: "75dc0699-c72c-4f38-bdf5-594eed550a2b"). InnerVolumeSpecName "kube-api-access-dc2tr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.460481 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75dc0699-c72c-4f38-bdf5-594eed550a2b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75dc0699-c72c-4f38-bdf5-594eed550a2b" (UID: "75dc0699-c72c-4f38-bdf5-594eed550a2b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.483297 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dc0699-c72c-4f38-bdf5-594eed550a2b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.483336 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dc0699-c72c-4f38-bdf5-594eed550a2b-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.483347 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc2tr\" (UniqueName: \"kubernetes.io/projected/75dc0699-c72c-4f38-bdf5-594eed550a2b-kube-api-access-dc2tr\") on node \"crc\" DevicePath \"\""
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.484401 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-lg5fn_c827df2a-19e1-4a20-bf4e-dffab3abe636/manager/0.log"
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.506457 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-xkf8s_20256137-950a-4141-b73a-5993a2bc30d8/manager/0.log"
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.556352 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjqzj" event={"ID":"75dc0699-c72c-4f38-bdf5-594eed550a2b","Type":"ContainerDied","Data":"7298e182c6fcc71d1f3427df4feea79500a56db89227e6f14437bb4d4f3830a1"}
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.556418 4843 scope.go:117] "RemoveContainer" containerID="a509c65b059c1abadbc10f8f6d960e4329204682a9516c8a6fdf57773e33f459"
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.556444 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wjqzj"
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.593703 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wjqzj"]
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.594797 4843 scope.go:117] "RemoveContainer" containerID="17bb5f418865c935f5917bbb86d9732bbb14b634fdadc865416dc1cab6a0bd08"
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.603425 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wjqzj"]
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.619029 4843 scope.go:117] "RemoveContainer" containerID="74b55781e2f11de157f1a9906e69ff33f0582fc52d8ebf9b3ae49d3749e3c9ee"
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.768131 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-vzr9g_61be3a4c-f87e-4be8-a440-4359a84464c9/manager/0.log"
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.791453 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-rt7zn_548351fd-949d-4cb3-be2d-b8a5f07d1c49/manager/0.log"
Mar 18 13:40:12 crc kubenswrapper[4843]: I0318 13:40:12.994718 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75dc0699-c72c-4f38-bdf5-594eed550a2b" path="/var/lib/kubelet/pods/75dc0699-c72c-4f38-bdf5-594eed550a2b/volumes"
Mar 18 13:40:13 crc kubenswrapper[4843]: I0318 13:40:13.039161 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-6lz54_503b0a02-e1c5-4375-94f7-466bb80e97c8/manager/0.log"
Mar 18 13:40:13 crc kubenswrapper[4843]: I0318 13:40:13.197778 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-68ccf9867-wdzw9_95fc7104-316c-4699-98b1-3ff394a0c609/operator/0.log"
Mar 18 13:40:13 crc kubenswrapper[4843]: I0318 13:40:13.324782 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-rtqgz_05c2657b-cf87-40d7-9078-432f19509383/registry-server/0.log"
Mar 18 13:40:13 crc kubenswrapper[4843]: I0318 13:40:13.580015 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-j7b9v_61ad1951-c777-4392-be9f-e968600ccfc2/manager/0.log"
Mar 18 13:40:13 crc kubenswrapper[4843]: I0318 13:40:13.586344 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-hfp4q_510ea8f8-3bdc-4db8-b4bf-accdfc6fe8e1/manager/0.log"
Mar 18 13:40:13 crc kubenswrapper[4843]: I0318 13:40:13.763866 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-5zfhk_c7dae85f-590a-445f-ad19-dcd447f77980/operator/0.log"
Mar 18 13:40:13 crc kubenswrapper[4843]: I0318 13:40:13.849630 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-r8fkk_c4e6b3ff-9964-4ef4-b0bf-2e1573be138c/manager/0.log"
Mar 18 13:40:14 crc kubenswrapper[4843]: I0318 13:40:14.114852 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-cfxn4_6372bc8a-9914-4ecf-bca4-f6998231babb/manager/0.log"
Mar 18 13:40:14 crc kubenswrapper[4843]: I0318 13:40:14.128634 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-wg5ch_e210a78c-667d-46a0-a0bc-7c9371ecf962/manager/0.log"
Mar 18 13:40:14 crc kubenswrapper[4843]: I0318 13:40:14.286595 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-w4wgm_c01f9d62-3fbf-4eae-a515-6a8e8c41897c/manager/0.log"
Mar 18 13:40:14 crc kubenswrapper[4843]: I0318 13:40:14.771209 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-76c5949666-hxxp9_f5e127c8-a5cf-4a2e-9be5-87dfc4a0aa79/manager/0.log"
Mar 18 13:40:34 crc kubenswrapper[4843]: I0318 13:40:34.097245 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-gf5bj_22656451-217a-4227-becf-75f7ae30423b/control-plane-machine-set-operator/0.log"
Mar 18 13:40:34 crc kubenswrapper[4843]: I0318 13:40:34.838071 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hj6cb_7ab6a392-4347-4101-88e2-00ec7b9aecf5/kube-rbac-proxy/0.log"
Mar 18 13:40:34 crc kubenswrapper[4843]: I0318 13:40:34.870480 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hj6cb_7ab6a392-4347-4101-88e2-00ec7b9aecf5/machine-api-operator/0.log"
Mar 18 13:40:48 crc kubenswrapper[4843]: I0318 13:40:48.047642 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-jwmgj_8047c536-ad04-4693-9e68-a75e89953f61/cert-manager-controller/0.log"
Mar 18 13:40:48 crc kubenswrapper[4843]: I0318 13:40:48.226499 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-6fgmp_7a983b8c-1c33-466e-9a33-f25f8b160273/cert-manager-cainjector/0.log"
Mar 18 13:40:48 crc kubenswrapper[4843]: I0318 13:40:48.302119 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-v5tld_e0a593cb-58ea-462d-994b-eafeebd4a2e1/cert-manager-webhook/0.log"
Mar 18 13:41:02 crc kubenswrapper[4843]: I0318 13:41:02.610398 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-s9x62_ba685259-be2e-473b-b07b-76fd1fba4433/nmstate-console-plugin/0.log"
Mar 18 13:41:02 crc kubenswrapper[4843]: I0318 13:41:02.785692 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-llzl5_5a195771-131e-4a55-a719-ccde43845b3c/nmstate-handler/0.log"
Mar 18 13:41:02 crc kubenswrapper[4843]: I0318 13:41:02.977463 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-slwpg_c219dce4-3b4d-4b0f-b8bc-52313c223e06/kube-rbac-proxy/0.log"
Mar 18 13:41:03 crc kubenswrapper[4843]: I0318 13:41:03.044851 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-slwpg_c219dce4-3b4d-4b0f-b8bc-52313c223e06/nmstate-metrics/0.log"
Mar 18 13:41:03 crc kubenswrapper[4843]: I0318 13:41:03.084267 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-q8kxc_943fabaf-11af-46ca-8dd2-3bfd7081bdec/nmstate-operator/0.log"
Mar 18 13:41:03 crc kubenswrapper[4843]: I0318 13:41:03.290307 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-2lmd4_19d2d46b-445f-463b-a1b7-e33d82880a8f/nmstate-webhook/0.log"
Mar 18 13:41:30 crc kubenswrapper[4843]: I0318 13:41:30.802458 4843 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="9f1dd599-3b73-4b6c-8f80-0fbb1ce13520" containerName="galera" probeResult="failure" output="command timed out"
Mar 18 13:41:30 crc kubenswrapper[4843]: I0318 13:41:30.803771 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="9f1dd599-3b73-4b6c-8f80-0fbb1ce13520" containerName="galera" probeResult="failure" output="command timed out"
Mar 18 13:41:31 crc kubenswrapper[4843]: I0318 13:41:31.394783 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-54rq8_7e058656-5aa9-4672-97fd-fa0c64ff46b5/kube-rbac-proxy/0.log"
Mar 18 13:41:31 crc kubenswrapper[4843]: I0318 13:41:31.470382 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-54rq8_7e058656-5aa9-4672-97fd-fa0c64ff46b5/controller/0.log"
Mar 18 13:41:31 crc kubenswrapper[4843]: I0318 13:41:31.512572 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-bn9hp_f29dedb4-cde5-4f7a-9710-9dd1de387482/frr-k8s-webhook-server/0.log"
Mar 18 13:41:31 crc kubenswrapper[4843]: I0318 13:41:31.674859 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x7lj9_3dc20e76-907f-4852-91af-a114f966c97b/cp-frr-files/0.log"
Mar 18 13:41:31 crc kubenswrapper[4843]: I0318 13:41:31.816745 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x7lj9_3dc20e76-907f-4852-91af-a114f966c97b/cp-frr-files/0.log"
Mar 18 13:41:31 crc kubenswrapper[4843]: I0318 13:41:31.825197 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x7lj9_3dc20e76-907f-4852-91af-a114f966c97b/cp-reloader/0.log"
Mar 18 13:41:31 crc kubenswrapper[4843]: I0318 13:41:31.870617 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x7lj9_3dc20e76-907f-4852-91af-a114f966c97b/cp-metrics/0.log"
Mar 18 13:41:31 crc kubenswrapper[4843]: I0318 13:41:31.914111 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x7lj9_3dc20e76-907f-4852-91af-a114f966c97b/cp-reloader/0.log"
Mar 18 13:41:32 crc kubenswrapper[4843]: I0318 13:41:32.099092 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x7lj9_3dc20e76-907f-4852-91af-a114f966c97b/cp-frr-files/0.log"
Mar 18 13:41:32 crc kubenswrapper[4843]: I0318 13:41:32.125215 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x7lj9_3dc20e76-907f-4852-91af-a114f966c97b/cp-metrics/0.log"
Mar 18 13:41:32 crc kubenswrapper[4843]: I0318 13:41:32.173388 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x7lj9_3dc20e76-907f-4852-91af-a114f966c97b/cp-reloader/0.log"
Mar 18 13:41:32 crc kubenswrapper[4843]: I0318 13:41:32.177368 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x7lj9_3dc20e76-907f-4852-91af-a114f966c97b/cp-metrics/0.log"
Mar 18 13:41:32 crc kubenswrapper[4843]: I0318 13:41:32.637056 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x7lj9_3dc20e76-907f-4852-91af-a114f966c97b/cp-frr-files/0.log"
Mar 18 13:41:32 crc kubenswrapper[4843]: I0318 13:41:32.700754 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x7lj9_3dc20e76-907f-4852-91af-a114f966c97b/cp-reloader/0.log"
Mar 18 13:41:32 crc kubenswrapper[4843]: I0318 13:41:32.797277 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x7lj9_3dc20e76-907f-4852-91af-a114f966c97b/cp-metrics/0.log"
Mar 18 13:41:32 crc kubenswrapper[4843]: I0318 13:41:32.801764 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x7lj9_3dc20e76-907f-4852-91af-a114f966c97b/controller/0.log"
Mar 18 13:41:32 crc kubenswrapper[4843]: I0318 13:41:32.924045 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x7lj9_3dc20e76-907f-4852-91af-a114f966c97b/frr-metrics/0.log"
Mar 18 13:41:33 crc kubenswrapper[4843]: I0318 13:41:33.030342 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x7lj9_3dc20e76-907f-4852-91af-a114f966c97b/kube-rbac-proxy-frr/0.log"
Mar 18 13:41:33 crc kubenswrapper[4843]: I0318 13:41:33.066363 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x7lj9_3dc20e76-907f-4852-91af-a114f966c97b/kube-rbac-proxy/0.log"
Mar 18 13:41:33 crc kubenswrapper[4843]: I0318 13:41:33.242221 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x7lj9_3dc20e76-907f-4852-91af-a114f966c97b/reloader/0.log"
Mar 18 13:41:33 crc kubenswrapper[4843]: I0318 13:41:33.344675 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84c485cc8b-2wjb4_eb52c05a-e620-4914-94e4-d485168aec35/manager/0.log"
Mar 18 13:41:33 crc kubenswrapper[4843]: I0318 13:41:33.534158 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6c9ffdb45c-wwbs4_ab023c74-77a1-4169-8539-24dd12546dad/webhook-server/0.log"
Mar 18 13:41:33 crc kubenswrapper[4843]: I0318 13:41:33.764178 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-k26nm_d989f9e2-655b-468f-9fbc-e65fecdd3303/kube-rbac-proxy/0.log"
Mar 18 13:41:34 crc kubenswrapper[4843]: I0318 13:41:34.314748 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-k26nm_d989f9e2-655b-468f-9fbc-e65fecdd3303/speaker/0.log"
Mar 18 13:41:34 crc kubenswrapper[4843]: I0318 13:41:34.816919 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x7lj9_3dc20e76-907f-4852-91af-a114f966c97b/frr/0.log"
Mar 18 13:41:48 crc kubenswrapper[4843]: I0318 13:41:48.365490 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b_2ae2830e-03f9-4bfe-941a-a30fa8f092cf/util/0.log"
Mar 18 13:41:48 crc kubenswrapper[4843]: I0318 13:41:48.595560 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b_2ae2830e-03f9-4bfe-941a-a30fa8f092cf/pull/0.log"
Mar 18 13:41:48 crc kubenswrapper[4843]: I0318 13:41:48.607505 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b_2ae2830e-03f9-4bfe-941a-a30fa8f092cf/util/0.log"
Mar 18 13:41:48 crc kubenswrapper[4843]: I0318 13:41:48.658570 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b_2ae2830e-03f9-4bfe-941a-a30fa8f092cf/pull/0.log"
Mar 18 13:41:48 crc kubenswrapper[4843]: I0318 13:41:48.844248 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b_2ae2830e-03f9-4bfe-941a-a30fa8f092cf/util/0.log"
Mar 18 13:41:48 crc kubenswrapper[4843]: I0318 13:41:48.852855 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b_2ae2830e-03f9-4bfe-941a-a30fa8f092cf/pull/0.log"
Mar 18 13:41:48 crc kubenswrapper[4843]: I0318 13:41:48.873786 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87464d4b_2ae2830e-03f9-4bfe-941a-a30fa8f092cf/extract/0.log"
Mar 18 13:41:49 crc kubenswrapper[4843]: I0318 13:41:49.043310 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5_4bd5afa8-337e-4384-ae10-8689ce534039/util/0.log"
Mar 18 13:41:49 crc kubenswrapper[4843]: I0318 13:41:49.181067 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5_4bd5afa8-337e-4384-ae10-8689ce534039/util/0.log"
Mar 18 13:41:49 crc kubenswrapper[4843]: I0318 13:41:49.181396 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5_4bd5afa8-337e-4384-ae10-8689ce534039/pull/0.log"
Mar 18 13:41:49 crc kubenswrapper[4843]: I0318 13:41:49.238304 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5_4bd5afa8-337e-4384-ae10-8689ce534039/pull/0.log"
Mar 18 13:41:49 crc kubenswrapper[4843]: I0318 13:41:49.393569 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5_4bd5afa8-337e-4384-ae10-8689ce534039/pull/0.log"
Mar 18 13:41:49 crc kubenswrapper[4843]: I0318 13:41:49.406397 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5_4bd5afa8-337e-4384-ae10-8689ce534039/util/0.log"
Mar 18 13:41:49 crc kubenswrapper[4843]: I0318 13:41:49.448784 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1b9fd5_4bd5afa8-337e-4384-ae10-8689ce534039/extract/0.log"
Mar 18 13:41:49 crc kubenswrapper[4843]: I0318 13:41:49.578341 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-927pm_65efad8c-4b82-445e-b2a6-25c6035182da/extract-utilities/0.log"
Mar 18 13:41:49 crc kubenswrapper[4843]: I0318 13:41:49.761329 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-927pm_65efad8c-4b82-445e-b2a6-25c6035182da/extract-utilities/0.log"
Mar 18 13:41:49 crc kubenswrapper[4843]: I0318 13:41:49.774069 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-927pm_65efad8c-4b82-445e-b2a6-25c6035182da/extract-content/0.log"
Mar 18 13:41:49 crc kubenswrapper[4843]: I0318 13:41:49.780853 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-927pm_65efad8c-4b82-445e-b2a6-25c6035182da/extract-content/0.log"
Mar 18 13:41:49 crc kubenswrapper[4843]: I0318 13:41:49.953106 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-927pm_65efad8c-4b82-445e-b2a6-25c6035182da/extract-utilities/0.log"
Mar 18 13:41:49 crc kubenswrapper[4843]: I0318 13:41:49.962505 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-927pm_65efad8c-4b82-445e-b2a6-25c6035182da/extract-content/0.log"
Mar 18 13:41:50 crc kubenswrapper[4843]: I0318 13:41:50.226641 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xrxxz_bded68f0-358f-4215-a313-6f28ef9b506c/extract-utilities/0.log"
Mar 18 13:41:50 crc kubenswrapper[4843]: I0318 13:41:50.446206 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xrxxz_bded68f0-358f-4215-a313-6f28ef9b506c/extract-content/0.log"
Mar 18 13:41:50 crc kubenswrapper[4843]: I0318 13:41:50.452316 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xrxxz_bded68f0-358f-4215-a313-6f28ef9b506c/extract-utilities/0.log"
Mar 18 13:41:50 crc kubenswrapper[4843]: I0318 13:41:50.532667 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xrxxz_bded68f0-358f-4215-a313-6f28ef9b506c/extract-content/0.log"
Mar 18 13:41:50 crc kubenswrapper[4843]: I0318 13:41:50.569963 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-927pm_65efad8c-4b82-445e-b2a6-25c6035182da/registry-server/0.log"
Mar 18 13:41:50 crc kubenswrapper[4843]: I0318 13:41:50.710959 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xrxxz_bded68f0-358f-4215-a313-6f28ef9b506c/extract-content/0.log"
Mar 18 13:41:50 crc kubenswrapper[4843]: I0318 13:41:50.775032 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xrxxz_bded68f0-358f-4215-a313-6f28ef9b506c/extract-utilities/0.log"
Mar 18 13:41:51 crc kubenswrapper[4843]: I0318 13:41:51.007417 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zmr7t_514ca135-154e-44f9-b4e6-de6b600085b7/extract-utilities/0.log"
Mar 18 13:41:51 crc kubenswrapper[4843]: I0318 13:41:51.014751 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-x8vmv_05781490-ae8b-484f-9c0f-93409e2d5850/marketplace-operator/0.log"
Mar 18 13:41:51 crc kubenswrapper[4843]: I0318 13:41:51.275901 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zmr7t_514ca135-154e-44f9-b4e6-de6b600085b7/extract-utilities/0.log"
Mar 18 13:41:51 crc kubenswrapper[4843]: I0318 13:41:51.280563 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zmr7t_514ca135-154e-44f9-b4e6-de6b600085b7/extract-content/0.log"
Mar 18 13:41:51 crc kubenswrapper[4843]: I0318 13:41:51.317805 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zmr7t_514ca135-154e-44f9-b4e6-de6b600085b7/extract-content/0.log"
Mar 18 13:41:51 crc kubenswrapper[4843]: I0318 13:41:51.530344 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xrxxz_bded68f0-358f-4215-a313-6f28ef9b506c/registry-server/0.log"
Mar 18 13:41:51 crc kubenswrapper[4843]: I0318 13:41:51.757754 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zmr7t_514ca135-154e-44f9-b4e6-de6b600085b7/extract-utilities/0.log"
Mar 18 13:41:51 crc kubenswrapper[4843]: I0318 13:41:51.834139 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zmr7t_514ca135-154e-44f9-b4e6-de6b600085b7/extract-content/0.log"
Mar 18 13:41:52 crc kubenswrapper[4843]: I0318 13:41:52.026265 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zmr7t_514ca135-154e-44f9-b4e6-de6b600085b7/registry-server/0.log"
Mar 18 13:41:52 crc kubenswrapper[4843]: I0318 13:41:52.042281 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-l4ncn_00cc77a8-dc0e-44dc-905e-8ed09c5646a3/extract-utilities/0.log"
Mar 18 13:41:52 crc kubenswrapper[4843]: I0318 13:41:52.144633 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-l4ncn_00cc77a8-dc0e-44dc-905e-8ed09c5646a3/extract-utilities/0.log"
Mar 18 13:41:52 crc kubenswrapper[4843]: I0318 13:41:52.203966 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-l4ncn_00cc77a8-dc0e-44dc-905e-8ed09c5646a3/extract-content/0.log"
Mar 18 13:41:52 crc kubenswrapper[4843]: I0318 13:41:52.230570 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-l4ncn_00cc77a8-dc0e-44dc-905e-8ed09c5646a3/extract-content/0.log"
Mar 18 13:41:52 crc kubenswrapper[4843]: I0318 13:41:52.408249 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-l4ncn_00cc77a8-dc0e-44dc-905e-8ed09c5646a3/extract-content/0.log"
Mar 18 13:41:52 crc kubenswrapper[4843]: I0318 13:41:52.476974 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-l4ncn_00cc77a8-dc0e-44dc-905e-8ed09c5646a3/extract-utilities/0.log"
Mar 18 13:41:53 crc kubenswrapper[4843]: I0318 13:41:53.145147 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-l4ncn_00cc77a8-dc0e-44dc-905e-8ed09c5646a3/registry-server/0.log"
Mar 18 13:42:00 crc kubenswrapper[4843]: I0318 13:42:00.154622 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564022-tclkp"]
Mar 18 13:42:00 crc kubenswrapper[4843]: E0318 13:42:00.155594 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e10d34-3d48-4bb0-af2f-802e0e3fa75e" containerName="oc"
Mar 18 13:42:00 crc kubenswrapper[4843]: I0318 13:42:00.155609 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e10d34-3d48-4bb0-af2f-802e0e3fa75e" containerName="oc"
Mar 18 13:42:00 crc kubenswrapper[4843]: E0318 13:42:00.155698 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75dc0699-c72c-4f38-bdf5-594eed550a2b" containerName="extract-content"
Mar 18 13:42:00 crc kubenswrapper[4843]: I0318 13:42:00.155709 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="75dc0699-c72c-4f38-bdf5-594eed550a2b" containerName="extract-content"
Mar 18 13:42:00 crc kubenswrapper[4843]: E0318 13:42:00.155726 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75dc0699-c72c-4f38-bdf5-594eed550a2b" containerName="registry-server"
Mar 18 13:42:00 crc kubenswrapper[4843]: I0318 13:42:00.155732 4843 state_mem.go:107] "Deleted CPUSet assignment"
podUID="75dc0699-c72c-4f38-bdf5-594eed550a2b" containerName="registry-server" Mar 18 13:42:00 crc kubenswrapper[4843]: E0318 13:42:00.155751 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75dc0699-c72c-4f38-bdf5-594eed550a2b" containerName="extract-utilities" Mar 18 13:42:00 crc kubenswrapper[4843]: I0318 13:42:00.155758 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="75dc0699-c72c-4f38-bdf5-594eed550a2b" containerName="extract-utilities" Mar 18 13:42:00 crc kubenswrapper[4843]: I0318 13:42:00.155954 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e10d34-3d48-4bb0-af2f-802e0e3fa75e" containerName="oc" Mar 18 13:42:00 crc kubenswrapper[4843]: I0318 13:42:00.155969 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="75dc0699-c72c-4f38-bdf5-594eed550a2b" containerName="registry-server" Mar 18 13:42:00 crc kubenswrapper[4843]: I0318 13:42:00.156750 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564022-tclkp" Mar 18 13:42:00 crc kubenswrapper[4843]: I0318 13:42:00.167075 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564022-tclkp"] Mar 18 13:42:00 crc kubenswrapper[4843]: I0318 13:42:00.167739 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 13:42:00 crc kubenswrapper[4843]: I0318 13:42:00.168114 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:42:00 crc kubenswrapper[4843]: I0318 13:42:00.168433 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:42:00 crc kubenswrapper[4843]: I0318 13:42:00.307430 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh2p9\" (UniqueName: 
\"kubernetes.io/projected/8953032d-0dee-49d0-803c-c43b5e93542a-kube-api-access-lh2p9\") pod \"auto-csr-approver-29564022-tclkp\" (UID: \"8953032d-0dee-49d0-803c-c43b5e93542a\") " pod="openshift-infra/auto-csr-approver-29564022-tclkp" Mar 18 13:42:00 crc kubenswrapper[4843]: I0318 13:42:00.409749 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh2p9\" (UniqueName: \"kubernetes.io/projected/8953032d-0dee-49d0-803c-c43b5e93542a-kube-api-access-lh2p9\") pod \"auto-csr-approver-29564022-tclkp\" (UID: \"8953032d-0dee-49d0-803c-c43b5e93542a\") " pod="openshift-infra/auto-csr-approver-29564022-tclkp" Mar 18 13:42:00 crc kubenswrapper[4843]: I0318 13:42:00.437267 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh2p9\" (UniqueName: \"kubernetes.io/projected/8953032d-0dee-49d0-803c-c43b5e93542a-kube-api-access-lh2p9\") pod \"auto-csr-approver-29564022-tclkp\" (UID: \"8953032d-0dee-49d0-803c-c43b5e93542a\") " pod="openshift-infra/auto-csr-approver-29564022-tclkp" Mar 18 13:42:00 crc kubenswrapper[4843]: I0318 13:42:00.505207 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564022-tclkp" Mar 18 13:42:00 crc kubenswrapper[4843]: I0318 13:42:00.964667 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564022-tclkp"] Mar 18 13:42:01 crc kubenswrapper[4843]: I0318 13:42:01.638630 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564022-tclkp" event={"ID":"8953032d-0dee-49d0-803c-c43b5e93542a","Type":"ContainerStarted","Data":"628c018e60c16ab7ae4131d7f32ff24f69d723b26da857cb3f815b67e0d9737a"} Mar 18 13:42:02 crc kubenswrapper[4843]: I0318 13:42:02.650042 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564022-tclkp" event={"ID":"8953032d-0dee-49d0-803c-c43b5e93542a","Type":"ContainerStarted","Data":"ab64d1cc939eb1bab9f7e20ec9ac5c6d68808dd396469aebcf57d9d8746553ed"} Mar 18 13:42:03 crc kubenswrapper[4843]: I0318 13:42:03.659495 4843 generic.go:334] "Generic (PLEG): container finished" podID="8953032d-0dee-49d0-803c-c43b5e93542a" containerID="ab64d1cc939eb1bab9f7e20ec9ac5c6d68808dd396469aebcf57d9d8746553ed" exitCode=0 Mar 18 13:42:03 crc kubenswrapper[4843]: I0318 13:42:03.659542 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564022-tclkp" event={"ID":"8953032d-0dee-49d0-803c-c43b5e93542a","Type":"ContainerDied","Data":"ab64d1cc939eb1bab9f7e20ec9ac5c6d68808dd396469aebcf57d9d8746553ed"} Mar 18 13:42:05 crc kubenswrapper[4843]: I0318 13:42:05.047160 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564022-tclkp" Mar 18 13:42:05 crc kubenswrapper[4843]: I0318 13:42:05.219369 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh2p9\" (UniqueName: \"kubernetes.io/projected/8953032d-0dee-49d0-803c-c43b5e93542a-kube-api-access-lh2p9\") pod \"8953032d-0dee-49d0-803c-c43b5e93542a\" (UID: \"8953032d-0dee-49d0-803c-c43b5e93542a\") " Mar 18 13:42:05 crc kubenswrapper[4843]: I0318 13:42:05.226160 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8953032d-0dee-49d0-803c-c43b5e93542a-kube-api-access-lh2p9" (OuterVolumeSpecName: "kube-api-access-lh2p9") pod "8953032d-0dee-49d0-803c-c43b5e93542a" (UID: "8953032d-0dee-49d0-803c-c43b5e93542a"). InnerVolumeSpecName "kube-api-access-lh2p9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:42:05 crc kubenswrapper[4843]: I0318 13:42:05.321898 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh2p9\" (UniqueName: \"kubernetes.io/projected/8953032d-0dee-49d0-803c-c43b5e93542a-kube-api-access-lh2p9\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:05 crc kubenswrapper[4843]: I0318 13:42:05.690988 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564022-tclkp" event={"ID":"8953032d-0dee-49d0-803c-c43b5e93542a","Type":"ContainerDied","Data":"628c018e60c16ab7ae4131d7f32ff24f69d723b26da857cb3f815b67e0d9737a"} Mar 18 13:42:05 crc kubenswrapper[4843]: I0318 13:42:05.691363 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="628c018e60c16ab7ae4131d7f32ff24f69d723b26da857cb3f815b67e0d9737a" Mar 18 13:42:05 crc kubenswrapper[4843]: I0318 13:42:05.691206 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564022-tclkp" Mar 18 13:42:05 crc kubenswrapper[4843]: I0318 13:42:05.745167 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564016-qmxll"] Mar 18 13:42:05 crc kubenswrapper[4843]: I0318 13:42:05.755756 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564016-qmxll"] Mar 18 13:42:06 crc kubenswrapper[4843]: I0318 13:42:06.996304 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="975fe054-dd8f-4634-b17d-d3e9c218c439" path="/var/lib/kubelet/pods/975fe054-dd8f-4634-b17d-d3e9c218c439/volumes" Mar 18 13:42:11 crc kubenswrapper[4843]: I0318 13:42:11.066222 4843 scope.go:117] "RemoveContainer" containerID="72175606280b423fde784ff48f482f8d6617f6edc7f3597598fc1eb57efdcebb" Mar 18 13:42:20 crc kubenswrapper[4843]: I0318 13:42:20.036213 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:42:20 crc kubenswrapper[4843]: I0318 13:42:20.037925 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:42:50 crc kubenswrapper[4843]: I0318 13:42:50.034424 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:42:50 crc kubenswrapper[4843]: 
I0318 13:42:50.035089 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:43:20 crc kubenswrapper[4843]: I0318 13:43:20.034790 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:43:20 crc kubenswrapper[4843]: I0318 13:43:20.035389 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:43:20 crc kubenswrapper[4843]: I0318 13:43:20.035449 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 13:43:20 crc kubenswrapper[4843]: I0318 13:43:20.036317 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"23bfc8498d903deeb680a9817a6086e184f5afd90d3ed161f929039e9a0b59b1"} pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:43:20 crc kubenswrapper[4843]: I0318 13:43:20.036378 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" 
containerName="machine-config-daemon" containerID="cri-o://23bfc8498d903deeb680a9817a6086e184f5afd90d3ed161f929039e9a0b59b1" gracePeriod=600 Mar 18 13:43:20 crc kubenswrapper[4843]: I0318 13:43:20.468408 4843 generic.go:334] "Generic (PLEG): container finished" podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="23bfc8498d903deeb680a9817a6086e184f5afd90d3ed161f929039e9a0b59b1" exitCode=0 Mar 18 13:43:20 crc kubenswrapper[4843]: I0318 13:43:20.468511 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"23bfc8498d903deeb680a9817a6086e184f5afd90d3ed161f929039e9a0b59b1"} Mar 18 13:43:20 crc kubenswrapper[4843]: I0318 13:43:20.468902 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerStarted","Data":"16f53861bd16b48f621a2985e668d874207929653724a047380ef48b8700ab7a"} Mar 18 13:43:20 crc kubenswrapper[4843]: I0318 13:43:20.468940 4843 scope.go:117] "RemoveContainer" containerID="ef4632d8e4c1731f7c4280f5df759b5d499e380f0027dd1db6a0d14e3b2c227f" Mar 18 13:43:47 crc kubenswrapper[4843]: I0318 13:43:47.815439 4843 generic.go:334] "Generic (PLEG): container finished" podID="7ae7a0fc-1a94-4175-a9f6-403f501cc985" containerID="b5bac0dbccbf6dcde11d1c22059fedec1192d4b528d4e38832def92cb621d905" exitCode=0 Mar 18 13:43:47 crc kubenswrapper[4843]: I0318 13:43:47.815559 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mz66c/must-gather-5ctdc" event={"ID":"7ae7a0fc-1a94-4175-a9f6-403f501cc985","Type":"ContainerDied","Data":"b5bac0dbccbf6dcde11d1c22059fedec1192d4b528d4e38832def92cb621d905"} Mar 18 13:43:47 crc kubenswrapper[4843]: I0318 13:43:47.817149 4843 scope.go:117] "RemoveContainer" 
containerID="b5bac0dbccbf6dcde11d1c22059fedec1192d4b528d4e38832def92cb621d905" Mar 18 13:43:48 crc kubenswrapper[4843]: I0318 13:43:48.031405 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mz66c_must-gather-5ctdc_7ae7a0fc-1a94-4175-a9f6-403f501cc985/gather/0.log" Mar 18 13:43:55 crc kubenswrapper[4843]: I0318 13:43:55.916782 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mz66c/must-gather-5ctdc"] Mar 18 13:43:55 crc kubenswrapper[4843]: I0318 13:43:55.917457 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-mz66c/must-gather-5ctdc" podUID="7ae7a0fc-1a94-4175-a9f6-403f501cc985" containerName="copy" containerID="cri-o://794d88c7798f240dea4919f5f5ef306db3ae62cb7a42005ef87a23034ab79f55" gracePeriod=2 Mar 18 13:43:55 crc kubenswrapper[4843]: I0318 13:43:55.926293 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mz66c/must-gather-5ctdc"] Mar 18 13:43:56 crc kubenswrapper[4843]: I0318 13:43:56.432501 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mz66c_must-gather-5ctdc_7ae7a0fc-1a94-4175-a9f6-403f501cc985/copy/0.log" Mar 18 13:43:56 crc kubenswrapper[4843]: I0318 13:43:56.433361 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mz66c/must-gather-5ctdc" Mar 18 13:43:56 crc kubenswrapper[4843]: I0318 13:43:56.586572 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7ae7a0fc-1a94-4175-a9f6-403f501cc985-must-gather-output\") pod \"7ae7a0fc-1a94-4175-a9f6-403f501cc985\" (UID: \"7ae7a0fc-1a94-4175-a9f6-403f501cc985\") " Mar 18 13:43:56 crc kubenswrapper[4843]: I0318 13:43:56.586727 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgm4c\" (UniqueName: \"kubernetes.io/projected/7ae7a0fc-1a94-4175-a9f6-403f501cc985-kube-api-access-pgm4c\") pod \"7ae7a0fc-1a94-4175-a9f6-403f501cc985\" (UID: \"7ae7a0fc-1a94-4175-a9f6-403f501cc985\") " Mar 18 13:43:56 crc kubenswrapper[4843]: I0318 13:43:56.593065 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae7a0fc-1a94-4175-a9f6-403f501cc985-kube-api-access-pgm4c" (OuterVolumeSpecName: "kube-api-access-pgm4c") pod "7ae7a0fc-1a94-4175-a9f6-403f501cc985" (UID: "7ae7a0fc-1a94-4175-a9f6-403f501cc985"). InnerVolumeSpecName "kube-api-access-pgm4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:43:56 crc kubenswrapper[4843]: I0318 13:43:56.689542 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgm4c\" (UniqueName: \"kubernetes.io/projected/7ae7a0fc-1a94-4175-a9f6-403f501cc985-kube-api-access-pgm4c\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:56 crc kubenswrapper[4843]: I0318 13:43:56.744996 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ae7a0fc-1a94-4175-a9f6-403f501cc985-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7ae7a0fc-1a94-4175-a9f6-403f501cc985" (UID: "7ae7a0fc-1a94-4175-a9f6-403f501cc985"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:43:56 crc kubenswrapper[4843]: I0318 13:43:56.791371 4843 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7ae7a0fc-1a94-4175-a9f6-403f501cc985-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:56 crc kubenswrapper[4843]: I0318 13:43:56.928093 4843 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mz66c_must-gather-5ctdc_7ae7a0fc-1a94-4175-a9f6-403f501cc985/copy/0.log" Mar 18 13:43:56 crc kubenswrapper[4843]: I0318 13:43:56.928603 4843 generic.go:334] "Generic (PLEG): container finished" podID="7ae7a0fc-1a94-4175-a9f6-403f501cc985" containerID="794d88c7798f240dea4919f5f5ef306db3ae62cb7a42005ef87a23034ab79f55" exitCode=143 Mar 18 13:43:56 crc kubenswrapper[4843]: I0318 13:43:56.928678 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mz66c/must-gather-5ctdc" Mar 18 13:43:56 crc kubenswrapper[4843]: I0318 13:43:56.928693 4843 scope.go:117] "RemoveContainer" containerID="794d88c7798f240dea4919f5f5ef306db3ae62cb7a42005ef87a23034ab79f55" Mar 18 13:43:56 crc kubenswrapper[4843]: I0318 13:43:56.960302 4843 scope.go:117] "RemoveContainer" containerID="b5bac0dbccbf6dcde11d1c22059fedec1192d4b528d4e38832def92cb621d905" Mar 18 13:43:57 crc kubenswrapper[4843]: I0318 13:43:57.001852 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ae7a0fc-1a94-4175-a9f6-403f501cc985" path="/var/lib/kubelet/pods/7ae7a0fc-1a94-4175-a9f6-403f501cc985/volumes" Mar 18 13:43:57 crc kubenswrapper[4843]: I0318 13:43:57.383079 4843 scope.go:117] "RemoveContainer" containerID="794d88c7798f240dea4919f5f5ef306db3ae62cb7a42005ef87a23034ab79f55" Mar 18 13:43:57 crc kubenswrapper[4843]: E0318 13:43:57.384450 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"794d88c7798f240dea4919f5f5ef306db3ae62cb7a42005ef87a23034ab79f55\": container with ID starting with 794d88c7798f240dea4919f5f5ef306db3ae62cb7a42005ef87a23034ab79f55 not found: ID does not exist" containerID="794d88c7798f240dea4919f5f5ef306db3ae62cb7a42005ef87a23034ab79f55" Mar 18 13:43:57 crc kubenswrapper[4843]: I0318 13:43:57.384602 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"794d88c7798f240dea4919f5f5ef306db3ae62cb7a42005ef87a23034ab79f55"} err="failed to get container status \"794d88c7798f240dea4919f5f5ef306db3ae62cb7a42005ef87a23034ab79f55\": rpc error: code = NotFound desc = could not find container \"794d88c7798f240dea4919f5f5ef306db3ae62cb7a42005ef87a23034ab79f55\": container with ID starting with 794d88c7798f240dea4919f5f5ef306db3ae62cb7a42005ef87a23034ab79f55 not found: ID does not exist" Mar 18 13:43:57 crc kubenswrapper[4843]: I0318 13:43:57.384736 4843 scope.go:117] "RemoveContainer" containerID="b5bac0dbccbf6dcde11d1c22059fedec1192d4b528d4e38832def92cb621d905" Mar 18 13:43:57 crc kubenswrapper[4843]: E0318 13:43:57.386236 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5bac0dbccbf6dcde11d1c22059fedec1192d4b528d4e38832def92cb621d905\": container with ID starting with b5bac0dbccbf6dcde11d1c22059fedec1192d4b528d4e38832def92cb621d905 not found: ID does not exist" containerID="b5bac0dbccbf6dcde11d1c22059fedec1192d4b528d4e38832def92cb621d905" Mar 18 13:43:57 crc kubenswrapper[4843]: I0318 13:43:57.386363 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5bac0dbccbf6dcde11d1c22059fedec1192d4b528d4e38832def92cb621d905"} err="failed to get container status \"b5bac0dbccbf6dcde11d1c22059fedec1192d4b528d4e38832def92cb621d905\": rpc error: code = NotFound desc = could not find container \"b5bac0dbccbf6dcde11d1c22059fedec1192d4b528d4e38832def92cb621d905\": container with ID 
starting with b5bac0dbccbf6dcde11d1c22059fedec1192d4b528d4e38832def92cb621d905 not found: ID does not exist" Mar 18 13:44:00 crc kubenswrapper[4843]: I0318 13:44:00.181314 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564024-jsdld"] Mar 18 13:44:00 crc kubenswrapper[4843]: E0318 13:44:00.182433 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae7a0fc-1a94-4175-a9f6-403f501cc985" containerName="copy" Mar 18 13:44:00 crc kubenswrapper[4843]: I0318 13:44:00.182450 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae7a0fc-1a94-4175-a9f6-403f501cc985" containerName="copy" Mar 18 13:44:00 crc kubenswrapper[4843]: E0318 13:44:00.182486 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae7a0fc-1a94-4175-a9f6-403f501cc985" containerName="gather" Mar 18 13:44:00 crc kubenswrapper[4843]: I0318 13:44:00.182495 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae7a0fc-1a94-4175-a9f6-403f501cc985" containerName="gather" Mar 18 13:44:00 crc kubenswrapper[4843]: E0318 13:44:00.182508 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8953032d-0dee-49d0-803c-c43b5e93542a" containerName="oc" Mar 18 13:44:00 crc kubenswrapper[4843]: I0318 13:44:00.182519 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="8953032d-0dee-49d0-803c-c43b5e93542a" containerName="oc" Mar 18 13:44:00 crc kubenswrapper[4843]: I0318 13:44:00.182773 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae7a0fc-1a94-4175-a9f6-403f501cc985" containerName="gather" Mar 18 13:44:00 crc kubenswrapper[4843]: I0318 13:44:00.182790 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="8953032d-0dee-49d0-803c-c43b5e93542a" containerName="oc" Mar 18 13:44:00 crc kubenswrapper[4843]: I0318 13:44:00.182800 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae7a0fc-1a94-4175-a9f6-403f501cc985" containerName="copy" Mar 18 13:44:00 
crc kubenswrapper[4843]: I0318 13:44:00.183644 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564024-jsdld" Mar 18 13:44:00 crc kubenswrapper[4843]: I0318 13:44:00.186277 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:44:00 crc kubenswrapper[4843]: I0318 13:44:00.186429 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 13:44:00 crc kubenswrapper[4843]: I0318 13:44:00.186365 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:44:00 crc kubenswrapper[4843]: I0318 13:44:00.189997 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564024-jsdld"] Mar 18 13:44:00 crc kubenswrapper[4843]: I0318 13:44:00.338991 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7psn\" (UniqueName: \"kubernetes.io/projected/fc7847b0-0c22-40ff-b0cf-3d8b05a0121b-kube-api-access-t7psn\") pod \"auto-csr-approver-29564024-jsdld\" (UID: \"fc7847b0-0c22-40ff-b0cf-3d8b05a0121b\") " pod="openshift-infra/auto-csr-approver-29564024-jsdld" Mar 18 13:44:00 crc kubenswrapper[4843]: I0318 13:44:00.441769 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7psn\" (UniqueName: \"kubernetes.io/projected/fc7847b0-0c22-40ff-b0cf-3d8b05a0121b-kube-api-access-t7psn\") pod \"auto-csr-approver-29564024-jsdld\" (UID: \"fc7847b0-0c22-40ff-b0cf-3d8b05a0121b\") " pod="openshift-infra/auto-csr-approver-29564024-jsdld" Mar 18 13:44:00 crc kubenswrapper[4843]: I0318 13:44:00.724610 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7psn\" (UniqueName: \"kubernetes.io/projected/fc7847b0-0c22-40ff-b0cf-3d8b05a0121b-kube-api-access-t7psn\") 
pod \"auto-csr-approver-29564024-jsdld\" (UID: \"fc7847b0-0c22-40ff-b0cf-3d8b05a0121b\") " pod="openshift-infra/auto-csr-approver-29564024-jsdld" Mar 18 13:44:00 crc kubenswrapper[4843]: I0318 13:44:00.803715 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564024-jsdld" Mar 18 13:44:01 crc kubenswrapper[4843]: I0318 13:44:01.257383 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564024-jsdld"] Mar 18 13:44:01 crc kubenswrapper[4843]: I0318 13:44:01.266249 4843 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:44:01 crc kubenswrapper[4843]: I0318 13:44:01.998361 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564024-jsdld" event={"ID":"fc7847b0-0c22-40ff-b0cf-3d8b05a0121b","Type":"ContainerStarted","Data":"9f261bceffcb23902433118dfcbcaf46e18f567f3d24c4461e1ee006889bc6a7"} Mar 18 13:44:04 crc kubenswrapper[4843]: I0318 13:44:04.021703 4843 generic.go:334] "Generic (PLEG): container finished" podID="fc7847b0-0c22-40ff-b0cf-3d8b05a0121b" containerID="4a893b9269fde84ddeef7d98fa7b280b1c46b195ca32e8026d7e41bd39e9bf33" exitCode=0 Mar 18 13:44:04 crc kubenswrapper[4843]: I0318 13:44:04.021811 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564024-jsdld" event={"ID":"fc7847b0-0c22-40ff-b0cf-3d8b05a0121b","Type":"ContainerDied","Data":"4a893b9269fde84ddeef7d98fa7b280b1c46b195ca32e8026d7e41bd39e9bf33"} Mar 18 13:44:05 crc kubenswrapper[4843]: I0318 13:44:05.395324 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564024-jsdld" Mar 18 13:44:05 crc kubenswrapper[4843]: I0318 13:44:05.403014 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7psn\" (UniqueName: \"kubernetes.io/projected/fc7847b0-0c22-40ff-b0cf-3d8b05a0121b-kube-api-access-t7psn\") pod \"fc7847b0-0c22-40ff-b0cf-3d8b05a0121b\" (UID: \"fc7847b0-0c22-40ff-b0cf-3d8b05a0121b\") " Mar 18 13:44:05 crc kubenswrapper[4843]: I0318 13:44:05.412437 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc7847b0-0c22-40ff-b0cf-3d8b05a0121b-kube-api-access-t7psn" (OuterVolumeSpecName: "kube-api-access-t7psn") pod "fc7847b0-0c22-40ff-b0cf-3d8b05a0121b" (UID: "fc7847b0-0c22-40ff-b0cf-3d8b05a0121b"). InnerVolumeSpecName "kube-api-access-t7psn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:44:05 crc kubenswrapper[4843]: I0318 13:44:05.507806 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7psn\" (UniqueName: \"kubernetes.io/projected/fc7847b0-0c22-40ff-b0cf-3d8b05a0121b-kube-api-access-t7psn\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:06 crc kubenswrapper[4843]: I0318 13:44:06.040558 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564024-jsdld" event={"ID":"fc7847b0-0c22-40ff-b0cf-3d8b05a0121b","Type":"ContainerDied","Data":"9f261bceffcb23902433118dfcbcaf46e18f567f3d24c4461e1ee006889bc6a7"} Mar 18 13:44:06 crc kubenswrapper[4843]: I0318 13:44:06.040609 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f261bceffcb23902433118dfcbcaf46e18f567f3d24c4461e1ee006889bc6a7" Mar 18 13:44:06 crc kubenswrapper[4843]: I0318 13:44:06.040635 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564024-jsdld" Mar 18 13:44:06 crc kubenswrapper[4843]: I0318 13:44:06.451763 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564018-gf48x"] Mar 18 13:44:06 crc kubenswrapper[4843]: I0318 13:44:06.459419 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564018-gf48x"] Mar 18 13:44:06 crc kubenswrapper[4843]: I0318 13:44:06.994957 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98150f00-d4aa-469a-9914-6e5358bd94b2" path="/var/lib/kubelet/pods/98150f00-d4aa-469a-9914-6e5358bd94b2/volumes" Mar 18 13:44:11 crc kubenswrapper[4843]: I0318 13:44:11.174799 4843 scope.go:117] "RemoveContainer" containerID="7303e6e13660702edbdc4e8599706b580c3aa8fe0d3dd5c928b75a98c65d8751" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.176751 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564025-wwg2l"] Mar 18 13:45:00 crc kubenswrapper[4843]: E0318 13:45:00.178384 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7847b0-0c22-40ff-b0cf-3d8b05a0121b" containerName="oc" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.178417 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7847b0-0c22-40ff-b0cf-3d8b05a0121b" containerName="oc" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.178870 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc7847b0-0c22-40ff-b0cf-3d8b05a0121b" containerName="oc" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.180361 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-wwg2l" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.189746 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564025-wwg2l"] Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.189948 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb2sc\" (UniqueName: \"kubernetes.io/projected/767c323f-4d09-489a-a2c4-f163fa73634a-kube-api-access-kb2sc\") pod \"collect-profiles-29564025-wwg2l\" (UID: \"767c323f-4d09-489a-a2c4-f163fa73634a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-wwg2l" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.190011 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/767c323f-4d09-489a-a2c4-f163fa73634a-config-volume\") pod \"collect-profiles-29564025-wwg2l\" (UID: \"767c323f-4d09-489a-a2c4-f163fa73634a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-wwg2l" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.190129 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/767c323f-4d09-489a-a2c4-f163fa73634a-secret-volume\") pod \"collect-profiles-29564025-wwg2l\" (UID: \"767c323f-4d09-489a-a2c4-f163fa73634a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-wwg2l" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.214698 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.216548 4843 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.291799 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb2sc\" (UniqueName: \"kubernetes.io/projected/767c323f-4d09-489a-a2c4-f163fa73634a-kube-api-access-kb2sc\") pod \"collect-profiles-29564025-wwg2l\" (UID: \"767c323f-4d09-489a-a2c4-f163fa73634a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-wwg2l" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.291916 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/767c323f-4d09-489a-a2c4-f163fa73634a-config-volume\") pod \"collect-profiles-29564025-wwg2l\" (UID: \"767c323f-4d09-489a-a2c4-f163fa73634a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-wwg2l" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.292081 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/767c323f-4d09-489a-a2c4-f163fa73634a-secret-volume\") pod \"collect-profiles-29564025-wwg2l\" (UID: \"767c323f-4d09-489a-a2c4-f163fa73634a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-wwg2l" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.293085 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/767c323f-4d09-489a-a2c4-f163fa73634a-config-volume\") pod \"collect-profiles-29564025-wwg2l\" (UID: \"767c323f-4d09-489a-a2c4-f163fa73634a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-wwg2l" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.306856 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/767c323f-4d09-489a-a2c4-f163fa73634a-secret-volume\") pod \"collect-profiles-29564025-wwg2l\" (UID: \"767c323f-4d09-489a-a2c4-f163fa73634a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-wwg2l" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.326224 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb2sc\" (UniqueName: \"kubernetes.io/projected/767c323f-4d09-489a-a2c4-f163fa73634a-kube-api-access-kb2sc\") pod \"collect-profiles-29564025-wwg2l\" (UID: \"767c323f-4d09-489a-a2c4-f163fa73634a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-wwg2l" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.353701 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lb54f"] Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.367564 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lb54f" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.378022 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb54f"] Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.394311 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4dada73-6104-4328-ac84-a067cc7fc38c-catalog-content\") pod \"redhat-marketplace-lb54f\" (UID: \"f4dada73-6104-4328-ac84-a067cc7fc38c\") " pod="openshift-marketplace/redhat-marketplace-lb54f" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.394525 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzp6l\" (UniqueName: \"kubernetes.io/projected/f4dada73-6104-4328-ac84-a067cc7fc38c-kube-api-access-dzp6l\") pod \"redhat-marketplace-lb54f\" (UID: \"f4dada73-6104-4328-ac84-a067cc7fc38c\") " 
pod="openshift-marketplace/redhat-marketplace-lb54f" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.394626 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4dada73-6104-4328-ac84-a067cc7fc38c-utilities\") pod \"redhat-marketplace-lb54f\" (UID: \"f4dada73-6104-4328-ac84-a067cc7fc38c\") " pod="openshift-marketplace/redhat-marketplace-lb54f" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.496858 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4dada73-6104-4328-ac84-a067cc7fc38c-catalog-content\") pod \"redhat-marketplace-lb54f\" (UID: \"f4dada73-6104-4328-ac84-a067cc7fc38c\") " pod="openshift-marketplace/redhat-marketplace-lb54f" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.496985 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzp6l\" (UniqueName: \"kubernetes.io/projected/f4dada73-6104-4328-ac84-a067cc7fc38c-kube-api-access-dzp6l\") pod \"redhat-marketplace-lb54f\" (UID: \"f4dada73-6104-4328-ac84-a067cc7fc38c\") " pod="openshift-marketplace/redhat-marketplace-lb54f" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.497382 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4dada73-6104-4328-ac84-a067cc7fc38c-utilities\") pod \"redhat-marketplace-lb54f\" (UID: \"f4dada73-6104-4328-ac84-a067cc7fc38c\") " pod="openshift-marketplace/redhat-marketplace-lb54f" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.497422 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4dada73-6104-4328-ac84-a067cc7fc38c-catalog-content\") pod \"redhat-marketplace-lb54f\" (UID: \"f4dada73-6104-4328-ac84-a067cc7fc38c\") " 
pod="openshift-marketplace/redhat-marketplace-lb54f" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.497676 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4dada73-6104-4328-ac84-a067cc7fc38c-utilities\") pod \"redhat-marketplace-lb54f\" (UID: \"f4dada73-6104-4328-ac84-a067cc7fc38c\") " pod="openshift-marketplace/redhat-marketplace-lb54f" Mar 18 13:45:00 crc kubenswrapper[4843]: I0318 13:45:00.537133 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-wwg2l" Mar 18 13:45:01 crc kubenswrapper[4843]: I0318 13:45:01.002562 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzp6l\" (UniqueName: \"kubernetes.io/projected/f4dada73-6104-4328-ac84-a067cc7fc38c-kube-api-access-dzp6l\") pod \"redhat-marketplace-lb54f\" (UID: \"f4dada73-6104-4328-ac84-a067cc7fc38c\") " pod="openshift-marketplace/redhat-marketplace-lb54f" Mar 18 13:45:01 crc kubenswrapper[4843]: I0318 13:45:01.014015 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lb54f" Mar 18 13:45:01 crc kubenswrapper[4843]: I0318 13:45:01.296109 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564025-wwg2l"] Mar 18 13:45:01 crc kubenswrapper[4843]: W0318 13:45:01.553305 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4dada73_6104_4328_ac84_a067cc7fc38c.slice/crio-66c90d37114f02e65a03b91acc03ab36a249fb49fcc4c5814faf2fc10a8fa8eb WatchSource:0}: Error finding container 66c90d37114f02e65a03b91acc03ab36a249fb49fcc4c5814faf2fc10a8fa8eb: Status 404 returned error can't find the container with id 66c90d37114f02e65a03b91acc03ab36a249fb49fcc4c5814faf2fc10a8fa8eb Mar 18 13:45:01 crc kubenswrapper[4843]: I0318 13:45:01.555567 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb54f"] Mar 18 13:45:02 crc kubenswrapper[4843]: I0318 13:45:02.072799 4843 generic.go:334] "Generic (PLEG): container finished" podID="f4dada73-6104-4328-ac84-a067cc7fc38c" containerID="7e2d205ec08524185bb5adc506c4fdb658567d9606df45fb17792d89513a34db" exitCode=0 Mar 18 13:45:02 crc kubenswrapper[4843]: I0318 13:45:02.072880 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb54f" event={"ID":"f4dada73-6104-4328-ac84-a067cc7fc38c","Type":"ContainerDied","Data":"7e2d205ec08524185bb5adc506c4fdb658567d9606df45fb17792d89513a34db"} Mar 18 13:45:02 crc kubenswrapper[4843]: I0318 13:45:02.073139 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb54f" event={"ID":"f4dada73-6104-4328-ac84-a067cc7fc38c","Type":"ContainerStarted","Data":"66c90d37114f02e65a03b91acc03ab36a249fb49fcc4c5814faf2fc10a8fa8eb"} Mar 18 13:45:02 crc kubenswrapper[4843]: I0318 13:45:02.075009 4843 generic.go:334] "Generic (PLEG): container 
finished" podID="767c323f-4d09-489a-a2c4-f163fa73634a" containerID="66b744485e740dfc0ab39fab6d4288b7a64a513a43419e217f1896d2310a8547" exitCode=0 Mar 18 13:45:02 crc kubenswrapper[4843]: I0318 13:45:02.075047 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-wwg2l" event={"ID":"767c323f-4d09-489a-a2c4-f163fa73634a","Type":"ContainerDied","Data":"66b744485e740dfc0ab39fab6d4288b7a64a513a43419e217f1896d2310a8547"} Mar 18 13:45:02 crc kubenswrapper[4843]: I0318 13:45:02.075068 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-wwg2l" event={"ID":"767c323f-4d09-489a-a2c4-f163fa73634a","Type":"ContainerStarted","Data":"24ffebd6594e185288ceb251344458ab614c7dc32ca5da9238b7c12984f4b708"} Mar 18 13:45:03 crc kubenswrapper[4843]: I0318 13:45:03.385896 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-wwg2l" Mar 18 13:45:03 crc kubenswrapper[4843]: I0318 13:45:03.555387 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb2sc\" (UniqueName: \"kubernetes.io/projected/767c323f-4d09-489a-a2c4-f163fa73634a-kube-api-access-kb2sc\") pod \"767c323f-4d09-489a-a2c4-f163fa73634a\" (UID: \"767c323f-4d09-489a-a2c4-f163fa73634a\") " Mar 18 13:45:03 crc kubenswrapper[4843]: I0318 13:45:03.555621 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/767c323f-4d09-489a-a2c4-f163fa73634a-config-volume\") pod \"767c323f-4d09-489a-a2c4-f163fa73634a\" (UID: \"767c323f-4d09-489a-a2c4-f163fa73634a\") " Mar 18 13:45:03 crc kubenswrapper[4843]: I0318 13:45:03.555708 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/767c323f-4d09-489a-a2c4-f163fa73634a-secret-volume\") pod \"767c323f-4d09-489a-a2c4-f163fa73634a\" (UID: \"767c323f-4d09-489a-a2c4-f163fa73634a\") " Mar 18 13:45:03 crc kubenswrapper[4843]: I0318 13:45:03.556822 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/767c323f-4d09-489a-a2c4-f163fa73634a-config-volume" (OuterVolumeSpecName: "config-volume") pod "767c323f-4d09-489a-a2c4-f163fa73634a" (UID: "767c323f-4d09-489a-a2c4-f163fa73634a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:45:03 crc kubenswrapper[4843]: I0318 13:45:03.562671 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/767c323f-4d09-489a-a2c4-f163fa73634a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "767c323f-4d09-489a-a2c4-f163fa73634a" (UID: "767c323f-4d09-489a-a2c4-f163fa73634a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:03 crc kubenswrapper[4843]: I0318 13:45:03.567388 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/767c323f-4d09-489a-a2c4-f163fa73634a-kube-api-access-kb2sc" (OuterVolumeSpecName: "kube-api-access-kb2sc") pod "767c323f-4d09-489a-a2c4-f163fa73634a" (UID: "767c323f-4d09-489a-a2c4-f163fa73634a"). InnerVolumeSpecName "kube-api-access-kb2sc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:03 crc kubenswrapper[4843]: I0318 13:45:03.658549 4843 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/767c323f-4d09-489a-a2c4-f163fa73634a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:03 crc kubenswrapper[4843]: I0318 13:45:03.659331 4843 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/767c323f-4d09-489a-a2c4-f163fa73634a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:03 crc kubenswrapper[4843]: I0318 13:45:03.659484 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb2sc\" (UniqueName: \"kubernetes.io/projected/767c323f-4d09-489a-a2c4-f163fa73634a-kube-api-access-kb2sc\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:04 crc kubenswrapper[4843]: I0318 13:45:04.098280 4843 generic.go:334] "Generic (PLEG): container finished" podID="f4dada73-6104-4328-ac84-a067cc7fc38c" containerID="80fa426fe5cc0a2e2fd38da1cd3081ff8e1e78fb33d5eee6c62c63c7b797d320" exitCode=0 Mar 18 13:45:04 crc kubenswrapper[4843]: I0318 13:45:04.098349 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb54f" event={"ID":"f4dada73-6104-4328-ac84-a067cc7fc38c","Type":"ContainerDied","Data":"80fa426fe5cc0a2e2fd38da1cd3081ff8e1e78fb33d5eee6c62c63c7b797d320"} Mar 18 13:45:04 crc kubenswrapper[4843]: I0318 13:45:04.101112 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-wwg2l" event={"ID":"767c323f-4d09-489a-a2c4-f163fa73634a","Type":"ContainerDied","Data":"24ffebd6594e185288ceb251344458ab614c7dc32ca5da9238b7c12984f4b708"} Mar 18 13:45:04 crc kubenswrapper[4843]: I0318 13:45:04.101152 4843 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="24ffebd6594e185288ceb251344458ab614c7dc32ca5da9238b7c12984f4b708" Mar 18 13:45:04 crc kubenswrapper[4843]: I0318 13:45:04.101239 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-wwg2l" Mar 18 13:45:04 crc kubenswrapper[4843]: I0318 13:45:04.483908 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v"] Mar 18 13:45:04 crc kubenswrapper[4843]: I0318 13:45:04.496996 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-k245v"] Mar 18 13:45:04 crc kubenswrapper[4843]: I0318 13:45:04.999322 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdc91d39-cf27-435d-90c1-d596d09c4b8f" path="/var/lib/kubelet/pods/cdc91d39-cf27-435d-90c1-d596d09c4b8f/volumes" Mar 18 13:45:05 crc kubenswrapper[4843]: I0318 13:45:05.110106 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb54f" event={"ID":"f4dada73-6104-4328-ac84-a067cc7fc38c","Type":"ContainerStarted","Data":"694c2258b4be97ebf1026f23a8eb6963bb44bd9b57efd0dbda9597879774384d"} Mar 18 13:45:05 crc kubenswrapper[4843]: I0318 13:45:05.137255 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lb54f" podStartSLOduration=2.651656574 podStartE2EDuration="5.137202441s" podCreationTimestamp="2026-03-18 13:45:00 +0000 UTC" firstStartedPulling="2026-03-18 13:45:02.075049589 +0000 UTC m=+5735.790875113" lastFinishedPulling="2026-03-18 13:45:04.560595456 +0000 UTC m=+5738.276420980" observedRunningTime="2026-03-18 13:45:05.12730594 +0000 UTC m=+5738.843131464" watchObservedRunningTime="2026-03-18 13:45:05.137202441 +0000 UTC m=+5738.853027965" Mar 18 13:45:11 crc kubenswrapper[4843]: I0318 13:45:11.014516 4843 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lb54f" Mar 18 13:45:11 crc kubenswrapper[4843]: I0318 13:45:11.014867 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lb54f" Mar 18 13:45:11 crc kubenswrapper[4843]: I0318 13:45:11.071172 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lb54f" Mar 18 13:45:11 crc kubenswrapper[4843]: I0318 13:45:11.215343 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lb54f" Mar 18 13:45:11 crc kubenswrapper[4843]: I0318 13:45:11.281554 4843 scope.go:117] "RemoveContainer" containerID="e1e40c44ad39a3e984ca5deb406909b2a99f08e373c9b66a86997273e0932f0c" Mar 18 13:45:11 crc kubenswrapper[4843]: I0318 13:45:11.302218 4843 scope.go:117] "RemoveContainer" containerID="d5716208d8be35b9b5a5aa760a90299337a00584119e900460b934d4951d4f75" Mar 18 13:45:11 crc kubenswrapper[4843]: I0318 13:45:11.321749 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb54f"] Mar 18 13:45:13 crc kubenswrapper[4843]: I0318 13:45:13.186517 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lb54f" podUID="f4dada73-6104-4328-ac84-a067cc7fc38c" containerName="registry-server" containerID="cri-o://694c2258b4be97ebf1026f23a8eb6963bb44bd9b57efd0dbda9597879774384d" gracePeriod=2 Mar 18 13:45:13 crc kubenswrapper[4843]: I0318 13:45:13.770458 4843 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lb54f" Mar 18 13:45:13 crc kubenswrapper[4843]: I0318 13:45:13.776764 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4dada73-6104-4328-ac84-a067cc7fc38c-catalog-content\") pod \"f4dada73-6104-4328-ac84-a067cc7fc38c\" (UID: \"f4dada73-6104-4328-ac84-a067cc7fc38c\") " Mar 18 13:45:13 crc kubenswrapper[4843]: I0318 13:45:13.818805 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4dada73-6104-4328-ac84-a067cc7fc38c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4dada73-6104-4328-ac84-a067cc7fc38c" (UID: "f4dada73-6104-4328-ac84-a067cc7fc38c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:45:13 crc kubenswrapper[4843]: I0318 13:45:13.880123 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzp6l\" (UniqueName: \"kubernetes.io/projected/f4dada73-6104-4328-ac84-a067cc7fc38c-kube-api-access-dzp6l\") pod \"f4dada73-6104-4328-ac84-a067cc7fc38c\" (UID: \"f4dada73-6104-4328-ac84-a067cc7fc38c\") " Mar 18 13:45:13 crc kubenswrapper[4843]: I0318 13:45:13.880612 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4dada73-6104-4328-ac84-a067cc7fc38c-utilities\") pod \"f4dada73-6104-4328-ac84-a067cc7fc38c\" (UID: \"f4dada73-6104-4328-ac84-a067cc7fc38c\") " Mar 18 13:45:13 crc kubenswrapper[4843]: I0318 13:45:13.881243 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4dada73-6104-4328-ac84-a067cc7fc38c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:13 crc kubenswrapper[4843]: I0318 13:45:13.882029 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/f4dada73-6104-4328-ac84-a067cc7fc38c-utilities" (OuterVolumeSpecName: "utilities") pod "f4dada73-6104-4328-ac84-a067cc7fc38c" (UID: "f4dada73-6104-4328-ac84-a067cc7fc38c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:45:13 crc kubenswrapper[4843]: I0318 13:45:13.887700 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4dada73-6104-4328-ac84-a067cc7fc38c-kube-api-access-dzp6l" (OuterVolumeSpecName: "kube-api-access-dzp6l") pod "f4dada73-6104-4328-ac84-a067cc7fc38c" (UID: "f4dada73-6104-4328-ac84-a067cc7fc38c"). InnerVolumeSpecName "kube-api-access-dzp6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:13 crc kubenswrapper[4843]: I0318 13:45:13.982292 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzp6l\" (UniqueName: \"kubernetes.io/projected/f4dada73-6104-4328-ac84-a067cc7fc38c-kube-api-access-dzp6l\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:13 crc kubenswrapper[4843]: I0318 13:45:13.982323 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4dada73-6104-4328-ac84-a067cc7fc38c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:14 crc kubenswrapper[4843]: I0318 13:45:14.203823 4843 generic.go:334] "Generic (PLEG): container finished" podID="f4dada73-6104-4328-ac84-a067cc7fc38c" containerID="694c2258b4be97ebf1026f23a8eb6963bb44bd9b57efd0dbda9597879774384d" exitCode=0 Mar 18 13:45:14 crc kubenswrapper[4843]: I0318 13:45:14.203882 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lb54f" event={"ID":"f4dada73-6104-4328-ac84-a067cc7fc38c","Type":"ContainerDied","Data":"694c2258b4be97ebf1026f23a8eb6963bb44bd9b57efd0dbda9597879774384d"} Mar 18 13:45:14 crc kubenswrapper[4843]: I0318 13:45:14.203930 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-lb54f" event={"ID":"f4dada73-6104-4328-ac84-a067cc7fc38c","Type":"ContainerDied","Data":"66c90d37114f02e65a03b91acc03ab36a249fb49fcc4c5814faf2fc10a8fa8eb"} Mar 18 13:45:14 crc kubenswrapper[4843]: I0318 13:45:14.203938 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lb54f" Mar 18 13:45:14 crc kubenswrapper[4843]: I0318 13:45:14.203958 4843 scope.go:117] "RemoveContainer" containerID="694c2258b4be97ebf1026f23a8eb6963bb44bd9b57efd0dbda9597879774384d" Mar 18 13:45:14 crc kubenswrapper[4843]: I0318 13:45:14.238005 4843 scope.go:117] "RemoveContainer" containerID="80fa426fe5cc0a2e2fd38da1cd3081ff8e1e78fb33d5eee6c62c63c7b797d320" Mar 18 13:45:14 crc kubenswrapper[4843]: I0318 13:45:14.251072 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb54f"] Mar 18 13:45:14 crc kubenswrapper[4843]: I0318 13:45:14.260488 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lb54f"] Mar 18 13:45:14 crc kubenswrapper[4843]: I0318 13:45:14.260804 4843 scope.go:117] "RemoveContainer" containerID="7e2d205ec08524185bb5adc506c4fdb658567d9606df45fb17792d89513a34db" Mar 18 13:45:14 crc kubenswrapper[4843]: I0318 13:45:14.317070 4843 scope.go:117] "RemoveContainer" containerID="694c2258b4be97ebf1026f23a8eb6963bb44bd9b57efd0dbda9597879774384d" Mar 18 13:45:14 crc kubenswrapper[4843]: E0318 13:45:14.317557 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"694c2258b4be97ebf1026f23a8eb6963bb44bd9b57efd0dbda9597879774384d\": container with ID starting with 694c2258b4be97ebf1026f23a8eb6963bb44bd9b57efd0dbda9597879774384d not found: ID does not exist" containerID="694c2258b4be97ebf1026f23a8eb6963bb44bd9b57efd0dbda9597879774384d" Mar 18 13:45:14 crc kubenswrapper[4843]: I0318 13:45:14.317617 4843 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"694c2258b4be97ebf1026f23a8eb6963bb44bd9b57efd0dbda9597879774384d"} err="failed to get container status \"694c2258b4be97ebf1026f23a8eb6963bb44bd9b57efd0dbda9597879774384d\": rpc error: code = NotFound desc = could not find container \"694c2258b4be97ebf1026f23a8eb6963bb44bd9b57efd0dbda9597879774384d\": container with ID starting with 694c2258b4be97ebf1026f23a8eb6963bb44bd9b57efd0dbda9597879774384d not found: ID does not exist" Mar 18 13:45:14 crc kubenswrapper[4843]: I0318 13:45:14.317669 4843 scope.go:117] "RemoveContainer" containerID="80fa426fe5cc0a2e2fd38da1cd3081ff8e1e78fb33d5eee6c62c63c7b797d320" Mar 18 13:45:14 crc kubenswrapper[4843]: E0318 13:45:14.318000 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80fa426fe5cc0a2e2fd38da1cd3081ff8e1e78fb33d5eee6c62c63c7b797d320\": container with ID starting with 80fa426fe5cc0a2e2fd38da1cd3081ff8e1e78fb33d5eee6c62c63c7b797d320 not found: ID does not exist" containerID="80fa426fe5cc0a2e2fd38da1cd3081ff8e1e78fb33d5eee6c62c63c7b797d320" Mar 18 13:45:14 crc kubenswrapper[4843]: I0318 13:45:14.318037 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80fa426fe5cc0a2e2fd38da1cd3081ff8e1e78fb33d5eee6c62c63c7b797d320"} err="failed to get container status \"80fa426fe5cc0a2e2fd38da1cd3081ff8e1e78fb33d5eee6c62c63c7b797d320\": rpc error: code = NotFound desc = could not find container \"80fa426fe5cc0a2e2fd38da1cd3081ff8e1e78fb33d5eee6c62c63c7b797d320\": container with ID starting with 80fa426fe5cc0a2e2fd38da1cd3081ff8e1e78fb33d5eee6c62c63c7b797d320 not found: ID does not exist" Mar 18 13:45:14 crc kubenswrapper[4843]: I0318 13:45:14.318061 4843 scope.go:117] "RemoveContainer" containerID="7e2d205ec08524185bb5adc506c4fdb658567d9606df45fb17792d89513a34db" Mar 18 13:45:14 crc kubenswrapper[4843]: E0318 
13:45:14.318332 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e2d205ec08524185bb5adc506c4fdb658567d9606df45fb17792d89513a34db\": container with ID starting with 7e2d205ec08524185bb5adc506c4fdb658567d9606df45fb17792d89513a34db not found: ID does not exist" containerID="7e2d205ec08524185bb5adc506c4fdb658567d9606df45fb17792d89513a34db" Mar 18 13:45:14 crc kubenswrapper[4843]: I0318 13:45:14.318356 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e2d205ec08524185bb5adc506c4fdb658567d9606df45fb17792d89513a34db"} err="failed to get container status \"7e2d205ec08524185bb5adc506c4fdb658567d9606df45fb17792d89513a34db\": rpc error: code = NotFound desc = could not find container \"7e2d205ec08524185bb5adc506c4fdb658567d9606df45fb17792d89513a34db\": container with ID starting with 7e2d205ec08524185bb5adc506c4fdb658567d9606df45fb17792d89513a34db not found: ID does not exist" Mar 18 13:45:15 crc kubenswrapper[4843]: I0318 13:45:15.014212 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4dada73-6104-4328-ac84-a067cc7fc38c" path="/var/lib/kubelet/pods/f4dada73-6104-4328-ac84-a067cc7fc38c/volumes" Mar 18 13:45:20 crc kubenswrapper[4843]: I0318 13:45:20.035024 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:45:20 crc kubenswrapper[4843]: I0318 13:45:20.035623 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 18 13:45:50 crc kubenswrapper[4843]: I0318 13:45:50.035668 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:45:50 crc kubenswrapper[4843]: I0318 13:45:50.036369 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:46:00 crc kubenswrapper[4843]: I0318 13:46:00.220320 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564026-q66h7"] Mar 18 13:46:00 crc kubenswrapper[4843]: E0318 13:46:00.222131 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4dada73-6104-4328-ac84-a067cc7fc38c" containerName="registry-server" Mar 18 13:46:00 crc kubenswrapper[4843]: I0318 13:46:00.222159 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4dada73-6104-4328-ac84-a067cc7fc38c" containerName="registry-server" Mar 18 13:46:00 crc kubenswrapper[4843]: E0318 13:46:00.222181 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4dada73-6104-4328-ac84-a067cc7fc38c" containerName="extract-content" Mar 18 13:46:00 crc kubenswrapper[4843]: I0318 13:46:00.222188 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4dada73-6104-4328-ac84-a067cc7fc38c" containerName="extract-content" Mar 18 13:46:00 crc kubenswrapper[4843]: E0318 13:46:00.222206 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4dada73-6104-4328-ac84-a067cc7fc38c" containerName="extract-utilities" Mar 18 13:46:00 crc kubenswrapper[4843]: I0318 13:46:00.222214 4843 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4dada73-6104-4328-ac84-a067cc7fc38c" containerName="extract-utilities" Mar 18 13:46:00 crc kubenswrapper[4843]: E0318 13:46:00.222231 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="767c323f-4d09-489a-a2c4-f163fa73634a" containerName="collect-profiles" Mar 18 13:46:00 crc kubenswrapper[4843]: I0318 13:46:00.222239 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="767c323f-4d09-489a-a2c4-f163fa73634a" containerName="collect-profiles" Mar 18 13:46:00 crc kubenswrapper[4843]: I0318 13:46:00.222481 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="767c323f-4d09-489a-a2c4-f163fa73634a" containerName="collect-profiles" Mar 18 13:46:00 crc kubenswrapper[4843]: I0318 13:46:00.222518 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4dada73-6104-4328-ac84-a067cc7fc38c" containerName="registry-server" Mar 18 13:46:00 crc kubenswrapper[4843]: I0318 13:46:00.226034 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564026-q66h7" Mar 18 13:46:00 crc kubenswrapper[4843]: I0318 13:46:00.232162 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:46:00 crc kubenswrapper[4843]: I0318 13:46:00.232469 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:46:00 crc kubenswrapper[4843]: I0318 13:46:00.232519 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564026-q66h7"] Mar 18 13:46:00 crc kubenswrapper[4843]: I0318 13:46:00.232790 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk" Mar 18 13:46:00 crc kubenswrapper[4843]: I0318 13:46:00.396239 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tzxn\" (UniqueName: \"kubernetes.io/projected/98554e76-d3ed-4fb0-9e84-c012ca2f35e0-kube-api-access-6tzxn\") pod \"auto-csr-approver-29564026-q66h7\" (UID: \"98554e76-d3ed-4fb0-9e84-c012ca2f35e0\") " pod="openshift-infra/auto-csr-approver-29564026-q66h7" Mar 18 13:46:00 crc kubenswrapper[4843]: I0318 13:46:00.498315 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tzxn\" (UniqueName: \"kubernetes.io/projected/98554e76-d3ed-4fb0-9e84-c012ca2f35e0-kube-api-access-6tzxn\") pod \"auto-csr-approver-29564026-q66h7\" (UID: \"98554e76-d3ed-4fb0-9e84-c012ca2f35e0\") " pod="openshift-infra/auto-csr-approver-29564026-q66h7" Mar 18 13:46:00 crc kubenswrapper[4843]: I0318 13:46:00.519575 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tzxn\" (UniqueName: \"kubernetes.io/projected/98554e76-d3ed-4fb0-9e84-c012ca2f35e0-kube-api-access-6tzxn\") pod \"auto-csr-approver-29564026-q66h7\" (UID: \"98554e76-d3ed-4fb0-9e84-c012ca2f35e0\") " 
pod="openshift-infra/auto-csr-approver-29564026-q66h7" Mar 18 13:46:00 crc kubenswrapper[4843]: I0318 13:46:00.552071 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564026-q66h7" Mar 18 13:46:01 crc kubenswrapper[4843]: I0318 13:46:01.022854 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564026-q66h7"] Mar 18 13:46:01 crc kubenswrapper[4843]: I0318 13:46:01.658946 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564026-q66h7" event={"ID":"98554e76-d3ed-4fb0-9e84-c012ca2f35e0","Type":"ContainerStarted","Data":"78d60653eff3ec96fdadd03090c4a40db5f19b2bb0cac671364a76b50cb0031e"} Mar 18 13:46:02 crc kubenswrapper[4843]: I0318 13:46:02.670002 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564026-q66h7" event={"ID":"98554e76-d3ed-4fb0-9e84-c012ca2f35e0","Type":"ContainerStarted","Data":"bd406dcc9d9174d351f58711d1e7a52bd7bb976422d0a06c43e4aebd0718c913"} Mar 18 13:46:02 crc kubenswrapper[4843]: I0318 13:46:02.695302 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564026-q66h7" podStartSLOduration=1.471496605 podStartE2EDuration="2.695277626s" podCreationTimestamp="2026-03-18 13:46:00 +0000 UTC" firstStartedPulling="2026-03-18 13:46:01.03529918 +0000 UTC m=+5794.751124704" lastFinishedPulling="2026-03-18 13:46:02.259080201 +0000 UTC m=+5795.974905725" observedRunningTime="2026-03-18 13:46:02.683931394 +0000 UTC m=+5796.399756928" watchObservedRunningTime="2026-03-18 13:46:02.695277626 +0000 UTC m=+5796.411103160" Mar 18 13:46:03 crc kubenswrapper[4843]: I0318 13:46:03.682206 4843 generic.go:334] "Generic (PLEG): container finished" podID="98554e76-d3ed-4fb0-9e84-c012ca2f35e0" containerID="bd406dcc9d9174d351f58711d1e7a52bd7bb976422d0a06c43e4aebd0718c913" exitCode=0 Mar 18 13:46:03 crc 
kubenswrapper[4843]: I0318 13:46:03.682271 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564026-q66h7" event={"ID":"98554e76-d3ed-4fb0-9e84-c012ca2f35e0","Type":"ContainerDied","Data":"bd406dcc9d9174d351f58711d1e7a52bd7bb976422d0a06c43e4aebd0718c913"} Mar 18 13:46:05 crc kubenswrapper[4843]: I0318 13:46:05.024318 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564026-q66h7" Mar 18 13:46:05 crc kubenswrapper[4843]: I0318 13:46:05.179146 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tzxn\" (UniqueName: \"kubernetes.io/projected/98554e76-d3ed-4fb0-9e84-c012ca2f35e0-kube-api-access-6tzxn\") pod \"98554e76-d3ed-4fb0-9e84-c012ca2f35e0\" (UID: \"98554e76-d3ed-4fb0-9e84-c012ca2f35e0\") " Mar 18 13:46:05 crc kubenswrapper[4843]: I0318 13:46:05.524512 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98554e76-d3ed-4fb0-9e84-c012ca2f35e0-kube-api-access-6tzxn" (OuterVolumeSpecName: "kube-api-access-6tzxn") pod "98554e76-d3ed-4fb0-9e84-c012ca2f35e0" (UID: "98554e76-d3ed-4fb0-9e84-c012ca2f35e0"). InnerVolumeSpecName "kube-api-access-6tzxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:46:05 crc kubenswrapper[4843]: I0318 13:46:05.596771 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tzxn\" (UniqueName: \"kubernetes.io/projected/98554e76-d3ed-4fb0-9e84-c012ca2f35e0-kube-api-access-6tzxn\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:05 crc kubenswrapper[4843]: I0318 13:46:05.702020 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564026-q66h7" event={"ID":"98554e76-d3ed-4fb0-9e84-c012ca2f35e0","Type":"ContainerDied","Data":"78d60653eff3ec96fdadd03090c4a40db5f19b2bb0cac671364a76b50cb0031e"} Mar 18 13:46:05 crc kubenswrapper[4843]: I0318 13:46:05.702093 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564026-q66h7" Mar 18 13:46:05 crc kubenswrapper[4843]: I0318 13:46:05.702100 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78d60653eff3ec96fdadd03090c4a40db5f19b2bb0cac671364a76b50cb0031e" Mar 18 13:46:05 crc kubenswrapper[4843]: I0318 13:46:05.759786 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564020-gpzl4"] Mar 18 13:46:05 crc kubenswrapper[4843]: I0318 13:46:05.767589 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564020-gpzl4"] Mar 18 13:46:06 crc kubenswrapper[4843]: I0318 13:46:06.996547 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60e10d34-3d48-4bb0-af2f-802e0e3fa75e" path="/var/lib/kubelet/pods/60e10d34-3d48-4bb0-af2f-802e0e3fa75e/volumes" Mar 18 13:46:11 crc kubenswrapper[4843]: I0318 13:46:11.398503 4843 scope.go:117] "RemoveContainer" containerID="c76e50f7758c1c88ffd9888832810567c1e7286f3d8bb89b489deb24fac65c13" Mar 18 13:46:20 crc kubenswrapper[4843]: I0318 13:46:20.034865 4843 patch_prober.go:28] interesting pod/machine-config-daemon-wstcq 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:46:20 crc kubenswrapper[4843]: I0318 13:46:20.036005 4843 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:46:20 crc kubenswrapper[4843]: I0318 13:46:20.036069 4843 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" Mar 18 13:46:20 crc kubenswrapper[4843]: I0318 13:46:20.037171 4843 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16f53861bd16b48f621a2985e668d874207929653724a047380ef48b8700ab7a"} pod="openshift-machine-config-operator/machine-config-daemon-wstcq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:46:20 crc kubenswrapper[4843]: I0318 13:46:20.037260 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerName="machine-config-daemon" containerID="cri-o://16f53861bd16b48f621a2985e668d874207929653724a047380ef48b8700ab7a" gracePeriod=600 Mar 18 13:46:20 crc kubenswrapper[4843]: E0318 13:46:20.160566 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:46:20 crc kubenswrapper[4843]: I0318 13:46:20.948546 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" event={"ID":"f5a185c4-48ac-4f51-99be-0a9418d9e53f","Type":"ContainerDied","Data":"16f53861bd16b48f621a2985e668d874207929653724a047380ef48b8700ab7a"} Mar 18 13:46:20 crc kubenswrapper[4843]: I0318 13:46:20.948563 4843 generic.go:334] "Generic (PLEG): container finished" podID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" containerID="16f53861bd16b48f621a2985e668d874207929653724a047380ef48b8700ab7a" exitCode=0 Mar 18 13:46:20 crc kubenswrapper[4843]: I0318 13:46:20.948922 4843 scope.go:117] "RemoveContainer" containerID="23bfc8498d903deeb680a9817a6086e184f5afd90d3ed161f929039e9a0b59b1" Mar 18 13:46:20 crc kubenswrapper[4843]: I0318 13:46:20.950122 4843 scope.go:117] "RemoveContainer" containerID="16f53861bd16b48f621a2985e668d874207929653724a047380ef48b8700ab7a" Mar 18 13:46:20 crc kubenswrapper[4843]: E0318 13:46:20.950594 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:46:32 crc kubenswrapper[4843]: I0318 13:46:32.985053 4843 scope.go:117] "RemoveContainer" containerID="16f53861bd16b48f621a2985e668d874207929653724a047380ef48b8700ab7a" Mar 18 13:46:32 crc kubenswrapper[4843]: E0318 13:46:32.985970 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:46:45 crc kubenswrapper[4843]: I0318 13:46:45.984440 4843 scope.go:117] "RemoveContainer" containerID="16f53861bd16b48f621a2985e668d874207929653724a047380ef48b8700ab7a" Mar 18 13:46:45 crc kubenswrapper[4843]: E0318 13:46:45.985243 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:46:56 crc kubenswrapper[4843]: I0318 13:46:56.990476 4843 scope.go:117] "RemoveContainer" containerID="16f53861bd16b48f621a2985e668d874207929653724a047380ef48b8700ab7a" Mar 18 13:46:56 crc kubenswrapper[4843]: E0318 13:46:56.991460 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:47:11 crc kubenswrapper[4843]: I0318 13:47:11.984114 4843 scope.go:117] "RemoveContainer" containerID="16f53861bd16b48f621a2985e668d874207929653724a047380ef48b8700ab7a" Mar 18 13:47:11 crc kubenswrapper[4843]: E0318 13:47:11.985979 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:47:21 crc kubenswrapper[4843]: I0318 13:47:21.984735 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tmpkb"] Mar 18 13:47:21 crc kubenswrapper[4843]: E0318 13:47:21.986705 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98554e76-d3ed-4fb0-9e84-c012ca2f35e0" containerName="oc" Mar 18 13:47:21 crc kubenswrapper[4843]: I0318 13:47:21.986807 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="98554e76-d3ed-4fb0-9e84-c012ca2f35e0" containerName="oc" Mar 18 13:47:21 crc kubenswrapper[4843]: I0318 13:47:21.987480 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="98554e76-d3ed-4fb0-9e84-c012ca2f35e0" containerName="oc" Mar 18 13:47:21 crc kubenswrapper[4843]: I0318 13:47:21.989643 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tmpkb" Mar 18 13:47:22 crc kubenswrapper[4843]: I0318 13:47:22.000267 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tmpkb"] Mar 18 13:47:22 crc kubenswrapper[4843]: I0318 13:47:22.123146 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf6kv\" (UniqueName: \"kubernetes.io/projected/92bb2bd5-4d49-457c-9054-522965d7f404-kube-api-access-xf6kv\") pod \"community-operators-tmpkb\" (UID: \"92bb2bd5-4d49-457c-9054-522965d7f404\") " pod="openshift-marketplace/community-operators-tmpkb" Mar 18 13:47:22 crc kubenswrapper[4843]: I0318 13:47:22.123205 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92bb2bd5-4d49-457c-9054-522965d7f404-utilities\") pod \"community-operators-tmpkb\" (UID: \"92bb2bd5-4d49-457c-9054-522965d7f404\") " pod="openshift-marketplace/community-operators-tmpkb" Mar 18 13:47:22 crc kubenswrapper[4843]: I0318 13:47:22.123815 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92bb2bd5-4d49-457c-9054-522965d7f404-catalog-content\") pod \"community-operators-tmpkb\" (UID: \"92bb2bd5-4d49-457c-9054-522965d7f404\") " pod="openshift-marketplace/community-operators-tmpkb" Mar 18 13:47:22 crc kubenswrapper[4843]: I0318 13:47:22.226011 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92bb2bd5-4d49-457c-9054-522965d7f404-catalog-content\") pod \"community-operators-tmpkb\" (UID: \"92bb2bd5-4d49-457c-9054-522965d7f404\") " pod="openshift-marketplace/community-operators-tmpkb" Mar 18 13:47:22 crc kubenswrapper[4843]: I0318 13:47:22.226165 4843 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xf6kv\" (UniqueName: \"kubernetes.io/projected/92bb2bd5-4d49-457c-9054-522965d7f404-kube-api-access-xf6kv\") pod \"community-operators-tmpkb\" (UID: \"92bb2bd5-4d49-457c-9054-522965d7f404\") " pod="openshift-marketplace/community-operators-tmpkb" Mar 18 13:47:22 crc kubenswrapper[4843]: I0318 13:47:22.226190 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92bb2bd5-4d49-457c-9054-522965d7f404-utilities\") pod \"community-operators-tmpkb\" (UID: \"92bb2bd5-4d49-457c-9054-522965d7f404\") " pod="openshift-marketplace/community-operators-tmpkb" Mar 18 13:47:22 crc kubenswrapper[4843]: I0318 13:47:22.226605 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92bb2bd5-4d49-457c-9054-522965d7f404-catalog-content\") pod \"community-operators-tmpkb\" (UID: \"92bb2bd5-4d49-457c-9054-522965d7f404\") " pod="openshift-marketplace/community-operators-tmpkb" Mar 18 13:47:22 crc kubenswrapper[4843]: I0318 13:47:22.226665 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92bb2bd5-4d49-457c-9054-522965d7f404-utilities\") pod \"community-operators-tmpkb\" (UID: \"92bb2bd5-4d49-457c-9054-522965d7f404\") " pod="openshift-marketplace/community-operators-tmpkb" Mar 18 13:47:22 crc kubenswrapper[4843]: I0318 13:47:22.247914 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf6kv\" (UniqueName: \"kubernetes.io/projected/92bb2bd5-4d49-457c-9054-522965d7f404-kube-api-access-xf6kv\") pod \"community-operators-tmpkb\" (UID: \"92bb2bd5-4d49-457c-9054-522965d7f404\") " pod="openshift-marketplace/community-operators-tmpkb" Mar 18 13:47:22 crc kubenswrapper[4843]: I0318 13:47:22.320951 4843 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tmpkb" Mar 18 13:47:22 crc kubenswrapper[4843]: W0318 13:47:22.946926 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92bb2bd5_4d49_457c_9054_522965d7f404.slice/crio-f473678ac873c36a8d2123ea9b12bb41548a4d5d9279afe59e268871a204ce45 WatchSource:0}: Error finding container f473678ac873c36a8d2123ea9b12bb41548a4d5d9279afe59e268871a204ce45: Status 404 returned error can't find the container with id f473678ac873c36a8d2123ea9b12bb41548a4d5d9279afe59e268871a204ce45 Mar 18 13:47:22 crc kubenswrapper[4843]: I0318 13:47:22.948422 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tmpkb"] Mar 18 13:47:22 crc kubenswrapper[4843]: I0318 13:47:22.985471 4843 scope.go:117] "RemoveContainer" containerID="16f53861bd16b48f621a2985e668d874207929653724a047380ef48b8700ab7a" Mar 18 13:47:22 crc kubenswrapper[4843]: E0318 13:47:22.985774 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 18 13:47:23 crc kubenswrapper[4843]: I0318 13:47:23.605887 4843 generic.go:334] "Generic (PLEG): container finished" podID="92bb2bd5-4d49-457c-9054-522965d7f404" containerID="78a96a98d3e18fd2737e6a52a17969d1ba7cd5bd259cad8d3e1f89f9a259e1f1" exitCode=0 Mar 18 13:47:23 crc kubenswrapper[4843]: I0318 13:47:23.605988 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmpkb" 
event={"ID":"92bb2bd5-4d49-457c-9054-522965d7f404","Type":"ContainerDied","Data":"78a96a98d3e18fd2737e6a52a17969d1ba7cd5bd259cad8d3e1f89f9a259e1f1"} Mar 18 13:47:23 crc kubenswrapper[4843]: I0318 13:47:23.606445 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmpkb" event={"ID":"92bb2bd5-4d49-457c-9054-522965d7f404","Type":"ContainerStarted","Data":"f473678ac873c36a8d2123ea9b12bb41548a4d5d9279afe59e268871a204ce45"} Mar 18 13:47:26 crc kubenswrapper[4843]: I0318 13:47:26.635110 4843 generic.go:334] "Generic (PLEG): container finished" podID="92bb2bd5-4d49-457c-9054-522965d7f404" containerID="56c95398d38d18c9a8cb3d786281ca482af50feb29b6d3c0ddf97b34bd555531" exitCode=0 Mar 18 13:47:26 crc kubenswrapper[4843]: I0318 13:47:26.635176 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmpkb" event={"ID":"92bb2bd5-4d49-457c-9054-522965d7f404","Type":"ContainerDied","Data":"56c95398d38d18c9a8cb3d786281ca482af50feb29b6d3c0ddf97b34bd555531"} Mar 18 13:47:27 crc kubenswrapper[4843]: I0318 13:47:27.647761 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmpkb" event={"ID":"92bb2bd5-4d49-457c-9054-522965d7f404","Type":"ContainerStarted","Data":"c64ef2df27afdc20eba11f439c7c1e0d14695072293ae2fade2a5117ffaf5289"} Mar 18 13:47:27 crc kubenswrapper[4843]: I0318 13:47:27.669125 4843 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tmpkb" podStartSLOduration=3.169277808 podStartE2EDuration="6.669088277s" podCreationTimestamp="2026-03-18 13:47:21 +0000 UTC" firstStartedPulling="2026-03-18 13:47:23.610422516 +0000 UTC m=+5877.326248040" lastFinishedPulling="2026-03-18 13:47:27.110232985 +0000 UTC m=+5880.826058509" observedRunningTime="2026-03-18 13:47:27.667916844 +0000 UTC m=+5881.383742388" watchObservedRunningTime="2026-03-18 13:47:27.669088277 +0000 UTC 
m=+5881.384913801" Mar 18 13:47:32 crc kubenswrapper[4843]: I0318 13:47:32.321680 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tmpkb" Mar 18 13:47:32 crc kubenswrapper[4843]: I0318 13:47:32.321744 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tmpkb" Mar 18 13:47:32 crc kubenswrapper[4843]: I0318 13:47:32.378694 4843 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tmpkb" Mar 18 13:47:32 crc kubenswrapper[4843]: I0318 13:47:32.778556 4843 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tmpkb" Mar 18 13:47:32 crc kubenswrapper[4843]: I0318 13:47:32.830718 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tmpkb"] Mar 18 13:47:34 crc kubenswrapper[4843]: I0318 13:47:34.748160 4843 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tmpkb" podUID="92bb2bd5-4d49-457c-9054-522965d7f404" containerName="registry-server" containerID="cri-o://c64ef2df27afdc20eba11f439c7c1e0d14695072293ae2fade2a5117ffaf5289" gracePeriod=2 Mar 18 13:47:34 crc kubenswrapper[4843]: I0318 13:47:34.984662 4843 scope.go:117] "RemoveContainer" containerID="16f53861bd16b48f621a2985e668d874207929653724a047380ef48b8700ab7a" Mar 18 13:47:34 crc kubenswrapper[4843]: E0318 13:47:34.984934 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f" Mar 
18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.233139 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tmpkb" Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.295709 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92bb2bd5-4d49-457c-9054-522965d7f404-utilities\") pod \"92bb2bd5-4d49-457c-9054-522965d7f404\" (UID: \"92bb2bd5-4d49-457c-9054-522965d7f404\") " Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.295898 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf6kv\" (UniqueName: \"kubernetes.io/projected/92bb2bd5-4d49-457c-9054-522965d7f404-kube-api-access-xf6kv\") pod \"92bb2bd5-4d49-457c-9054-522965d7f404\" (UID: \"92bb2bd5-4d49-457c-9054-522965d7f404\") " Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.296011 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92bb2bd5-4d49-457c-9054-522965d7f404-catalog-content\") pod \"92bb2bd5-4d49-457c-9054-522965d7f404\" (UID: \"92bb2bd5-4d49-457c-9054-522965d7f404\") " Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.301259 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92bb2bd5-4d49-457c-9054-522965d7f404-utilities" (OuterVolumeSpecName: "utilities") pod "92bb2bd5-4d49-457c-9054-522965d7f404" (UID: "92bb2bd5-4d49-457c-9054-522965d7f404"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.306361 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92bb2bd5-4d49-457c-9054-522965d7f404-kube-api-access-xf6kv" (OuterVolumeSpecName: "kube-api-access-xf6kv") pod "92bb2bd5-4d49-457c-9054-522965d7f404" (UID: "92bb2bd5-4d49-457c-9054-522965d7f404"). InnerVolumeSpecName "kube-api-access-xf6kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.373398 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92bb2bd5-4d49-457c-9054-522965d7f404-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92bb2bd5-4d49-457c-9054-522965d7f404" (UID: "92bb2bd5-4d49-457c-9054-522965d7f404"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.398210 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf6kv\" (UniqueName: \"kubernetes.io/projected/92bb2bd5-4d49-457c-9054-522965d7f404-kube-api-access-xf6kv\") on node \"crc\" DevicePath \"\"" Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.398257 4843 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92bb2bd5-4d49-457c-9054-522965d7f404-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.398269 4843 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92bb2bd5-4d49-457c-9054-522965d7f404-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.758882 4843 generic.go:334] "Generic (PLEG): container finished" podID="92bb2bd5-4d49-457c-9054-522965d7f404" 
containerID="c64ef2df27afdc20eba11f439c7c1e0d14695072293ae2fade2a5117ffaf5289" exitCode=0 Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.758967 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tmpkb" Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.758963 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmpkb" event={"ID":"92bb2bd5-4d49-457c-9054-522965d7f404","Type":"ContainerDied","Data":"c64ef2df27afdc20eba11f439c7c1e0d14695072293ae2fade2a5117ffaf5289"} Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.759342 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmpkb" event={"ID":"92bb2bd5-4d49-457c-9054-522965d7f404","Type":"ContainerDied","Data":"f473678ac873c36a8d2123ea9b12bb41548a4d5d9279afe59e268871a204ce45"} Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.759369 4843 scope.go:117] "RemoveContainer" containerID="c64ef2df27afdc20eba11f439c7c1e0d14695072293ae2fade2a5117ffaf5289" Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.780058 4843 scope.go:117] "RemoveContainer" containerID="56c95398d38d18c9a8cb3d786281ca482af50feb29b6d3c0ddf97b34bd555531" Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.811328 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tmpkb"] Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.812844 4843 scope.go:117] "RemoveContainer" containerID="78a96a98d3e18fd2737e6a52a17969d1ba7cd5bd259cad8d3e1f89f9a259e1f1" Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.819487 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tmpkb"] Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.861509 4843 scope.go:117] "RemoveContainer" containerID="c64ef2df27afdc20eba11f439c7c1e0d14695072293ae2fade2a5117ffaf5289" Mar 18 
13:47:35 crc kubenswrapper[4843]: E0318 13:47:35.862118 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c64ef2df27afdc20eba11f439c7c1e0d14695072293ae2fade2a5117ffaf5289\": container with ID starting with c64ef2df27afdc20eba11f439c7c1e0d14695072293ae2fade2a5117ffaf5289 not found: ID does not exist" containerID="c64ef2df27afdc20eba11f439c7c1e0d14695072293ae2fade2a5117ffaf5289" Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.862179 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c64ef2df27afdc20eba11f439c7c1e0d14695072293ae2fade2a5117ffaf5289"} err="failed to get container status \"c64ef2df27afdc20eba11f439c7c1e0d14695072293ae2fade2a5117ffaf5289\": rpc error: code = NotFound desc = could not find container \"c64ef2df27afdc20eba11f439c7c1e0d14695072293ae2fade2a5117ffaf5289\": container with ID starting with c64ef2df27afdc20eba11f439c7c1e0d14695072293ae2fade2a5117ffaf5289 not found: ID does not exist" Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.862367 4843 scope.go:117] "RemoveContainer" containerID="56c95398d38d18c9a8cb3d786281ca482af50feb29b6d3c0ddf97b34bd555531" Mar 18 13:47:35 crc kubenswrapper[4843]: E0318 13:47:35.863199 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56c95398d38d18c9a8cb3d786281ca482af50feb29b6d3c0ddf97b34bd555531\": container with ID starting with 56c95398d38d18c9a8cb3d786281ca482af50feb29b6d3c0ddf97b34bd555531 not found: ID does not exist" containerID="56c95398d38d18c9a8cb3d786281ca482af50feb29b6d3c0ddf97b34bd555531" Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.863228 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c95398d38d18c9a8cb3d786281ca482af50feb29b6d3c0ddf97b34bd555531"} err="failed to get container status 
\"56c95398d38d18c9a8cb3d786281ca482af50feb29b6d3c0ddf97b34bd555531\": rpc error: code = NotFound desc = could not find container \"56c95398d38d18c9a8cb3d786281ca482af50feb29b6d3c0ddf97b34bd555531\": container with ID starting with 56c95398d38d18c9a8cb3d786281ca482af50feb29b6d3c0ddf97b34bd555531 not found: ID does not exist"
Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.863243 4843 scope.go:117] "RemoveContainer" containerID="78a96a98d3e18fd2737e6a52a17969d1ba7cd5bd259cad8d3e1f89f9a259e1f1"
Mar 18 13:47:35 crc kubenswrapper[4843]: E0318 13:47:35.863572 4843 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78a96a98d3e18fd2737e6a52a17969d1ba7cd5bd259cad8d3e1f89f9a259e1f1\": container with ID starting with 78a96a98d3e18fd2737e6a52a17969d1ba7cd5bd259cad8d3e1f89f9a259e1f1 not found: ID does not exist" containerID="78a96a98d3e18fd2737e6a52a17969d1ba7cd5bd259cad8d3e1f89f9a259e1f1"
Mar 18 13:47:35 crc kubenswrapper[4843]: I0318 13:47:35.863621 4843 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78a96a98d3e18fd2737e6a52a17969d1ba7cd5bd259cad8d3e1f89f9a259e1f1"} err="failed to get container status \"78a96a98d3e18fd2737e6a52a17969d1ba7cd5bd259cad8d3e1f89f9a259e1f1\": rpc error: code = NotFound desc = could not find container \"78a96a98d3e18fd2737e6a52a17969d1ba7cd5bd259cad8d3e1f89f9a259e1f1\": container with ID starting with 78a96a98d3e18fd2737e6a52a17969d1ba7cd5bd259cad8d3e1f89f9a259e1f1 not found: ID does not exist"
Mar 18 13:47:36 crc kubenswrapper[4843]: I0318 13:47:36.995734 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92bb2bd5-4d49-457c-9054-522965d7f404" path="/var/lib/kubelet/pods/92bb2bd5-4d49-457c-9054-522965d7f404/volumes"
Mar 18 13:47:49 crc kubenswrapper[4843]: I0318 13:47:49.984546 4843 scope.go:117] "RemoveContainer" containerID="16f53861bd16b48f621a2985e668d874207929653724a047380ef48b8700ab7a"
Mar 18 13:47:49 crc kubenswrapper[4843]: E0318 13:47:49.985432 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:48:00 crc kubenswrapper[4843]: I0318 13:48:00.165062 4843 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564028-td7b4"]
Mar 18 13:48:00 crc kubenswrapper[4843]: E0318 13:48:00.166288 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92bb2bd5-4d49-457c-9054-522965d7f404" containerName="extract-utilities"
Mar 18 13:48:00 crc kubenswrapper[4843]: I0318 13:48:00.166310 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="92bb2bd5-4d49-457c-9054-522965d7f404" containerName="extract-utilities"
Mar 18 13:48:00 crc kubenswrapper[4843]: E0318 13:48:00.166336 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92bb2bd5-4d49-457c-9054-522965d7f404" containerName="registry-server"
Mar 18 13:48:00 crc kubenswrapper[4843]: I0318 13:48:00.166346 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="92bb2bd5-4d49-457c-9054-522965d7f404" containerName="registry-server"
Mar 18 13:48:00 crc kubenswrapper[4843]: E0318 13:48:00.166395 4843 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92bb2bd5-4d49-457c-9054-522965d7f404" containerName="extract-content"
Mar 18 13:48:00 crc kubenswrapper[4843]: I0318 13:48:00.166409 4843 state_mem.go:107] "Deleted CPUSet assignment" podUID="92bb2bd5-4d49-457c-9054-522965d7f404" containerName="extract-content"
Mar 18 13:48:00 crc kubenswrapper[4843]: I0318 13:48:00.166671 4843 memory_manager.go:354] "RemoveStaleState removing state" podUID="92bb2bd5-4d49-457c-9054-522965d7f404" containerName="registry-server"
Mar 18 13:48:00 crc kubenswrapper[4843]: I0318 13:48:00.167562 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564028-td7b4"
Mar 18 13:48:00 crc kubenswrapper[4843]: I0318 13:48:00.171230 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:48:00 crc kubenswrapper[4843]: I0318 13:48:00.171329 4843 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vxqqk"
Mar 18 13:48:00 crc kubenswrapper[4843]: I0318 13:48:00.171471 4843 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:48:00 crc kubenswrapper[4843]: I0318 13:48:00.178900 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564028-td7b4"]
Mar 18 13:48:00 crc kubenswrapper[4843]: I0318 13:48:00.300998 4843 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdkd7\" (UniqueName: \"kubernetes.io/projected/83612e9b-2d63-4cd1-a8a4-52da2283bab3-kube-api-access-mdkd7\") pod \"auto-csr-approver-29564028-td7b4\" (UID: \"83612e9b-2d63-4cd1-a8a4-52da2283bab3\") " pod="openshift-infra/auto-csr-approver-29564028-td7b4"
Mar 18 13:48:00 crc kubenswrapper[4843]: I0318 13:48:00.403876 4843 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdkd7\" (UniqueName: \"kubernetes.io/projected/83612e9b-2d63-4cd1-a8a4-52da2283bab3-kube-api-access-mdkd7\") pod \"auto-csr-approver-29564028-td7b4\" (UID: \"83612e9b-2d63-4cd1-a8a4-52da2283bab3\") " pod="openshift-infra/auto-csr-approver-29564028-td7b4"
Mar 18 13:48:00 crc kubenswrapper[4843]: I0318 13:48:00.424924 4843 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdkd7\" (UniqueName: \"kubernetes.io/projected/83612e9b-2d63-4cd1-a8a4-52da2283bab3-kube-api-access-mdkd7\") pod \"auto-csr-approver-29564028-td7b4\" (UID: \"83612e9b-2d63-4cd1-a8a4-52da2283bab3\") " pod="openshift-infra/auto-csr-approver-29564028-td7b4"
Mar 18 13:48:00 crc kubenswrapper[4843]: I0318 13:48:00.497929 4843 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564028-td7b4"
Mar 18 13:48:00 crc kubenswrapper[4843]: I0318 13:48:00.959535 4843 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564028-td7b4"]
Mar 18 13:48:00 crc kubenswrapper[4843]: W0318 13:48:00.960310 4843 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83612e9b_2d63_4cd1_a8a4_52da2283bab3.slice/crio-77a56f9e10fbfe524f33ce90875b174ebc2d226f13fdcebaa2d008611c8476cf WatchSource:0}: Error finding container 77a56f9e10fbfe524f33ce90875b174ebc2d226f13fdcebaa2d008611c8476cf: Status 404 returned error can't find the container with id 77a56f9e10fbfe524f33ce90875b174ebc2d226f13fdcebaa2d008611c8476cf
Mar 18 13:48:01 crc kubenswrapper[4843]: I0318 13:48:01.009273 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564028-td7b4" event={"ID":"83612e9b-2d63-4cd1-a8a4-52da2283bab3","Type":"ContainerStarted","Data":"77a56f9e10fbfe524f33ce90875b174ebc2d226f13fdcebaa2d008611c8476cf"}
Mar 18 13:48:03 crc kubenswrapper[4843]: I0318 13:48:03.032091 4843 generic.go:334] "Generic (PLEG): container finished" podID="83612e9b-2d63-4cd1-a8a4-52da2283bab3" containerID="5321c7080d7fe258eeb2bf576161cb39b7d43e5e59ceebeccf98ecedce850487" exitCode=0
Mar 18 13:48:03 crc kubenswrapper[4843]: I0318 13:48:03.032224 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564028-td7b4" event={"ID":"83612e9b-2d63-4cd1-a8a4-52da2283bab3","Type":"ContainerDied","Data":"5321c7080d7fe258eeb2bf576161cb39b7d43e5e59ceebeccf98ecedce850487"}
Mar 18 13:48:03 crc kubenswrapper[4843]: I0318 13:48:03.984326 4843 scope.go:117] "RemoveContainer" containerID="16f53861bd16b48f621a2985e668d874207929653724a047380ef48b8700ab7a"
Mar 18 13:48:03 crc kubenswrapper[4843]: E0318 13:48:03.984754 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:48:04 crc kubenswrapper[4843]: I0318 13:48:04.382158 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564028-td7b4"
Mar 18 13:48:04 crc kubenswrapper[4843]: I0318 13:48:04.492091 4843 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdkd7\" (UniqueName: \"kubernetes.io/projected/83612e9b-2d63-4cd1-a8a4-52da2283bab3-kube-api-access-mdkd7\") pod \"83612e9b-2d63-4cd1-a8a4-52da2283bab3\" (UID: \"83612e9b-2d63-4cd1-a8a4-52da2283bab3\") "
Mar 18 13:48:04 crc kubenswrapper[4843]: I0318 13:48:04.500596 4843 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83612e9b-2d63-4cd1-a8a4-52da2283bab3-kube-api-access-mdkd7" (OuterVolumeSpecName: "kube-api-access-mdkd7") pod "83612e9b-2d63-4cd1-a8a4-52da2283bab3" (UID: "83612e9b-2d63-4cd1-a8a4-52da2283bab3"). InnerVolumeSpecName "kube-api-access-mdkd7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:48:04 crc kubenswrapper[4843]: I0318 13:48:04.594811 4843 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdkd7\" (UniqueName: \"kubernetes.io/projected/83612e9b-2d63-4cd1-a8a4-52da2283bab3-kube-api-access-mdkd7\") on node \"crc\" DevicePath \"\""
Mar 18 13:48:05 crc kubenswrapper[4843]: I0318 13:48:05.055006 4843 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564028-td7b4" event={"ID":"83612e9b-2d63-4cd1-a8a4-52da2283bab3","Type":"ContainerDied","Data":"77a56f9e10fbfe524f33ce90875b174ebc2d226f13fdcebaa2d008611c8476cf"}
Mar 18 13:48:05 crc kubenswrapper[4843]: I0318 13:48:05.055402 4843 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77a56f9e10fbfe524f33ce90875b174ebc2d226f13fdcebaa2d008611c8476cf"
Mar 18 13:48:05 crc kubenswrapper[4843]: I0318 13:48:05.055082 4843 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564028-td7b4"
Mar 18 13:48:05 crc kubenswrapper[4843]: I0318 13:48:05.455232 4843 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564022-tclkp"]
Mar 18 13:48:05 crc kubenswrapper[4843]: I0318 13:48:05.463675 4843 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564022-tclkp"]
Mar 18 13:48:06 crc kubenswrapper[4843]: I0318 13:48:06.994073 4843 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8953032d-0dee-49d0-803c-c43b5e93542a" path="/var/lib/kubelet/pods/8953032d-0dee-49d0-803c-c43b5e93542a/volumes"
Mar 18 13:48:11 crc kubenswrapper[4843]: I0318 13:48:11.513109 4843 scope.go:117] "RemoveContainer" containerID="ab64d1cc939eb1bab9f7e20ec9ac5c6d68808dd396469aebcf57d9d8746553ed"
Mar 18 13:48:15 crc kubenswrapper[4843]: I0318 13:48:15.984152 4843 scope.go:117] "RemoveContainer" containerID="16f53861bd16b48f621a2985e668d874207929653724a047380ef48b8700ab7a"
Mar 18 13:48:15 crc kubenswrapper[4843]: E0318 13:48:15.984926 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:48:28 crc kubenswrapper[4843]: I0318 13:48:28.984330 4843 scope.go:117] "RemoveContainer" containerID="16f53861bd16b48f621a2985e668d874207929653724a047380ef48b8700ab7a"
Mar 18 13:48:28 crc kubenswrapper[4843]: E0318 13:48:28.985162 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:48:41 crc kubenswrapper[4843]: I0318 13:48:41.984887 4843 scope.go:117] "RemoveContainer" containerID="16f53861bd16b48f621a2985e668d874207929653724a047380ef48b8700ab7a"
Mar 18 13:48:41 crc kubenswrapper[4843]: E0318 13:48:41.985904 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"
Mar 18 13:48:56 crc kubenswrapper[4843]: I0318 13:48:56.990831 4843 scope.go:117]
"RemoveContainer" containerID="16f53861bd16b48f621a2985e668d874207929653724a047380ef48b8700ab7a"
Mar 18 13:48:56 crc kubenswrapper[4843]: E0318 13:48:56.992826 4843 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wstcq_openshift-machine-config-operator(f5a185c4-48ac-4f51-99be-0a9418d9e53f)\"" pod="openshift-machine-config-operator/machine-config-daemon-wstcq" podUID="f5a185c4-48ac-4f51-99be-0a9418d9e53f"